Google Cloud and AWS launch new services on machine learning and containers respectively

Another day, another product launch in the land of the hyperscalers – and for Google Cloud and Amazon Web Services (AWS), their new services are focusing on machine learning (ML) and containers respectively.

Google’s launch of Cloud AI Platform Pipelines, in beta, aims to provide a way to deploy ‘robust, repeatable machine learning pipelines… and delivers an enterprise-ready, easy to install, secure execution environment for ML workflows.’

This can be seen, for Google Cloud’s customers, as a potential maturation of their machine learning initiatives. “When you’re just prototyping a machine learning model in a notebook, it can seem fairly straightforward,” the company notes, in a blog post authored by product manager Anusha Ramesh and developer advocate Amy Unruh. “But when you need to start paying attention to the other pieces required to make a ML workflow sustainable and scalable, things become more complex.

“A machine learning workflow can involve many steps with dependencies on each other, from data preparation and analysis, to training, to evaluation, to deployment, and more,” they added. “It’s hard to compose and track these processes in an ad-hoc manner – for example, in a set of notebooks or scripts – and things like auditing and reproducibility become increasingly problematic.”
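The multi-step workflow the post describes is essentially a directed acyclic graph (DAG) of dependent steps; pipeline tools track that graph so runs become reproducible and auditable rather than ad hoc. A minimal illustration in Python (the step names and functions are illustrative only, not part of Google’s SDK):

```python
# Minimal sketch of an ML workflow as a DAG of dependent steps.
# Step names are illustrative, not Google Cloud APIs.
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Each step maps to the set of steps it depends on.
pipeline = {
    "data_prep": set(),
    "analysis": {"data_prep"},
    "training": {"data_prep"},
    "evaluation": {"training", "analysis"},
    "deployment": {"evaluation"},
}

def run_pipeline(steps):
    """Execute steps in dependency order, recording the run for auditing."""
    order = list(TopologicalSorter(steps).static_order())
    audit_log = []
    for step in order:
        # A real pipeline would launch a container or managed job here.
        audit_log.append(step)
    return audit_log

print(run_pipeline(pipeline))
```

Because the dependencies are explicit, every run can be replayed in the same order, which is exactly the auditing and reproducibility problem the blog post highlights.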

The solution will integrate with Google Cloud’s various managed services, such as BigQuery, stream and batch processing service Dataflow, and serverless platform Cloud Functions, the company promises. The move comes at an interesting time given Google’s ranking in Gartner’s most recent Magic Quadrant for cloud AI developer services: placed as a leader alongside IBM, Microsoft and Amazon Web Services (AWS), but just behind the latter two, with AWS on top.

AWS, meanwhile, has launched Bottlerocket, an open source operating system designed and optimised specifically for hosting containers. The company notes the importance of containers to package and scale applications for its customers, with chief evangelist Jeff Barr noting in a blog post that more than four in five cloud-based containers are running on Amazon’s cloud.

Bottlerocket aims to solve some of the challenges around container rollouts, using an image-based model instead of a package update system to enable quick rollbacks and potentially avoid breakages. As with other aspects of cloud security, surveys have shown that container security snafus are frequently caused by human error. In a recent report, StackRox described misconfigured containers as ‘alarmingly common’ as a root cause.

Barr noted security – in this case, extra installed packages increasing the attack surface – was a problem Bottlerocket aimed to remediate, alongside update management, rising overheads, and inconsistent configurations.

“Bottlerocket reflects much of what we have learned over the years,” wrote Barr. “It includes only the packages that are needed to make it a great container host, and integrates with existing container orchestrators.”

Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use-cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.

Bottlerocket is Amazon’s new purpose-built OS for running containers


Dale Walker

11 Mar, 2020

Amazon Web Services has unveiled a free open source operating system called Bottlerocket, designed specifically to run containers on bare metal or virtual machines.

Bottlerocket is being pitched as a purpose-built operating system designed with a single-step update process to make it far easier to automate updates, while also cutting out many of the unnecessary elements found in general-purpose software.

Its biggest selling point is its dual-partition setup, with one partition active and one inactive. When an update is issued, the inactive partition is written first, and the system then swaps the two partitions to complete the update.

The OS also uses image-based updates, which means the update can be rolled back in its entirety if necessary, helping to reduce downtime and minimise process failure. This is in contrast to most general-purpose operating systems which use a package-by-package approach.
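The active/inactive flip described above can be illustrated with a small state model. This is a hypothetical sketch of the A/B update pattern, not Bottlerocket’s actual updater API:

```python
# Toy model of an A/B (active/inactive) image-based update scheme,
# the pattern Bottlerocket uses. Hypothetical sketch, not the real updater.
class ImageUpdater:
    def __init__(self, initial_image):
        self.partitions = {"A": initial_image, "B": None}
        self.active = "A"

    @property
    def inactive(self):
        return "B" if self.active == "A" else "A"

    def stage(self, new_image):
        # Write the whole new image to the inactive partition;
        # the running system is untouched while this happens.
        self.partitions[self.inactive] = new_image

    def flip(self):
        # Switching which partition boots is a single atomic step.
        if self.partitions[self.inactive] is None:
            raise RuntimeError("no image staged")
        self.active = self.inactive

    def rollback(self):
        # The previous image is still intact on the other partition,
        # so rolling back is just flipping again - no per-package undo.
        self.active = self.inactive

updater = ImageUpdater("v1.0")
updater.stage("v1.1")
updater.flip()      # now running v1.1
updater.rollback()  # back on v1.0 in one step
```

The contrast with a package-by-package system is that rollback never has to reverse individual changes: the old image was never modified.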

As part of the slimmed-down design, Bottlerocket takes a different approach to authentication and secure login from that normally found on general-purpose systems. There’s no SSH server to support secure logins, although users can use a separate container to access admin controls.

The new OS also supports all the container tools you might expect, including Docker images and anything conforming to the Open Container Initiative standard. The system is also built with some third-party components, including the Linux kernel, the container runtime, and Kubernetes.

The OS is currently in a preview state, and is hosted on GitHub alongside a host of tools and documentation to support its use. Among these is a Bottlerocket Charter, which claims that the OS is open and “not a Kubernetes distro, nor an Amazon distro”, adding that such a platform can only be built with the support of a wider community.

Despite its open nature, the OS is optimised to work best with AWS tools out of the box, specifically Amazon’s Elastic Kubernetes Service (EKS).

The OS is currently available in a free public preview as an Amazon Machine Image (AMI) for Elastic Compute Cloud (EC2). Once released to general availability, Bottlerocket will come with three years of support, incorporated into AWS support plans at no extra cost.

Cloud complexity and ‘terrifying’ IoT means organisations’ asset visibility is worsening – report

As security best practice remains a battle between organisations trying to close the gap and hackers staying one step ahead, a new report from cybersecurity asset management provider Axonius has argued that the complexity of cloud infrastructure means companies are ‘rapidly’ losing sight of their asset landscape.

The study, put together by Enterprise Strategy Group (ESG) and which polled 200 North America-based IT and cybersecurity professionals, found that, for respondents overall, more than half (52%) of VMs now reside in the cloud, running across multiple environments.

The report describes cloud visibility as ‘hazy at best’, with more than two thirds (69%) of those polled admitting they have a cloud visibility gap. Three quarters of those polled said they had experienced several serious cloud VM security incidents. Adding to this mix is a rise in container usage, with previous research reports warning of dire consequences if the spike is not adequately secured. Axonius describes container uptake as ‘mainstream’ today.

Internet of Things (IoT) projects are gaining steam, yet an even wider visibility gap remains – 77% of respondents report a disparity. The report describes this trend as ‘inevitable or terrifying’; four in five (81%) say IoT devices will outnumber all other devices within three years, while more than half (58%) admit diversity in devices was their biggest management headache.

Bring your own device (BYOD) is still a sticking point for many companies, even if the hype and coverage has since died down. Almost half (49%) of organisations polled said they prohibit BYOD for work-related activities, while three in five (61%) of those who have policies in place are worried about violations. “BYOD looks to be here to stay… even if security suspects that policies are being circumvented,” the report notes.

Part of the solution is also part of the problem. Organisations are using on average more than 100 separate security tools, making the already-complicated task of IT asset management even more fiendish. A new approach is needed, the report warns: IT asset inventories currently demand the involvement of multiple teams, and take more than two weeks of effort.

“When we speak with customers from the midmarket up to the Fortune 100, we hear the same challenges: teams are faced with too many assets, a patchwork of security tools, and maddeningly manual processes to understand what is there and whether those assets are secure,” said Dean Sysman, CEO and co-founder at Axonius. “This survey uncovers the depth and breadth of the asset management challenges we see today and what’s on the horizon.”


VMware launches vSphere 7 and Tanzu container management tools


Adam Shepherd

10 Mar, 2020

VMware has announced the launch of a number of new Kubernetes-focused products, including the latest version of its vSphere platform.

Most of the new products fall under the company’s Tanzu portfolio, unveiled at last year’s VMworld.

Tanzu represents VMware’s efforts to integrate Kubernetes container management – which the company is betting big on as the next significant step in enterprise applications – with its existing VM management tools. 

Three new Tanzu products are being introduced. First up is Tanzu Mission Control, a tool previewed as part of last year’s announcement, designed to help enterprises manage multiple Kubernetes clusters across a range of environments while centralising key functions like security, configuration management and data protection. It also allows businesses to hook VMware’s other management and monitoring tools (such as its Wavefront and CloudHealth products) into their Kubernetes workloads.

Following on from this is VMware’s new Tanzu Application Catalog, which gives customers a way to integrate open source components and tools from Bitnami’s catalogue into their applications safely and securely, by providing a curated repository of open source products that have been verified as stable and vulnerability-free.

For organisations at the start of their container projects, Tanzu Kubernetes Grid is being introduced as a ubiquitous container runtime, combining open source Kubernetes tooling, container images and registry and lifecycle management. Described by VMware as an evolution of its strategy with Enterprise PKS (which will remain as a separate offering), the aim is to make it easier to quickly start using Kubernetes in a consistent way across multiple environments, alongside existing VMware deployments.

Speaking of which, a new version of VMware’s flagship vSphere suite is also being introduced, and it includes a range of Kubernetes-friendly features. Previously teased as Project Pacific, vSphere 7 has been ‘fundamentally modernised’, according to VMware, and re-architected to put Kubernetes management at its heart.

VMs are not being forgotten about, of course – the goal is to enable VMware admins to easily run containers and VMs concurrently. vSphere 7 has also been optimised for simplified lifecycle management, allowing enterprises to manage hundreds or thousands of instances in less time, with fewer tools, including introducing REST and JSON-based APIs for automating lifecycle management tasks.
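VMware’s existing vSphere Automation REST API gives a flavour of what this kind of REST/JSON automation looks like in practice. The sketch below builds requests to open an API session and list VMs (endpoint paths per the vSphere 6.5+ Automation API; the host name and credentials are placeholders):

```python
# Hedged sketch: REST/JSON automation against vSphere's Automation API.
# Endpoint paths follow the vSphere 6.5+ REST API; host and credentials
# below are placeholders, not real infrastructure.
import base64
import json
import urllib.request

def session_request(host, user, password):
    """Build the POST request that opens an API session (basic auth)."""
    req = urllib.request.Request(
        f"https://{host}/rest/com/vmware/cis/session", method="POST")
    creds = base64.b64encode(f"{user}:{password}".encode()).decode()
    req.add_header("Authorization", f"Basic {creds}")
    return req

def vm_list_request(host, token):
    """Build the GET request that lists VMs, using the session token."""
    req = urllib.request.Request(f"https://{host}/rest/vcenter/vm")
    req.add_header("vmware-api-session-id", token)
    return req

# Usage against a real vCenter (network calls, so not executed here):
#   with urllib.request.urlopen(session_request(host, user, pw)) as r:
#       token = json.load(r)["value"]
#   with urllib.request.urlopen(vm_list_request(host, token)) as r:
#       for vm in json.load(r)["value"]:
#           print(vm["name"], vm["power_state"])
```

The appeal for lifecycle automation is that every operation is an HTTP call returning JSON, so the same scripts can drive hundreds of instances without vendor-specific tooling.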

vSphere 7 also introduces greater security through remote attestation, where a trusted host is used to verify the integrity of other hosts within the network. Elsewhere, vMotion has been improved to allow for easier migration of large VMs with minimal disruption and the Distributed Resource Scheduler now runs every minute as opposed to every five minutes.

GPU virtualisation is now on offer too, thanks to the company’s recent acquisition of Bitfusion, which is being touted as a particular benefit for those looking to run machine learning workloads.

The aforementioned tools are also being rolled into VMware Cloud Foundation 4 with Tanzu, which includes vSAN 7 for managing virtualised storage.

“Kubernetes is still hard,” VMware CEO Pat Gelsinger said. “We’re democratising Kubernetes into the industry, with the most powerful platform, the most powerful infrastructure community across multiple clouds. This for us is an important day, not just for us, but for our customers, and for the industry.”

VMware Tanzu Application Catalog, Tanzu Mission Control and Tanzu Kubernetes Grid are available now, while VMware Cloud Foundation 4 and VMware vSphere 7 are scheduled to be available by the start of May.

HPE adds ‘5G as a service’ suite to GreenLake portfolio


Keumars Afifi-Sabet

10 Mar, 2020

HPE has added a suite of ‘as a service’ networking capabilities to its GreenLake portfolio designed to give customers the tools to accelerate 5G deployment.

The HP spinoff is aiming to extend its reach with telecoms firms and enterprises with its new 5G tools, which are designed to enhance existing 5G networks and ramp up the scale of infrastructure rollout.

For example, the company’s cloud-native and container-based software platform, dubbed 5G Core Stack, will provide telecoms firms with 5G tech at the core of their mobile networks.

This is to ensure that networks are embedded with 5G technology at their hearts as well as at the edge, termed standalone 5G networks. This contrasts with non-standalone 5G networks (5G running on 4G cores), which is how many operators run their networks today.

HPE hopes that mobile network operators and virtual network operators can adopt the technology at the core and the edge, and repackage these platforms to serve their own enterprise customers.

This is in addition to technology from HPE subsidiary Aruba, which has been used to launch services geared towards raising interoperability and integration between 5G and Wi-Fi 6 networks. These services are dubbed Air Slice and Air Pass.

“Openness is essential to the evolutionary nature of 5G and with HPE 5G Core Stack telcos can reduce operational costs, deploy features faster and keep themselves open to multiple networks and technologies while avoiding being locked-in to a single vendor approach,” said Phil Mottram, VP and GM for HPE’s communications and media solutions division.

“HPE has one of the broadest 5G portfolios in the market and is uniquely positioned to help telcos build an open multi-vendor 5G core, optimise the edge with vRAN, and deliver connectivity and new compute services to the enterprise using MEC and Wi-Fi 6.”

The ‘as a service’ suite is all-encompassing, including both hardware and supporting software, as well as cloud-native 5G functions such as Air Slice and Air Pass.

While HPE initially hopes to complement technology offered by the likes of networking giants Huawei and Ericsson, Mottram conceded that smaller customers may opt to replace all services offered by these firms with HPE technology.

This bold move to give network operators and enterprises an alternative has been made possible due to the open nature of 5G standards.

These standards were devised to break the stranglehold that existed previously, and allow enterprises to effectively mix and match elements of their 5G infrastructure in a way that wasn’t possible with previous generations of technology like 3G and 4G.

The technology, which includes the underlying 5G infrastructure, as well as support software at both the core and the edge, will be made available to customers on a consumption-based model through the company’s GreenLake portfolio.

With 50 conversations currently underway with prospective customers, HPE expects larger enterprises to adopt a sample of the 5G as a service portfolio, while smaller firms are more likely to adopt the full capabilities on an end-to-end basis.

In terms of cloud-native 5G functions that can work on top of the underlying 5G infrastructure, meanwhile, HPE launched a couple of enterprise-oriented services powered by Aruba’s 16.5 million hotspots.

Air Slice, for example, allows customers to carve up their networks into segments with dedicated functions to avoid crosstalk. Air Pass, meanwhile, gives individuals the capacity to join Wi-Fi networks without having to manually enter credentials.

Users’ identities are verified using their ties with another entity, such as a bank or a mobile network, in a similar way to one-click social logins such as Facebook Login or Sign in with Apple.

“As part of the foundational capability we’re talking about here is a shared data environment, so having a data model you can utilise across different functions in the capabilities and sharing it across different functions,” HPE’s chief technologist for communications and media, Chris Dando told IT Pro.

He added HPE was looking at how the end experience could be made seamless, with individuals not just tied to a physical SIM or looked on as being a phone number, but retaining their individuality. This is just one aspect of the suite of cloud-native 5G functionality the firm is hoping to build out.

“Those sorts of things are where we’ve led the way and are taking that to the next layer with regards to building out some of these core capabilities,” he continued.

“That’s going to become more and more important if you look to add on more device types for different use cases, as you get into IoT, and being able to identify groups of things as being part of a particular workflow or enterprise-type environment.”

Cisco, incidentally, outlined a similar Wi-Fi hopping technology at its flagship Cisco Live 2020 conference in January, which has already been deployed at the Fira de Barcelona.

It was expected to allow visitors to Mobile World Congress (MWC) to hop seamlessly between 4G and 5G networks and the venue’s Wi-Fi networks, before the event was cancelled due to the global coronavirus outbreak.

Lloyds Banking Group signs up to Google Cloud in five-year partnership

Google Cloud continues to secure the big-ticket clients, with the company announcing that Lloyds Banking Group is set to embark on its ‘cloud transformation’ journey with Google.

The bank will invest a total of £3 billion ($3.9bn) in a five-year deal which will see Lloyds deploy various Google Cloud services focused around the customer experience. Google Cloud will also ensure that Lloyds engineers are trained to ‘enhance disciplines… all in an effort to boost efficiency and offer innovative new services to the bank’s retail and commercial customers.’

“The size of our digital transformation is huge and Google Cloud’s capabilities will help drive this forward, increasing the pace of innovation, as well as bringing new services to our customers quickly and at scale,” said Zaka Mian, group transformation director at Lloyds in a statement. “This collaboration gives us a strategic advantage to continue as a leader in banking technology.”

Alongside retail and healthcare, financial services is one of the three primary customer target areas for Google. HSBC is arguably the best-known financial customer, with the company speaking at Google Next as far back as 2017. In September Srinivas Vaddadi, delivery head for data services engineering at HSBC, elaborated on the bank’s ongoing process of moving its legacy data warehouse onto BigQuery. Other Google Cloud financial services customers include PayPal, Ravelin, and Charles Schwab.

Recent customer wins include Major League Baseball, which is discontinuing its relationship with Amazon Web Services as a result, and Hedera Hashgraph.

“Banking customers today expect secure access to their funds, without downtime, and delivered through the modern experiences they receive in other aspects of their lives,” said Google Cloud CEO Thomas Kurian. “We are proud to work with such a storied institution as Lloyds, which helped to create – and continues to redefine – the next generation of financial services.”

Picture credit: Lloyds Branch Manchester Exterior, by Money Bright, used under CC BY 2.0


Google Cloud joins Lloyds’ digital transformation project


Bobby Hellard

10 Mar, 2020

Lloyds Banking Group has agreed on a five-year partnership with Google Cloud to further push its digital transformation work.

The deal comes under Lloyds’ £3 billion internal investment to upgrade its IT systems to compete in the increasingly digitised world of finance.

The UK-based bank, which serves 26 million customers, will collaborate with Google Cloud to help drive its overall cloud transformation programme and streamline its operations. The five-year “strategic collaboration” will see Lloyds use a number of Google Cloud services to help modernise and improve its customer experience.

This will also include Google Cloud experts on-site to enhance engineering processes, all in an effort to boost efficiency and offer innovative new services to the bank’s retail and commercial customers.

“Banking customers today expect secure access to their funds, without downtime, and delivered through the modern experiences they receive in other aspects of their lives,” says Thomas Kurian, Google Cloud’s CEO.

“We are proud to work with such a storied institution as Lloyds, which helped to create – and continues to redefine – the next generation of financial services.”

In 2018, Lloyds announced a partnership with Thought Machine, investing £11 million in upgrading its IT infrastructure. That came with announcements of 6,000 job cuts and a promise to create 8,000 more digital roles in the near future.

“As the UK’s largest digital bank we have made strong progress in transforming not only our systems but also how we work,” said Zaka Mian, group transformation director, Lloyds Banking Group.

“The size of our digital transformation is huge and Google Cloud’s capabilities will help drive this forward, increasing the pace of innovation, as well as bringing new services to our customers quickly and at scale.”

AWS “likely to succeed” in JEDI appeal


Bobby Hellard

9 Mar, 2020

The Department of Defence (DoD) improperly evaluated a Microsoft storage price scenario, the US judge that halted work on the JEDI contract has said.

Federal Claims Judge Patricia Campbell-Smith added that, because of this, AWS is “likely to succeed” in its court challenge, according to Reuters.

The Joint Enterprise Defence Infrastructure project, awarded to Microsoft in October, is a $10 billion project to migrate the DoD’s computer systems. It has been mired in controversy from the start, with companies like IBM and Oracle attempting legal action over its single-vendor nature.

A large part of AWS’ complaint was that the bidding process was unfairly influenced by US President Donald Trump, but that was not mentioned in the Judge’s opinion.

Campbell-Smith wrote that Amazon “is likely to succeed on the merits of its argument that the DoD improperly evaluated” a Microsoft price scenario. She added that the scenario assessed by the Pentagon was not “technically feasible”.

In response, Microsoft dismissed the decision, saying it focused on a “lone technical finding” of data storage under one price scenario out of six, according to Reuters. The company said it was confident it would eventually be able to move on with the project.

“The government makes clear that in their view Microsoft’s solution met the technical standards and performed as needed,” a Microsoft spokesperson said.

This could be seen as the first real tech-related argument of the JEDI saga, with large sections of the lawsuit focusing on the alleged interference of US President Donald Trump, who reportedly became concerned with the bidding when other cloud providers complained of being unfairly excluded.

Trump also has a long-running feud with Amazon CEO Jeff Bezos, which is seen as a possible reason for his alleged involvement. Former Pentagon Secretary James Mattis is thought to have said that Trump directed him to “screw Amazon” out of a chance to bid on the contract.

Panda Security to be acquired by WatchGuard


Keumars Afifi-Sabet

9 Mar, 2020

WatchGuard Technologies has agreed to purchase Spanish antivirus software developer Panda Security and integrate its systems into its own security platform to service customers and partners of both companies.

WatchGuard aims to provide a complete cyber security portfolio of products and services for its customers, and will, over time, integrate Panda’s technology into its own systems, not least the firm’s user-centric threat detection and response tools.

Panda launched in 1998 as a security firm specialising in IT security services, predominantly antivirus software. Its popular Free Antivirus package has long been a favourite for home-based online security, although Panda has expanded its range of services to offer tools for businesses, as well as technology for preventing cyber crime.

Beyond providing user-focused security services, WatchGuard is known for producing secure Wi-Fi appliances, such as the recently-reviewed Firebox M670.

“Businesses today face an increasingly sophisticated and evolving threat landscape, scarcity of trained security professionals, and an increasingly porous perimeter,” said WatchGuard’s CEO Prakash Panjwani.

“As a result, network security, advanced endpoint protection, multi-factor authentication, secure networking, and threat detection and response capabilities are consistently ranked as top security investment areas by IT decision-makers and IT solution providers who serve them.

“By bringing the companies together, we enable our current and future customers and partners to consolidate their fundamental security services under a single brand, backed by the innovation and quality that is a core part of both companies’ DNA.”

Among Panda’s most highly-prized assets is the patented technology TruPrevent, which amounts to proactive capabilities geared towards blocking unknown malware. This is in addition to the Collective Intelligence model it deploys to detect, analyse, and classify viruses in real-time.

The tools Panda is feeding into WatchGuard’s platforms are powered by a combination of automation, AI processes, and analyst-led investigation. The company also recently launched a sophisticated threat hunting service available for enterprises, as well as managed security service providers (MSSPs) who resell Panda services.

“We are thrilled to merge with WatchGuard because of the new scale and portfolio access it provides to Panda Security customers and partners,” said Panda’s CEO Juan Santamaria.

“We are also excited to see our innovative product portfolio be delivered via WatchGuard’s strong global network of partners. Together, we look forward to building a security platform that bridges the network and user perimeter, with capabilities that are unmatched in the cybersecurity market.”

The agreement, the value of which has not been disclosed, is expected to close in the second quarter of 2020.

Gartner’s cloud AI developer services Magic Quadrant sees AWS edge out Microsoft and Google

The hyperscalers continue to lead where others fear to tread – or at least those without the cavernous pockets of the biggest cloud players. Gartner has gazed into its crystal ball and ranked the main vendors for cloud AI developer services, and it’s a 1-2-3 for Amazon Web Services (AWS), Microsoft and Google.

In its latest Magic Quadrant, Gartner defines ‘cloud AI developer services’ as “cloud-hosted services/models that allow development teams to leverage AI models via APIs without requiring deep data science expertise…deliver[ing] services with capabilities in language, vision, and automated machine learning.” In other words, companies that can provide a one-stop shop for applications and models across the stack are likely to fare well here.
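For a flavour of what “AI via APIs without deep data science expertise” means in practice, here is a sketch that builds a label-detection request for Google’s Cloud Vision API. The payload shape follows the public v1 REST API; authentication and the HTTP call itself are elided:

```python
# Sketch: building a label-detection request body for Google's
# Cloud Vision API (v1 images:annotate). Auth and the HTTP POST are
# elided; the image bytes below are a stand-in, not a real image.
import base64
import json

def label_request(image_bytes, max_results=5):
    """Build the JSON body for an images:annotate label-detection call."""
    return {
        "requests": [{
            "image": {
                # The API expects the raw image base64-encoded.
                "content": base64.b64encode(image_bytes).decode("ascii"),
            },
            "features": [
                {"type": "LABEL_DETECTION", "maxResults": max_results},
            ],
        }]
    }

body = json.dumps(label_request(b"\x89PNG..."))
# POST this body to https://vision.googleapis.com/v1/images:annotate
# with an API key or OAuth token; the response is plain JSON labels.
```

No model training or data science is involved on the caller’s side, which is exactly the pitch Gartner’s definition captures.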

AWS nudged out Microsoft and Google in terms of ability to execute and completeness of vision, but as the ranking shows, the three leaders are tightly bunched. IBM also placed in the leaders’ quadrant, with 10 vendors assessed in total.

At last count – or rather, what CEO Andy Jassy told re:Invent attendees in December – AWS had more than 175 services in total available to customers. Machine learning (ML), principally through the SageMaker portfolio, comprises more than one in nine of these, and was an area where AWS’ view ‘continued to evolve’, Jassy said.

This breadth of portfolio, according to the Gartner analysis, is both a gift and a curse. While AWS’ offerings cater to myriad industries and customer journeys, whether or not teams have ML skills, Gartner noted the complexity ‘poses some challenges’ both for individual developers and enterprise application leaders.

Google, meanwhile, has previously touted itself as the cloud leader in machine learning, with a lot of initial advantage being built up through the leadership of Fei-Fei Li, who left in late 2018 to continue her work at Stanford University. Like AWS, many of its customer wins – Sainsbury’s being a good example in the hotly-contested retail market – cite ML front and centre as the key differentiator. As regular readers of this publication will be aware, Google Cloud has significantly increased its press output, with new customers, initiatives and partnerships appearing on an almost daily basis.

Gartner summed up Google’s offering similarly. The company was praised for the depth of languages available and its AutoML and image recognition tools, but its drawbacks were tied to its lesser standing in cloud infrastructure compared with AWS and Azure. Thomas Kurian’s leadership ‘attracted positive feedback… but the organisation is still undergoing substantial change, the full impact of which will not be apparent for some time’, Gartner wrote.

Microsoft, sitting between AWS and Google, received a similar assessment, going relatively quietly about its business. The company’s cloud AI developer services were ‘among the most comprehensive on the market and all highly competitive’, according to Gartner, but the analyst firm cautioned that its branding strategy was ‘confusing’ due to multiple business units – Azure, Cortana, and more.

The visionaries section at bottom right, where vision is praised but scale is not quite there yet, is often an interesting marker for future stars. Aible, a business intelligence (BI) platform provider which promises to automate the AutoML lifecycle, noted Gartner’s verdict that visionaries were ‘likely to excel’ in AutoML, a segment viewed as the most important for application leaders and development organisations.

Ultimately, however, Gartner’s cloud AI developer services Magic Quadrant has a somewhat similar feel to its cloud IaaS ranking – at least at the top right. The full list of vendors analysed was: Aible, Amazon Web Services, Google, H2O.ai, IBM, Microsoft, Prevision.io, Salesforce, SAP, and Tencent.

Disclosure: CloudTech procured the Quadrant through the AWS landing page – you can do so too by visiting here (email required).
