All posts by James

Google Cloud launches Cloud Dataproc on Kubernetes in alpha

Google Cloud has announced the launch of Cloud Dataproc on Kubernetes, adding another string to the bow for the product which offers a managed cloud service for running Apache Spark and Hadoop clusters.

Google – which originally designed Kubernetes before handing it to the Cloud Native Computing Foundation (CNCF) – is promising ‘enterprise-grade support, management, and security to Apache Spark jobs running on Google Kubernetes Engine clusters’, in the words of a blog post confirming the launch.

Christopher Crosbie and James Malone, Google Cloud product managers, noted the need for Cloud Dataproc to utilise Kubernetes going forward. “This is the first step in a larger journey to a container-first world,” Crosbie and Malone wrote. “While Apache Spark is the first open source processing engine we will bring to Cloud Dataproc on Kubernetes, it won’t be the last.

“Kubernetes has flipped the big data and machine learning open source software world on its head, since it gives data scientists and data engineers a way to unify resource management, isolate jobs, and build resilient infrastructures across any environment,” they added. “This alpha announcement of bringing enterprise-grade support, management, and security to Apache Spark jobs on Kubernetes is the first of many as we aim to simplify infrastructure complexities for data scientists and data engineers around the world.”
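For readers wanting a sense of how Spark jobs reach Dataproc in practice, here is a minimal sketch using the google-cloud-dataproc Python client to submit a job to an existing cluster. The project, region and cluster names are placeholders, and since the GKE-backed clusters announced here are only in alpha, cluster creation and example jar paths may well differ – but submission goes through the same Dataproc jobs API.

```python
# Minimal sketch: submitting a Spark job via the Dataproc Python client.
# Project, region and cluster names below are hypothetical placeholders.
from google.cloud import dataproc_v1

project_id = "my-project"
region = "us-central1"
cluster_name = "example-cluster"  # a GKE-backed cluster in the alpha

# The job controller endpoint is regional.
job_client = dataproc_v1.JobControllerClient(
    client_options={"api_endpoint": f"{region}-dataproc.googleapis.com:443"}
)

# A standard SparkPi example job, as shipped with Spark distributions.
job = {
    "placement": {"cluster_name": cluster_name},
    "spark_job": {
        "main_class": "org.apache.spark.examples.SparkPi",
        "jar_file_uris": ["file:///usr/lib/spark/examples/jars/spark-examples.jar"],
        "args": ["1000"],
    },
}

submitted = job_client.submit_job(
    request={"project_id": project_id, "region": region, "job": job}
)
print(f"Submitted job {submitted.reference.job_id}")
```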

To say Kubernetes is not a major priority for vendors and customers alike would be something of a falsehood. At the VMworld jamboree in San Francisco two weeks ago, virtualisation giant VMware went big on the technology, with the headline launch being VMware Tanzu, a product portfolio aimed at enterprise-class building, running and management of software on Kubernetes.

As this publication put it when KubeCon and CloudNativeCon hit Barcelona back in May, it was a ‘milestone’ for the industry. Brian Grant and Jaice Singer DuMars certainly thought so; the Google Cloud pair’s blog post at the time agreed Kubernetes had ‘become core to the creation and operation of modern software, and thereby a key part of the global economy.’

The goal now is to get the most out of it, whether you’re an enterprise decision maker or a developer. Writing for CloudTech last month, Ali Golshan, co-founder and CTO at StackRox, noted the acceleration in user deployments. “Despite the fact that container security is a significant hurdle, containerisation is not slowing down,” Golshan wrote. “The advantages of leveraging containers and Kubernetes – allowing engineers and DevOps teams to move fast, deploy software efficiently, and operate at unprecedented scale – are clearly overcoming the anxiety of security concerns.”

Golshan also noted, through StackRox research, that Google still ranked third among the hyperscalers for container deployments in the public cloud but had gained significantly in the past six months.

“Enterprises are increasingly looking for products and services that support data processing across multiple locations and platforms,” said Matt Aslett, research vice president at 451 Research. “The launch of Cloud Dataproc on Kubernetes is significant in that it provides customers with a single control plane for deploying and managing Apache Spark jobs on Google Kubernetes Engine in both public cloud and on-premises environments.”


Microsoft expands European Azure presence with Germany and Switzerland launches

Microsoft has announced the launch of new Azure availability in Germany and Switzerland, citing increased data residency and security concerns as key to European expansion.

Azure is now available from cloud data centre regions in Zurich and Geneva – the Switzerland launch was announced at the end of last month – while Germany’s newest regions, Germany North and Germany West Central, are based in Berlin and Frankfurt respectively.

The communications announcing the Germany and Switzerland releases, from Azure corporate vice president Tom Keane, were almost identical, save swapping out a customer story here and stock photo there. Among Microsoft’s German customers are Deutsche Bank, Deutsche Telekom and SAP, while Swiss companies utilising Azure include Swisscom, insurance firm Swiss Re, and wealth manager UBS Group.

Customers in Germany are promised compliance specific to the country, including C5 (Cloud Computing Compliance Controls Catalogue) attestation. Alongside Office 365, Dynamics 365 and Power Platform, customers will be able to benefit from containers, Internet of Things (IoT), and artificial intelligence (AI) solutions, Microsoft added.

“These investments help us deliver on our continued commitment to serve our customers, reach new ones, and elevate their businesses through the transformative capabilities of the Microsoft Azure cloud platform,” wrote Keane.

Microsoft is not the first hyperscaler to hoist its flag atop Switzerland, with Google opening its Zurich data centre region back in March. Both Google and AWS have sites in Frankfurt, with AWS first to launch there back in 2014.

This is not the end of the European expansion for Microsoft, with two new regions in Norway planned. The sites, in Stavanger and Oslo, are set to go live later this year.


New initiative aims to create ‘first ocean-powered data centre’ in Scotland

Remember Project Natick, Microsoft’s experiment last year in placing a data centre underwater off Orkney? A little further down the Scottish coast, another company is looking to use the ocean waves to create sustainability in infrastructure – but through a slightly different method.

SIMEC Atlantis Energy is looking to build the ‘first ocean-powered data centre in the world’ in Caithness, with the aim of attracting a hyperscale cloud infrastructure provider for its hosting needs. The facility will be powered by electricity delivered over a private wire network from tidal turbines at MeyGen, an existing project site.

“The MeyGen project has a seabed lease and consents secured for a further 80MW of tidal capacity, in addition to the 6MW operational array which has now generated more than 20,000MWh of electricity for export to the grid,” the company notes in its press materials.

SIMEC Atlantis is looking to partner with engineering firm AECOM to assess the feasibility of the project, with particular regard to connectivity; the target date for operations is set at 2024. The company noted a smaller initial data centre module could be deployed sooner.

“This exciting project represents the marriage of a world-leading renewable energy project in MeyGen with a data centre operator that seeks to provide its clients with a large amount of computing power, powered from a sustainable and reliable source – the ocean,” said Tim Cornelius, SIMEC Atlantis CEO. “At MeyGen we have many of the ingredients to provide clean power to the data centre, including a large grid connection agreement, proximity to international fibre optic connections and persistent cool weather.”

Cornelius added that Scotland can ‘play a key role in the global data centre industry’, thanks to the dual advantage of a more temperate climate and access to clean energy. Speaking to the BBC last year about Project Natick, Microsoft confirmed Orkney’s location was chosen primarily because of its renewable energy expertise.

Scandinavia has seen various energy-efficient initiatives taking place, many taking advantage of its suitable geography. Last year, Nordics-based provider DataPlex announced it was reusing waste heat from its data centre facilities to warm apartments in Oslo. Last month, the company launched a guide to help businesses solidify their data centre strategies – with sustainability a key message. Various stakeholders are involved; as far back as 2015, this publication reported on a study in Sweden – later passed as legislation – to give data centre providers tax breaks on electricity.

In terms of the biggest cloud providers, Google announced last April that it was running all of its clouds on renewable energy. Amazon Web Services (AWS) is not at that level yet, announcing in April new projects with the goal of achieving 100% renewable energy for its global infrastructure. A report from Greenpeace at the time, however, argued some of AWS’ data centres were running on as little as 12% renewable energy.

Writing for this publication in July, Hiren Parakh, senior director of cloud services EMEA at OVH, noted the key trends emerging to create sustainability in the data centre industry. “Through a fully integrated industrial model, providers are capable of building systems that are more energy efficient and should always strive to optimise the use of data centres and server resources across their customer base,” wrote Parakh. “When it comes to managing and fitting out a data centre, it’s clear that sustainability needs to be top of mind.”


Microsoft to acquire cloud migration tool provider Movere

Microsoft is to acquire Movere, a SaaS platform which increases visibility into IT environments, the companies have announced.

Movere – which industry watchers may remember as Unified Logic before its rebrand last year – aims to ‘capture, integrate and analyse the data [companies] need to make smart decisions about their IT environment’, as the company puts it.

The company’s dashboard organically scans global environments at rates of up to 1,000 servers per hour, and covers multiple parts of the cloud migration journey as well as cybersecurity.

Movere has been a partner of Microsoft for more than 10 years and will join the Azure team as part of Azure Migrate, according to a Microsoft blog post.

“We’re committed to providing our customers with a comprehensive experience for migrating existing applications and infrastructure to Azure, which include the right tools, processes, and programs,” wrote Jeremy Winter, partner director for Azure management. “As part of that ongoing investment, we’re excited to welcome the leadership, talent, technology, and deep expertise Movere has built in enabling customers’ journey to the cloud over the last 11 years.”

For Kristin Ireland, CEO of Movere, the acquisition was a time of reflection on the company’s journey to date.

“On our journey to cloud, we made mistakes that cost us valuable time and resources that we didn’t have,” Ireland wrote. “As we spread our wings in the cloud, we realised the cloud was the embodiment of Movere – the unleashing of business potential through migration – we knew we had to be part of that journey for as many customers as we could.

“We passionately believe the cloud journey is what opens the door to market disrupting ideas and opportunities; to be part of that journey with customers and partners is a privilege,” added Ireland. “Thank you to our partners and customers for allowing us to be part of that journey thus far; we are so excited to continue to be faster and better for you as part of the Microsoft Azure team.”

The move represents the first cloudy acquisition of 2019 for Microsoft, aside from its investment in big data analytics platform Databricks back in February. This year has been relatively quiet thus far on the acquisition front for the big two; Amazon Web Services (AWS) acquired CloudEndure for a reported $250 million at the start of this year, followed by acquisitions of TSO Logic and E8 Storage for undisclosed sums.

Google Cloud, meanwhile, has made more of a statement around its enterprise ambitions with three acquisitions. The company bought business intelligence platform Looker in an all-cash $2.6bn transaction in June, alongside deals for Alooma and Elastifile in February and July respectively.

Financial terms of the Microsoft and Movere deal were not disclosed.


Cloud security woes strike again – and it’s double trouble for multi-cloud users, research finds

A survey of C-suite executives from Nominet has found that, for more than half of respondents, cloud security remains a concern – one which becomes even more acute when multi-cloud enters the picture.

The study, which polled 274 CISOs, CIOs and CTOs, found 52% were at least moderately concerned about security with regards to cloud adoption. One in five respondents said they were ‘very’ concerned, compared to one in 10 who said they were not at all concerned.

Almost half (48%) of those polled said their organisation had a multi-cloud approach. Yet respondents using a multi-cloud approach were significantly more likely to have suffered a data breach – 52% affirmed this compared with only 24% of hybrid cloud users.

When it came to the specific threats organisations face, respondents were most concerned over exposure of customer data, increased threat surfaces, and improving cybercriminal sophistication.

Almost two thirds (63%) of those polled said they already outsourced certain security services to managed providers. Critical national infrastructure (CNI), hospitality and transport were the industries least likely to outsource some of their security operations. “Most organisations are happy to outsource when it comes to security, and appear to believe the practice improves their security profile,” the report notes.

The report naturally went through the rigmarole of cloud adoption statistics, of which a selection is presented here. The most interesting aspect was that Google Cloud proved the most popular choice of the big clouds, with 56% of respondents saying they used it. AWS (32%), perhaps even more interestingly, finished last, behind Azure (36%), Oracle (44%) and IBM (49%).

88% of survey respondents said their organisation was either currently adopting, or planning to adopt, cloud and software as a service (SaaS). 71% overall said they had adopted SaaS, compared with IaaS (60%), PaaS (30%) and business process as a service (BPaaS – 30%). A quarter of respondents said they had adopted function as a service (FaaS).

“The maturity of the cloud means that not only are businesses willing to use it for the delivery of operations and IT services, they are also embracing it for security tools and managed services,” the report notes. “And as businesses look at how the cloud can help make them more secure, ease of integration is top of mind – whether that’s with on-premise applications or other cloud services.

“The move to the cloud won’t be an all-encompassing migration,” the report adds. “Businesses will want to make the most of existing investments and only adopt cloud alternatives once these have reached the end of their product lifecycle.

“Organisations today therefore need cloud security tools that are flexible enough to secure the enterprise as it is today, and as it will be tomorrow.”

You can read the full Nominet report here (email required).


There is a downturn in cloud and data centre infrastructure spending – and China is causing it

Any regular reader of this publication will have noted the regularity with which the largest cloud players – Amazon Web Services (AWS), Microsoft Azure, Google Cloud et al – post solid quarterly financial results. While Wall Street may not have been happy with all of the postings, growth has remained, albeit dipping from the triple-figure climbs of previous years.

This hyperscaler growth has often been backed up with strong spending across hardware assets. Yet two research companies have noted a decline in the most recent quarters across their industry segments. Both have blamed downturns in China for the change, though neither sees the decline as irreversible.

Synergy Research, a long-time cloud infrastructure market analyst, noted in August that hyperscale capex was down 2% year on year, with the most recent quarter seeing more than $28 billion in spending. The first quarter of this year, at nearer $25bn, followed a similar pattern: Q1 2018’s figure was still above it, even after accounting for Google’s one-off $2.4bn spend on Manhattan real estate in that quarter.

China’s expenditure declined by 37% year on year in Q2, Synergy noted, with Alibaba, Tencent, JD.com and Baidu all reluctant to spend. All other areas saw nominal increases; the US saw the most with 5% yearly, ahead of EMEA (3%) and the rest of APAC (2%). Taking China out of the mix would see overall figures jump 4% year on year.
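As a back-of-the-envelope illustration of how a sharp drop in one region can flip the aggregate figure, the snippet below uses purely hypothetical regional capex numbers (not Synergy’s actual breakdown), chosen to mirror the percentages above: modest growth everywhere else plus a 37% drop in China turns roughly 4% ex-China growth into a roughly 2% overall decline.

```python
# Hypothetical capex figures in $bn, illustrative only -- not Synergy data.
# They are chosen so the regional growth rates match those reported above.
last_year = {"US": 15.0, "EMEA": 6.0, "APAC ex-China": 3.0, "China": 4.3}
this_year = {"US": 15.75, "EMEA": 6.18, "APAC ex-China": 3.06, "China": 2.71}

def growth(regions):
    """Year-on-year growth, in percent, across the given regions."""
    prev = sum(last_year[r] for r in regions)
    curr = sum(this_year[r] for r in regions)
    return (curr - prev) / prev * 100

all_regions = list(last_year)
ex_china = [r for r in all_regions if r != "China"]

print(f"All regions:     {growth(all_regions):+.1f}%")  # roughly -2%
print(f"Excluding China: {growth(ex_china):+.1f}%")     # roughly +4%
```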

Synergy’s figures come from the data centre and capex footprint of 20 of the world’s largest cloud and internet service firms. The ‘big five’, in this instance Google, Amazon, Microsoft, Facebook and Apple, usually dominate.

“Usually it is the big five that dictate the scale and trends in hyperscale capex, but the drop-off in spending in China has been so marked that an otherwise strong worldwide growth story has been transformed into a modest capex decline,” said John Dinsdale, a chief analyst at Synergy.

This situation is echoed when it comes to data centre switches. According to telecom and network analyst Dell’Oro Group, ‘weakness in China’ suppressed data centre switch market growth in Q219. The decline was the first seen in five years, down to a slowdown in spending from both cloud service providers and enterprises, as well as continued uncertainty over Huawei.

“In contrast, data centre switch market revenue in North America managed to grow despite a slowdown in spending by major cloud service providers,” said Sameh Boujelbene, Dell’Oro Group senior director. “Most of the slowdown was driven by reduced server purchases, while data centre switch performance held up well. Large enterprises also contributed to the growth in North America as they accelerated their 100 GE adoption and helped Cisco emerge as the new leader in 100 GE revenue in Q219.”

“The situation in China is likely to be a short-term phenomenon, however, as the four Chinese hyperscale operators continue to grow revenues more rapidly than their US-headquartered counterparts,” Dinsdale added. “After some short-term financial belt-tightening, we expect to see Chinese capex rise strongly once again.”


How Abbots Care gained greater assurances around data security with a revamped DR and backup strategy

Case study: All data is equal, but for some industries, data is more equal than others. As a result, great care needs to be taken when it comes to keeping that data secure, whether in the cloud or anywhere else.

Healthcare, across its various channels, is a classic example. Some healthcare organisations are moving with less trepidation towards the cloud. In February, for instance, a study from Nutanix found that more than one in three healthcare organisations polled expected to be deploying hybrid cloud solutions by 2021. At the start of this year, pharmacy giant Walgreens Boots Alliance selected Microsoft as its primary cloud provider, with the majority of its infrastructure moving across to Azure.

Regardless of where it is hosted, the non-negotiables for healthcare providers are that data can be accessed whenever it is needed and that its integrity is beyond question.

Abbots Care, a home care company based in Hertfordshire, is, like any responsible UK provider, under the regulatory jurisdiction of the Care Quality Commission. As managing director Camille Leavold puts it, one data breach could mean the company’s licence is taken away.

Leavold therefore wanted more assurance of how secure her company’s data was – and as a result she turned to managed IT services provider Fifosys.

“About two years ago, we were at a stage where we had quite a lot of data,” Leavold tells CloudTech. “Although we were using a company that said our data was secure and safe, we actually didn’t have any way of being able to evidence that.

“Obviously we’re quite in a compliant sector, and we needed to be able to evidence it. That started us looking,” she adds. “We were also looking for a company that was 24/7, because we are too.”

Mitesh Patel, managing director of Fifosys, went through the standard detailed audit when the work originally went out to tender. Basic questions around the backing up of data, recovery times and sign-off process highlighted risks which ‘weren’t acceptable’ to Leavold, as Patel puts it. Fifosys’ solution ties in to the company’s partnership with business continuity provider Datto, whose technology, according to Fifosys technical director James Moss, is ‘effectively a mini-DR test every day.’

Fifosys runs two official recovery tests a year, with the results sent to Leavold, who can then present them to the board. “It’s no longer something hidden where you’ve gone ‘okay, there’s a vendor dealing with it, we’re going to be blind to it’,” Patel tells CloudTech. “The recovery process… they get a report, that’s discussed – is this timeframe acceptable? – [and] are there any tests they want to do outside of this?”

Like many healthcare providers, Abbots Care also needs a good ERP system to ensure all its strands are tied up – particularly with care workers out in the field, checking on their tablets and devices which patients they need to see, their medication, and the service which needs to be provided at that time. "There's a lot for Abbots Care that they need to have up and running, and when you're scheduling so many people out in the field, these systems need to be up," says Patel.

Another consoling aspect is that the company’s backup and disaster recovery is all in one place. “[If] you can’t answer the [audit] questions and you’ve got five or six different vendors involved in delivering your backup, your continuity, applications, recovery… it’s fine you’ve got these vendors in, but your recovery time is extended continuously,” explains Patel. “Who’s actually responsible? Whose neck is on the line in the event that something does happen?”

Outages are unfortunately a fact of life, as even the largest cloud providers will testify, but can be mitigated with the right continuity processes in place. “Continuity was a big, big part for them, and then it’s all in terms of protecting the data and having versions of it,” explains Patel.

“There are organisations who say they’ve got four sites, and [they’re] just going to replicate across those four sites and invest in the same infrastructure on all four. That’s very difficult to maintain, administer and manage,” Patel adds. “When you are testing, you find people are only testing one of their sites rather than all four.

“You should be doing four tests at least twice a year – but the time involved in doing that, many people underestimate [it] and then start compromising.”

You can find out more about the case study by visiting here.


Alibaba, Google Cloud and Microsoft among inaugural members of cloud security consortium

The Linux Foundation has announced the launch of a new community of tech all-stars focused on advancing trust and security for cloud and edge computing.

The open source community, dubbed the Confidential Computing Consortium (CCC), has 10 initial members: Alibaba, Arm, Baidu, Google Cloud, IBM, Intel, Microsoft, Red Hat, Swisscom and Tencent.

“Current approaches in cloud computing address data at rest and in transit but encrypting data in use is considered the third and possibly most challenging step to providing a fully encrypted lifecycle for sensitive data,” the foundation noted in its press materials. “Confidential computing will enable encrypted data to be processed in-memory without exposing it to the rest of the system and reduce exposure for sensitive data and provide greater control and transparency for users.”
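To make the three data states concrete, here is an illustrative Python sketch – using the widely available cryptography package, and deliberately unrelated to any of the consortium’s SDKs – of why ‘in use’ is the hard case: data can stay encrypted at rest and in transit, but a conventional application must decrypt it into ordinary memory to compute on it, which is precisely the exposure trusted execution environments aim to remove.

```python
# Illustrative only: the gap that confidential computing targets.
# Requires the 'cryptography' package; not related to Open Enclave, SGX or Enarx.
from cryptography.fernet import Fernet

key = Fernet.generate_key()
fernet = Fernet(key)

# Data at rest: stored encrypted on disk or in object storage.
ciphertext = fernet.encrypt(b"patient_id=1234,balance=9876")

# Data in transit: the ciphertext (or a TLS channel) protects it on the wire.

# Data in use: to compute on it, a conventional app decrypts it, leaving
# plaintext in ordinary process memory -- visible, in principle, to a
# sufficiently privileged OS, hypervisor or operator.
plaintext = fernet.decrypt(ciphertext).decode()
fields = dict(item.split("=") for item in plaintext.split(","))
print(fields["balance"])

# A trusted execution environment (TEE) moves that decrypt-and-compute step
# into hardware-isolated, encrypted memory, so the plaintext never appears
# to the rest of the system.
```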

Members are encouraged to bring their own projects to the consortium, with Microsoft offering Open Enclave SDK, a framework which allows developers to build trusted execution environment (TEE) applications using a single enclaving abstraction. Intel’s Software Guard Extensions (SGX) SDK aims to help app developers protect select code and data from disclosure or modification at the hardware layer, while Red Hat’s Enarx provides hardware independence for securing applications using TEEs.

This is by no means the only cross-industry collaboration taking place in the cloud space right now. In March, Intel led a cohort of companies in launching Compute Express Link (CXL), an emerging high-speed interconnect standard aimed at improving data centre performance.

Alibaba, Google, and Microsoft are, alongside Intel, members of both initiatives. The three pretenders to the cloud infrastructure throne all made the right noises upon the consortium’s launch.

“We hope the [Open Enclave SDK] can put the tools in even more developers’ hands and accelerate the development and adoption of applications that will improve trust and security across cloud and edge computing,” said Mark Russinovich, Microsoft CTO.

“As the open source community introduces new projects like Asylo and Open Enclave SDK, and hardware vendors introduce new CPU features that change how we think about protecting programs, operating systems, and virtual machines, groups like the CCC will help companies and users understand its benefits and apply these new security capabilities to their needs,” said Royal Hansen, Google vice president for security.

The FAQ section also provides some interesting titbits. Under the question of ‘why does this require a cross-industry effort?’, the CCC responds with the following. “Of the three data states, ‘in use’ has been less addressed because it is arguably the most complicated and difficult. Currently confidential computing solutions are manifesting in different ways in hardware, with different CPU features and capabilities, even from the same vendor.

“A common, cross-industry way of describing the security benefits, risks, and features of confidential computing will help users make better choices for how to protect their workloads in the cloud,” it adds.

One notable absentee from the CCC party is Amazon Web Services (AWS). The launch, at Open Source Summit, may be something of a clue. While AWS promotes its open source initiatives through its @AWSOpen Twitter handle among others, several in the community feel differently about AWS’ relationship with open source players. The launch in January of DocumentDB, a database offering compatible with MongoDB, caused TechCrunch to lead with the brazen headline that AWS had ‘[given] open source the middle finger’. Yet as reported by Business Insider in June, the company is increasingly ‘listening’ to the community.

You can find out more about CCC here.


VMworld 2019: Going big on Kubernetes, Azure availability – and a key ethical message

VMware has kicked off its 2019 VMworld US jamboree in San Francisco with a series of updates, spanning Kubernetes, Azure, security and more.

The virtualisation and end user computing giant issued no fewer than five press releases to the wires alone today, with CEO Pat Gelsinger and COO Sanjay Poonen continuing to emphasise the company's 'any cloud, any application, any device, with intrinsic security' strategy.

Chief among these on the product side was VMware Tanzu, a new product portfolio which aims to enable enterprise-class building, running, and management of software on Kubernetes. Included in this is Project Pacific, an ongoing mission to rearchitect server virtualisation behemoth vSphere around the container orchestration tool.

Except to call Kubernetes a container orchestration tool would be doing it a major disservice, according to Gelsinger. Not since Java and the rise of virtual machines has there been a technology as critical for the cloud as Kubernetes, he noted, in connecting developers and IT ops. This is evidently the goal of Project Pacific as well. Tanzu, we were told, is both Swahili for branch – as in a new branch of innovation – and the Japanese word for container.

Other product news included an update to collaboration program Workspace ONE, including an AI-powered virtual assistant, as well as the launch of CloudHealth Hybrid by VMware. The latter, built on cloud cost management tool CloudHealth, aims to help organisations save costs across an entire multi-cloud landscape and will be available by the end of Q3. 

Analysis: Pick a cloud, any cloud

VMware's announcement of an extended partnership with Google Cloud earlier this month led this publication to consider the company's positioning amid the hyperscalers. VMware Cloud on AWS continues to gain traction – Gelsinger said Outposts, the hybrid tool announced at re:Invent last year, is being delivered upon – and the company also has partnerships in place with IBM and Alibaba Cloud.

Today, it was announced that VMware on Microsoft Azure was generally available, with the facility gradually being switched on across Azure data centres. By the first quarter of 2020, the plan is for availability across nine global regions.

The company's decision not to compete with the biggest public clouds, but to collaborate with them, is one that has paid off. Yet Gelsinger admitted that the company may have contributed to some confusion over what hybrid cloud and multi-cloud truly mean. The answer (below) was interesting.

With organisations increasingly opting for different clouds for different workloads, and changing environments as they go, Gelsinger homed in on a frequent pain point for customers nearer the start of their journeys. Do they migrate their applications or do they modernise? Increasingly, customers want both – the hybrid option. "We believe we have a unique opportunity for both of these," he said. "Moving to the hybrid cloud enables live migration, no downtime, no refactoring… this is the path to deliver cloud migration and cloud modernisation."

As far as multi-cloud was concerned, Gelsinger argued: "We believe technologists who master the multi-cloud generation will own it for the next decade."

Customers, acquisitions, and partnerships

There were interesting customer stories afoot – particularly given the scale and timeframes of their initiatives. FedEx has its fingers in many pies, from VMware Cloud on AWS, to VMware Cloud Foundation, to Pivotal – on which more shortly.

Research firm IHS Markit was what Gelsinger called "a tremendous example of a hybrid cloud customer." The company's goal was to have 80% of its applications in the public cloud, and as a result it was able to migrate 1,000 applications in six weeks. Poonen asked Freddie Mac how many of its 600 apps the financier was migrating. The answer: all of them, bar a negligible few. The initiative started in February, and the plan was to be finished 'by Thanksgiving.'

On the partnership side, VMware announced a collaboration with NVIDIA for the latter to deliver accelerated GPU services for VMware Cloud on AWS. This was again cited as key to helping enterprise customers make the most of their huge swathes of data, and the move also links in with VMware's acquisition of Bitfusion, enabling the company to make GPU capabilities efficiently available for AI and ML workloads in the enterprise.

Gelsinger made a special point of highlighting VMware's most recent acquisitions, with Pivotal and Carbon Black name-checked at the front of the keynote and brought on at the back to discuss where they fit in the VMware stack.

Analysis: Gelsinger's irresistible take on tech expansion

Pat Gelsinger is rapidly turning into a must-listen speaker with regards to the future of technology, its convergence, and its ethical effects. 

This is not to say previous VMworld talks aren't worth your time, of course. Gelsinger has done seven big US events now – seven and a half if you include 2012 when former CEO Paul Maritz was handing over the reins. Yet today he sits in a fascinating position across cloud, network and more, to assess the landscape.  

The opening 2019 VMworld keynote had everything one would expect: customer success stories by the cartload and product announcements by the pound, as seen above. Yet underpinning it was an ethical thread. "As technologists, we can't afford to think of technology as someone else's problem," Gelsinger told attendees, adding VMware puts 'tremendous energy into shaping tech as a force for good.'

Gelsinger cited three benefits of technology which ended up opening Pandora's Box. Free apps and services led to severely altered privacy expectations; ubiquitous online communities led to a crisis in misinformation; while the promise of blockchain has led to illicit uses of cryptocurrencies. "Bitcoin today is not okay, but the underlying technology is extremely powerful," said Gelsinger, who has previously gone on record regarding the detrimental environmental impact of crypto.

This prism of engineering for good, alongside good engineering, can be seen in how emerging technologies are being utilised. With edge, AI and 5G, and cloud as the "foundation… we're about to redefine the application experience," as the VMware CEO put it. 

2018's keynote was where Gelsinger propagated the theme of tech 'superpowers'. Cloud, mobile, AI, and edge were all good on their own, but as a cyclical series could make each other better. This time, more focus was given to how the edge was developing. Whether it was a thin edge, containing a few devices and an SD-WAN connection, a thick edge of a remote data centre with NFV, or something in between, VMware aims to have it all covered.

"Telcos will play a bigger role in the cloud universe than ever before," said Gelsinger, referring to the rise of 5G. "The shift from hardware to software [in telco] is a great opportunity for US industry to step in and play a great role in the development of 5G." 

Reaction: Collaboration between VMware and the hyperscalers – but for how long?

If VMware was at all concerned about the impact of its AWS partnership, the company needn't have worried. One other piece of news was a research study put together alongside Forrester around the total economic impact of VMware Cloud on AWS. 

The study saw Forrester put together a composite organisation, based on companies interviewed, and assessed budgets and migration considerations. The average organisation had 80 servers, $2 million in annual software budgets and a 40 to one ratio of VMs to applications. The composite organisation saved 59% of operational costs in the cloud compared with equivalent capacity on-premises.

Speaking directly after the keynote Bruce Milne, CMO at hyperconverged infrastructure provider Pivot3, praised the strategic approach – but with something of a caveat.

"Pat Gelsinger in his keynote illustrated that [VMware] is not content to rest on its laurels and continue to push the boundaries of technology through development, acquisition and partnerships," Milne told CloudTech. "Software is eating the world, as the old maxim goes, and that was evident through his discussion. He challenged customers to look at the economic benefit of refreshing their hardware and saving money through virtualisation with NSX, [and] declared that VMware is serious about integrating Kubernetes into vSphere, which empowers app developers who may be developing for the cloud or on-prem.

"There's an obvious strategic tension in VMware's collaboration with the hyperscale cloud providers, but for now it appears they've agreed to a collaborative detente," added Milne. "Watch this space because that friction is sure to generate sparks eventually. VMware wants to be the facilitating platform for apps on 'any cloud' – clearly a space that the hyperscale vendors covet as well."

You can take a look at the full list of VMworld 2019 announcements here.

Postscript: A keenly raised eyebrow from this reporter came when Gelsinger referred to William Hill, an NSX customer, as an 'online gaming company'. UK readers in particular will note that this isn't quite telling the whole truth – but let's just say it emphasises the difference between holding your events in San Francisco and Las Vegas.


Cloud performance and change management cited in latest DORA DevOps analysis

The performance figures for those at the sharp end of DevOps implementation keep going up – and cloud usage is increasingly becoming a key differentiator.

That’s the primary finding from the DORA Accelerate State of DevOps 2019 report. The study, which was put together alongside Google Cloud, is a major piece of work, covering more than 31,000 survey responses across six years of research.

The proportion of highest performing teams has almost tripled according to the research, now comprising almost 20% of all teams. What’s more, the highest performers were 24 times more likely than lower performers to be cloud specialists. This was based on executing on the five essential characteristics of cloud computing defined in NIST guidelines: on-demand self-service, broad network access, resource pooling, rapid elasticity, and measured service.

One aspect which needed to be considered was change management. Perhaps not surprisingly, formal change management processes – for instance, requiring approval from an external Change Approval Board – were found to drag down software delivery performance.

“Heavyweight change approval processes negatively impact speed and stability,” the report notes. “In contrast, having a clearly understood process for changes drives speed and stability, as well as reductions in burnout.”

The report further argues that formal change approval processes – industry norms for years – do not aid stability. Research investigated whether a more formal approval process – in other words, fewer releases – was associated with lower change fail rates. Organisations naturally want to introduce additional processes if problems are encountered with software releases; yet this is a red herring, the report argues.

“Instead, organisations should shift left to peer review-based approval during the development process,” the report explains. “In addition to peer review, automation can be leveraged to detect, prevent, and correct bad changes much earlier in the delivery lifecycle. Techniques such as continuous testing, continuous integration, and comprehensive monitoring and observability provide early and automated detection, visibility, and fast feedback.”
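As a trivial, hypothetical illustration of the kind of automated gate the report is describing, the sketch below shows a pytest-style check that would run in continuous integration on every proposed change, blocking a bad change before it ever reaches a change approval board. The function, module names and threshold are invented for the example, not drawn from the DORA research.

```python
# discount.py -- hypothetical code under change
def apply_discount(price: float, percent: float) -> float:
    """Return price reduced by the given percentage."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)


# test_discount.py -- run by CI (e.g. pytest) on every pull request, giving
# the early, automated detection and fast feedback the report describes.
import pytest

def test_typical_discount():
    assert apply_discount(100.0, 20) == 80.0

def test_invalid_percent_rejected():
    with pytest.raises(ValueError):
        apply_discount(100.0, 150)
```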

Ultimately, the research gives a simple conclusion: slow and steady wins the race. “One thing is clear: a DevOps transformation is not a passive phenomenon,” the report notes. “No profiles report strong use of a ‘big bang’ strategy – though low performers use this the most often – and that’s probably for the best.

“In our experience, this is an incredibly difficult model to execute and should only be attempted in the most dire of situations, when a ‘full reset’ is needed,” the report adds. “In the ‘big bang’, everyone needs to be on board for the long haul, with resources dedicated for a multi-year journey. This may explain why this method is seen most often among our low performers.”

You can read the full report here (email required).
