VMworld 2018: Multi-cloud strategies, AWS partnership blossoms, vSAN and NSX updates, and more

At VMworld in Las Vegas, VMware CEO Pat Gelsinger, with the help of some of the best and brightest in the cloud industry, expanded on partnerships and products, as well as the evolution of the multi-cloud landscape.

For the second year running, Andy Jassy, CEO of Amazon Web Services (AWS), took to the stage to give an update on AWS’ growing partnership with VMware, alongside new features. The announcement of an expansion of VMware Cloud on AWS to Asia-Pacific was good – Jassy told the audience the service would be ‘largely’ available across all regions, including GovCloud, by late 2019 – but even better was the announcement of Amazon Relational Database Service (RDS) on VMware.

“You’ll be able to provision databases, you’ll be able to scale the compute, or the memory, or the storage for those database instances, you’ll be able to patch the operating system or the database engines,” said Jassy. “I think it’s very exciting for our customers and I think it’s also a good example of where we’re continuing to deepen the partnership and listen to what customers want, and then innovate on their behalf.”

The service, which will be available in a few months, aims to bring the capabilities of setting up and managing relational databases in the cloud to VMware’s on-premises environments. If users later decide these databases would be better off in AWS for the long term, a smooth migration path is in place.
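Amazon had not published the interface for RDS on VMware at the time of writing, so any code is necessarily speculative. As a rough illustration of the workflow Jassy describes (provision a database, then scale its compute or memory), here is a minimal sketch using the existing boto3 API for standard Amazon RDS; the instance name, classes and credentials are placeholders, and the on-premises variant may well expose a different surface.

```python
# Minimal sketch using the existing boto3 API for standard Amazon RDS.
# RDS on VMware had not shipped at the time of writing, so treat the
# identifiers, instance classes and parameters below as illustrative only.
import boto3

rds = boto3.client("rds", region_name="us-east-1")

# Provision a small managed MySQL instance.
rds.create_db_instance(
    DBInstanceIdentifier="demo-db",          # hypothetical name
    DBInstanceClass="db.t2.medium",
    Engine="mysql",
    MasterUsername="admin",
    MasterUserPassword="change-me-please",   # placeholder credential
    AllocatedStorage=20,                     # storage in GiB
)

# Later, scale compute and memory by moving to a larger instance class,
# without re-provisioning the database itself.
rds.modify_db_instance(
    DBInstanceIdentifier="demo-db",
    DBInstanceClass="db.m4.large",
    ApplyImmediately=True,
)
```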

Jassy cited MIT as a key customer for the primary use case of VMware Cloud on AWS – migrating on-premises applications to the cloud. The university migrated 3,000 VMs from its data centres to the companies’ joint solution in only three months.

The partnership between VMware and AWS, first announced two years ago and updated last year, is evidently blossoming. So much so that VMware half-borrowed a concept from a previous AWS keynote for its own. Whereas Jassy framed his 2016 re:Invent speech on superpowers – supersonic speed, immortality, x-ray vision – and how AWS seemingly enables them, Gelsinger focused on tech superpowers – cloud, mobile, AI/ML, and edge/IoT.

More importantly, while these technologies are changing the way we live and work, they work even better in tandem. “We really see that each one of them is a superpower in their own right, but they’re making each other more powerful,” said Gelsinger. “Cloud enables mobile connectivity, mobile creates more data, more data makes the AI better, AI enables more edge use cases, and more edge requires more cloud to store the data and do the computing.

“They’re reinforcing each other – these superpowers are reshaping every aspect of society.”

VMware’s long-standing vision of ‘any device, any application, any cloud’, with intrinsic security, therefore plays into this. Much was said about VMware Cloud Foundation, which the company sees as “the simplest path to the hybrid cloud”, as Gelsinger put it. The quickest way to get there is through hyperconverged infrastructure, and here VMware announced an update to vSAN designed to ease adoption through simplified operations and more efficient infrastructure. The figures touted at the keynote – more than 15,000 customers and 50% of the Global 2000 – are impressive; Gelsinger said vSAN was “clearly becoming the standard for how hyperconverged is done in the industry.”

Another product announcement, this time multi-cloud flavoured, came in the form of upgrades to the VMware NSX networking and security portfolio. According to the company’s earnings call last week, more than four in five of the Fortune 100 have now adopted NSX. NSX-T Data Center 2.3, which is expected to be available before November, will again aim to give customers greater ease of deployment, as well as extend multi-cloud networking and security to AWS and Microsoft Azure, ‘empowering customers that operate across multiple public clouds to take advantage of local availability zones and the unique services of different cloud providers’, as the press materials put it.

One more multi-cloud themed piece of news was that VMware had acquired Boston-based cloud service management provider CloudHealth Technologies. CloudHealth offers a variety of capabilities in its platform, from streamlined billing to scaling out, enabling organisations to manage their workloads across AWS, Microsoft Azure, Google Cloud Platform and VMware.

The company said last June, when it raised its Series D funding, that the IPO market was something it would watch as it aimed to ‘become the anchor company at the centre of the Boston software technology ecosystem for decades to come.’ Earlier this year CloudHealth announced ‘significant investments’ in Europe. Writing in a blog post, founder and CTO Joe Kinsella said he ‘was gratified to learn early on in discussions that VMware and CloudHealth Technologies share a most important strand of corporate DNA: customer-first.’

“As part of VMware, we will be able to serve you better and offer you a richer set of choices to support your business transformation in the cloud,” Kinsella added.

Ultimately, Gelsinger sees multi-cloud as ‘the next act’ in VMware’s 20-year history, after the server era, BYOD, the network, and cloud migration. The VMware CEO cited a survey from Deloitte which stated the average business today was using eight public clouds.

“As you’re managing different tools, different teams, different architectures – how do you bridge across?” he asked. “This is what we will do in the multi-cloud era – we will help our community to bridge across and take advantage of these powerful cycles of innovation that are going on, but be able to use them across a consistent infrastructure and operational environment.”

You can check out the full list of VMworld news here.

Picture credits: VMware/Screenshot

Cloud hyperscaler capex broke $53 billion for the first half of 2018, says Synergy Research

The capital expenditure of the largest cloud infrastructure players continues to rise – and according to Synergy Research, the first half of 2018 has seen record figures being published.

Total capex for the first half of this year among the hyperscale operators hit $53 billion (£41.1bn), compared with $31bn in the same period last year. Q2’s figures did not quite match Q1’s, but Synergy argues this is down to an anomaly: Google’s confirmation in March of its $2.4bn purchase of Manhattan’s Chelsea Market building, which inflated the Q1 total.

The top five spenders remain Google, Microsoft, Facebook, Apple, and Amazon – and have been for the past 10 quarters. Between them, these five companies account for more than 70% of hyperscale capex. Overall quarterly spending only broke $15bn for the first time in the third quarter of 2016, with a particular ramp over the past 12 months.

It is worth noting too that the list of hyperscale operators tracked has been trimmed from 24 to 20. Synergy explains this is down to a variety of factors: some companies, such as LinkedIn, have been subsumed into others, while others are not spending enough on capex to justify inclusion. In some cases this is because they are moving more of their workloads onto AWS and Azure, to the detriment of their own data centre footprint.

Regardless, this is yet another indicator that the largest players in cloud infrastructure are not resting on their laurels. As this publication has reported, Google has looked to expansion in Finland and Singapore in the past three months, while Microsoft, in an experimental move, put a data centre underwater off the Orkney Islands.

“Hyperscale capex is one of the clearest indicators of the growth in cloud computing, digital enterprise and online lifestyles,” said John Dinsdale, a chief analyst at Synergy. “Capex has reached levels that were previously unthinkable for these massive data centre operators and it continues to climb.

“The largest of these hyperscale operators are building economic moats that smaller competitors have no chance of replicating,” Dinsdale added.

As the financial sector embraces cloud technologies – what are the critical factors for success?

Today, in order to maintain competitive advantage, financial institutions need to be increasingly agile and quick in how they respond to fast-changing customer expectations and ultimately beat their competitors.

To this point, last month the EBA (European Banking Authority) published its Report on the Prudential Risks and Opportunities Arising for Institutions from Fintech. The report provides an analysis of the risks and opportunities relating to the adoption of new innovative technologies, presenting seven fintech use cases, one of which focuses on outsourcing core banking and payment systems to the public, hybrid and private cloud.

The report looked at how cloud computing, an important enabling technology, is being leveraged by financial institutions to deliver innovative financial products and services. In particular, it highlights that in recent years there has been increasing interest from institutions in working with cloud service providers. And although that interest was initially focused on migrating non-core applications to the cloud, the EBA found that many financial institutions are now exploring how to migrate core, mission-critical systems to the cloud.

The report goes on to discuss how flexibility, scalability and agility are seen as the main benefits of the public cloud, but adds that most cloud services have been standardised in order to serve a large number of customers in a highly automated manner at scale.

The underlying concern, of course, is that in such a security-intensive and highly regulated industry, no one-size-fits-all ‘cloud’ exists. So while it is important that cloud providers standardise to very high service standards, those that also provide specialised offerings and stay open to individual use cases and customer requirements, for example for mission-critical workloads, clearly have an edge. This is precisely what Virtustream was built for, combined with a very high level of automation that reduces human intervention in the most complex IT operations processes, increasing efficiency and lowering risk exposure.

The EBA report goes on to outline two main criteria that need to be met to ensure financial institutions make the move to cloud correctly: “choosing the right cloud service partner (CSP) on its journey” and “ensuring the internal organisation can meet the needs for this transformation alongside its CSP partner”.

Choosing the right CSP

Financial institutions must carefully select the CSP that is right for their needs. This will depend on the project in question, the institution’s overall strategy and the regulatory requirements the organisation must meet. The organisation must also consider what data is appropriate and necessary to migrate to the cloud, remembering that it does not need to take an ‘all or nothing’ approach to cloud services. Likewise, any CSP an institution works with must have a firm understanding of the relevant compliance landscape. It is also important to be able to demonstrate that a judgment call can be made when required: for example, documenting the reasonable action taken to prevent or mitigate a data breach or loss, creating a full ‘audit trail’ and evidence of the company’s compliance.

This is where the CSP must have the deepest and broadest expertise in what it takes to migrate complex, mission-critical systems to the cloud. At Virtustream we know this well, having undertaken thousands of such migrations, including creating an L3 extension of our users’ private data centres into our cloud nodes and integrating with their existing system monitoring and management tools via a broad set of APIs.

Likewise, it is really important that the CSP is not only experienced but also has a robust methodology and operating model. For example, in addition to our advisory services at Virtustream, we take a highly optimised approach to cloud onboarding, migration and operation that includes:

  • Assessment: Identifying all workloads across the application landscape in order to analyse system configurations and interdependencies, with an estimate of the initial cost benefits
  • Onboarding: Project planning and management, documentation of all applications and workloads, determination of the move sequences (a small illustrative sketch follows this list) and thorough testing to identify any risks and issues, in order to finalise a full cutover plan
  • Migration: The actual migration of production systems, technical checks for data consistency, conversion to production operations, GoLive™ migration checks, and handover and transition to steady state
  • Managed services: A range of flexible choices including infrastructure managed services and application managed services. We also have expertise in a wide variety of databases, covering physical-to-virtual and virtual-to-virtual migrations and database management
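As a purely illustrative aside (this is not Virtustream tooling, and the workload names and dependencies are invented), the “determination of the move sequences” step can be thought of as ordering workloads so that nothing moves before the systems it depends on. A minimal Python sketch:

```python
# Illustrative only: deriving migration "move groups" from workload
# interdependencies discovered during assessment. Workload names and the
# dependency map are invented for the example; real estates are far larger.
from graphlib import TopologicalSorter  # Python 3.9+

# Each workload maps to the set of workloads it depends on, e.g. the web tier
# needs the app tier, which in turn needs the database.
dependencies = {
    "web-frontend": {"app-server"},
    "reporting": {"erp-database"},
    "app-server": {"erp-database"},
    "erp-database": set(),
}

sorter = TopologicalSorter(dependencies)
sorter.prepare()

wave = 1
while sorter.is_active():
    # Everything whose dependencies have already been migrated can move together.
    ready = sorted(sorter.get_ready())
    print(f"Move group {wave}: {', '.join(ready)}")
    sorter.done(*ready)
    wave += 1
```

In practice the move sequence also has to account for cutover windows, data volumes and shared services, which is exactly where the testing and cutover planning described above come in.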

The role of IT teams

The report also outlines how the role of IT staff in financial institutions could undergo a significant transformation as cloud outsourcing increases, with roles converting into support and consultation for cloud service selection, engagement and management. This is where adopting an enterprise-class cloud provider with managed public cloud services that deliver private cloud attributes is really important, as it strategically enables a new operating model for IT: one based on business outcomes and close alignment between IT and the business.

What I mean by this is having an operating model in place that delivers the ability to implement new ideas quickly, so that the organisation can tap into new revenue streams and acquire new customers; a model that lowers complexity and, with that, also actively improves the risk posture.

Adopting a cloud operating model across all areas of the business is probably the most difficult part of the transformation. The key aspect to remember here is that it means working more closely with the business; it means adopting an IT operating model that is services and software product-oriented, not technology or project-oriented.

Looking to the sky

As cloud services become more integral to the whole organisation, CSPs will quickly become part of the financial and banking infrastructure. However, the risks involved in outsourcing data to the cloud carry wider potential consequences for any financial institution. This is why it is so important that regulatory bodies such as the EBA are able to respond to changes in the use of cloud and can continue to place strict compliance requirements on financial institutions and their partners.

To their credit, many CSPs have started to accept this as part of their ‘joint responsibility’ when they engage with a financial institution, but as cloud adoption continues to grow, financial institutions will need to carefully plan for and monitor their compliance, while CSPs look to provide an adaptable framework – one that is agile and able to flex to meet the ever-evolving needs of the finance industry.

Editor’s note: Find out more about the report and read it here (pdf, no opt-in).

How to Save Your Work with Snapshots in Parallels Desktop

Parallels Desktop® for Mac has a delightful functionality called Snapshots, which helps you save your virtual machine’s state to ensure your work environment is backed up and protected. This functionality has been part of Parallels Desktop since version 3. It allows users to restore their VM environment to a previous state in case of issues.  […]
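For readers who prefer scripting, Parallels also provides a command-line tool, prlctl, that exposes snapshot operations. The sketch below, written in Python for convenience, assumes prlctl is on the PATH and that a VM named “Windows 10” exists; the flags shown are the commonly documented ones, so verify them with `prlctl help snapshot` on your own installation, as options can vary between versions.

```python
# Hedged sketch: taking and listing Parallels snapshots from a script via the
# `prlctl` command-line tool. The VM name and snapshot name are placeholders;
# check the exact flags with `prlctl help snapshot` on your installation.
import subprocess

VM_NAME = "Windows 10"  # hypothetical virtual machine name

# Take a snapshot before making risky changes inside the guest OS.
subprocess.run(
    ["prlctl", "snapshot", VM_NAME, "--name", "before-windows-update"],
    check=True,
)

# List the snapshots that exist for this VM.
subprocess.run(["prlctl", "snapshot-list", VM_NAME], check=True)
```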

The post How to Save Your Work with Snapshots in Parallels Desktop appeared first on Parallels Blog.

Monitor Your Virtual Machine Performance with Parallels Desktop

Parallels Desktop® for Mac is dedicated to ensuring users have a seamless experience with their virtual machine performance. It’s very common for Microsoft Windows applications to use a generous amount of CPU power. These resource-hungry programs can range from anti-virus, to Windows Updates installations, to cloud storage syncs such as OneDrive or Dropbox. In the […]

The post Monitor Your Virtual Machine Performance with Parallels Desktop appeared first on Parallels Blog.

The Parallels Desktop 14 Tech Guarantee

Parallels is committed to delivering great customer service to both our loyal customers and the newest members of the Parallels community. We’d like to explain our Parallels Desktop® for Mac Tech Guarantee for 2018, and answer questions that users may have about running Windows, Linux, and other popular OSes on Mac® without rebooting. In Parallels […]

The post The Parallels Desktop 14 Tech Guarantee appeared first on Parallels Blog.

David Linthicum’s #Serverless Session Opens with Record Registrations | @CloudEXPO @DavidLinthicum #DevOps #CloudNative

It’s clear: serverless is here to stay. Adoption does come with some needed changes within both application development and operations, which means serverless is also changing the way we leverage public clouds. Truth be told, many enterprise IT shops were so happy to get out of managing physical servers within a data center that many limitations of the existing public IaaS clouds were forgiven. However, now that we’ve lived a few years with public IaaS clouds, developers and CloudOps pros are giving a huge thumbs down to the constant monitoring of servers, provisioned or not, that’s required to support the workloads.


The quantum computing forecast: Services to reach $15bn by 2028

For decades, Moore's Law has driven the advancement of traditional computing systems, leading to the proliferation of smart devices at the edge and centralised cloud computing. However, we're now reaching the limits of legacy computing technology, and the search for another high-performance computing platform has begun. Quantum computing is that next generation.

Total revenues generated from quantum computing services will exceed $15 billion by 2028, according to the latest worldwide market study by ABI Research. Most of these services will be cloud-based offerings.

Quantum computing market development

Demand for quantum computing services will be driven by compute-hungry research and development projects, as well as by the emergence of several applications, including advanced artificial intelligence algorithms, next-generation cybersecurity encryption, traffic routing and scheduling, protein synthesis, and the design of advanced chemicals and materials.

These applications require a new processing paradigm that classical computers, bound by Moore’s law, cannot support. However, one should not expect quantum computers to displace traditional computing systems on-premises anytime soon.

Unlike classical computers, which are based on sequential processing principles, quantum computers derive their strength from two fundamental characteristics of quantum physics, entanglement and superposition, which make them extremely powerful for certain tasks, notably inter-correlated events that need to be evaluated in parallel.
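In standard textbook notation (not drawn from the ABI report), superposition means a single qubit can hold a weighted combination of both basis states at once, while entanglement ties qubits together so their measurement outcomes are correlated; an n-qubit register therefore spans 2^n amplitudes simultaneously, which is the source of the parallelism described above.

```latex
% Standard quantum notation, included for illustration only.
% A single qubit in superposition:
\[
  \lvert\psi\rangle = \alpha\,\lvert 0\rangle + \beta\,\lvert 1\rangle,
  \qquad \lvert\alpha\rvert^{2} + \lvert\beta\rvert^{2} = 1
\]
% A maximally entangled pair of qubits (a Bell state), whose measurement
% outcomes are perfectly correlated:
\[
  \lvert\Phi^{+}\rangle = \tfrac{1}{\sqrt{2}}\bigl(\lvert 00\rangle + \lvert 11\rangle\bigr)
\]
% An n-qubit register is described by 2^n complex amplitudes at once:
\[
  \lvert\Psi\rangle = \sum_{x \in \{0,1\}^{n}} c_{x}\,\lvert x\rangle
\]
```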

"Classical computing is not dead, even in the post-Moore’s law era," said Lian Jye Su, principal analyst at ABI Research. "These machines will remain the ultimate processing power for executing traditional tasks such as text, video, speech processing, and signal processing, but will be potentially challenged by quantum machines when it comes executing algorithms that require massive parallel processing."

Quantum computing is, however, still at an embryonic stage of development and is not ready for large-scale commercial deployment in the near to mid-term. Scalability, technology stability, reliability, and cost efficiency are the major factors the industry must address before quantum computers move beyond lab projects or very restricted and constrained commercial deployments.

According to the ABI assessment, the attempts to create quantum computers that are stable and have low error rate require heavy investment in infrastructure, software development, and human expertise.

Operation currently requires extremely low temperatures, high magnetic fields, and a vacuum or sterile environment, making the technology extremely difficult to scale and expensive to run.

It's therefore not surprising that quantum computing is unlikely to achieve the distribution level of classical computers anytime within the next 10 years. The technology will remain concentrated in the cloud domain for many years to come.

"While the industry explores various hardware implementation methods by exploiting different quantum physics phenomena, they all face the harsh reality of tradeoffs, having to find the right balance between maintaining long coherence time, reducing error rates, minimizing cost, and developing scalable products," said Su.

Outlook for quantum-as-a-service

Excessive cost and extremely restrictive physical implementation requirements will therefore most likely limit quantum computing technology to federal government and military agencies, major enterprises, and the established hyperscale cloud computing providers. That said, the technology will also be made available to the general public via an 'as-a-service' business model.

ABI analysts believe that the future of cloud computing will increasingly rely on parallelism as new types of sophisticated applications and algorithms emerge. The IT infrastructure industry will need to deploy more efforts to accelerate the development of quantum computers as alternatives to their classical computer counterparts.

Cloud adoption in EMEA continues healthy growth – but security not on the same page

Cloud adoption in EMEA continues to grow significantly, but security awareness is not growing with it, according to new research from Bitglass.

The cloud access security broker (CASB), in its latest report, used company domains to identify cloud apps deployed in 20,000 organisations across Europe.

Less than half (47%) of the organisations analysed had a single sign-on (SSO) tool in use, according to the data; while this is higher than the one in four companies globally that use one, it still leaves serious room for improvement, according to Bitglass. SSO adoption was highest in education (64% of organisations), followed by biotech (54%), healthcare (53.7%) and finance (53.5%).

Practically every EMEA organisation analysed had deployed more than one cloud app, with many running at least a productivity app, a file sync and share service and a cloud messaging platform, alongside infrastructure as a service (IaaS).

When it comes to AWS, adoption in EMEA far exceeds the global rate: 21.8% of EMEA organisations use AWS, compared with only 13.8% worldwide. “Many firms in EMEA are also early adopters, willing to try new methods of custom app deployment like AWS that seem promising and are growing rapidly,” the report notes. In terms of productivity software, Office 365 continues to outpace G Suite, with 65% and 19.2% adoption respectively in 2018 – a marked change from 2016 (43% and 22% respectively).

“The results of this survey reinforce what we found in our 2016 study,” said Rich Campagna, CMO at Bitglass. “Organisations in EMEA are embracing cloud productivity apps but still lack the security tools necessary to protect data.

“In cloud-first environments, security must evolve to protect data on many more endpoints and in many more applications,” added Campagna.

According to another piece of research issued by CenturyLink this week, the cloud computing market is forecast to reach $411 billion by 2020. In Germany alone, the cloud services segment – defined as SaaS, PaaS and IaaS – is predicted to hit more than $20bn.

Five ways to step up your cybersecurity: The power of the cloud to combat threats

The latest entry in a never-ending series of data breaches comes courtesy of popular internet platform Reddit. The company revealed that a hacker was able to access usernames, passwords, and email addresses by intercepting SMS two-factor authentication.

It's sad to say, but “giant company suffers massive data breach” has become all too commonplace in our news feeds — to the point that too many organisations are tuning out important lessons. So let’s get a little more personal.

Imagine a business leader who relies on two-factor authentication with SMS to protect his personal and corporate accounts. Unbeknownst to him, a threat actor phishes the executive’s phone number from overseas via a technique called SMiShing. Like email phishing attacks, SMiShing uses text messages to trick users into providing personal information such as passwords or usernames.

The attacker then determines the mobile carrier and transfers the phone number to a different global carrier. Then, he uses the phone number to authenticate password resets and eventually gain access to personal and corporate data.

Think that sounds far-fetched? I’ve witnessed this exact situation — or at least the ugly aftermath. The amount of time, money, and effort it took to help this individual recover data and regain access to his accounts and device could have all been avoided if not for an outdated cybersecurity recommendation.

Of course, all organisations must weigh risk and reward. Using two-factor authentication with SMS is better than not using two-factor at all, for instance. No amount of security will mitigate 100 percent of threats, but business and IT leaders must work together to determine which security controls are necessary, affordable, and worth the time to mitigate risks without hampering productivity and efficiency.

Emergent threats and evolving defences

Recent years have seen a much-needed systemic shift away from the “set it and forget it” mentality. Historically, a firewall was installed, configured, and forgotten. Yet according to the National Vulnerability Database, most firewall products have had at least two critical vulnerabilities in the past year. IT teams must therefore routinely revisit security policies, for firewalls and beyond, to ensure new threats cannot exploit older weaknesses. Because most technology departments lack the necessary bandwidth or experience, managed security services have become increasingly commonplace.

The solutions you implement should complement the structure and working environment of your business, as needs change based on whether your employees work on-site or remotely. Either way, common-sense strategies and affordable tools can protect your business from a host of cyberthreats. Start with these five steps to improve your cybersecurity posture:

Use the power of the cloud to combat threats

The key to effective security is knowledge: knowing what your employees and your organisation are doing. The cloud has become a helpful resource in this respect thanks to the numerous privacy controls it offers to streamline protection across devices and corporate identities.

Cloud app security services can identify the applications and services used by every device on your network, so you know exactly what users are doing. With appropriate security in place, companies can investigate early and prevent breaches that could otherwise go unnoticed for months.

Create defence with a unified threat management system

There are a number of vectors that can leave an organisation vulnerable to cyberattacks, data compromise, or data loss: website visits, endpoint vulnerabilities, email phishing, and user error, to name a few. Adopt a defence-in-depth approach that deploys a holistic strategy via several tools, including cloud web filtering, endpoint protection, and unified threat management (UTM).

UTM can be tailored to your company’s needs, but it generally incorporates features such as a next-generation firewall, anti-virus, intrusion detection, web filtering, and protection against spam and spyware. A UTM system provides a more centralised approach to security management and superior protection while reducing associated installation and upkeep costs.

Invest in skilled IT staff and partners

To stay on top of potential threats, companies must invest in their cybersecurity capabilities. One of the most important priorities is designating specific IT personnel to manage security and data protection. This means individuals who have the certifications, knowledge, and capacity to truly understand the complexity of data protection, legal requirements, and technical controls. Tasking a system admin with cybersecurity — among a long list of other duties — will not cut it.

IT staff members often juggle more responsibilities than there are hours in the day, but a hybridised or fully outsourced IT model can help them stay on top of these endless obligations. If you take this route, make sure any partner you choose has the right skill set, certifications, and experience.

Train employees

Insiders still pose a significant threat to your sensitive information — whether malicious or unintentional. According to the Verizon "2018 Data Breach Investigations Report," more than one-fourth of attacks involved insiders.

Foster a workplace culture that prioritises data protection, reinforces safe practices, and teaches employees how to identify common phishing schemes and dangerous downloads. Find engaging and interactive ways of teaching team members about cybersecurity. Consider enlisting your marketing team to use social and internal communications platforms to get security tips and information out in a visual and fun way. Some businesses are even going as far as phishing their own employees.

Create a thorough business continuity plan

A proactive framework includes a recovery and business continuity plan that ensures you can get your business back up and running if you do fall victim to an attack. This plan should include data backup and disaster recovery in addition to an executive-level strategy involving cybersecurity policies, insurance requirements, regulatory responses, and even public relations.

Cybersecurity is no longer about building a firewall and sporadically running antivirus. For optimal outcomes, organisations need an innovative defence-in-depth strategy with the resources to manage it all. The above five steps are a good place to start. Cyber threats are constantly changing, which means defences must evolve even faster. If security is top of mind, you’re headed in the right direction.