Microsoft and Oracle bring multicloud alliance to the UK


Rene Millman

28 Aug, 2019

Oracle and Microsoft have expanded their multicloud alliance to the UK, following the launch of the partnership in June.

Users of both companies’ clouds will be able to interconnect IT environments and applications that span both clouds. Vinay Shivagange Chandrashekar, vice president of product management at Oracle, said that linking cloud regions that are physically close to each other makes the interconnection more useful.

“Closer cloud resources means less latency, which enables better data transfer and application interaction between clouds, and supports a broader spectrum of workloads using resources on both sides. By enabling this interconnection in London, we’re opening the door for usage of this kind on a whole new continent,” he said, adding that London is “one of the most active Oracle Cloud regions”.

“By enabling a preconfigured, dedicated interconnection, common controls, integrated identity management, and support capabilities, we’re giving customers a roster of new services in Azure that they can use with the services that they use in our cloud,” he said.

He added that many customers that run in the UK can now deploy Oracle databases and applications as cloud services, and connect those services to applications on Azure that run the Microsoft stack. Joint customers can create a combination of services from each cloud, matching each part of their workload inventory to the optimal cloud for each, without added complexity or settling for an inferior environment for parts of what they run.

Chandrashekar said that before the cloud, Oracle and Microsoft technologies could coexist effectively in customer data centres, with systems running each stack close enough for easy information exchange.

“The move to cloud broke this capability. Each vendor’s cloud was isolated from the others, making interchange between solutions on each difficult or impossible,” he said. “This alliance gives customers the ability to interconnect workloads from multiple vendors as they could in their own data centres.”

Chandrashekar added that more multivendor solutions will be enabled by decoupled application architectures, common management frameworks, and better interconnection of networks.

Google and Dell team up on enterprise Chromebooks


Bobby Hellard

27 Aug, 2019

Google and Dell are teaming up to take on Microsoft with two enterprise-ready Chromebooks, according to reports.

Dell is launching Chrome OS versions of a pair of its popular business-focused laptops, in the form of the Latitude 5400 Chromebook Enterprise and the Latitude 5300 2-in-1 Chromebook Enterprise.

Both of these computers will be the first machines to fall under Google’s new Chromebook Enterprise line, which will see the search giant and partner hardware makers keenly target Chromebooks at business use. While Chromebooks aren’t unknown to the business world, they haven’t taken the market by storm, with Microsoft dominating in the enterprise arena and Chromebooks finding more use in the education sector. 

But Google hopes to challenge Microsoft in a more comprehensive manner with the Chromebook Enterprise line, with Dell helping lead the charge. 

“Chromebook Enterprise is a game-changer for businesses looking for a modern OS that provides end-users with speed and productivity while offering IT the comprehensive security they need,” said John Solomon, vice president of Chrome OS at Google. “As a longtime global leader in the enterprise, Dell Technologies has a deep understanding of end-user and IT needs and is a natural fit to bring powerful devices with the benefits of Chrome Enterprise to businesses worldwide.”

“IT administrators want to give users choice when it comes to OS, device, and when and where work gets done, but they struggle with the growing number of unmanaged devices in their environments,” said Jay Parker, president of the Client Product Group at Dell. “By adding Chrome to Dell Technologies Unified Workspace, we’re giving IT the power to offer a consistent and secure experience for everyone, no matter the OS they choose. And best of all, users get the flexibility to choose the devices and use cases that fit their needs.”

For the two Latitude models, Dell will bundle in its cloud-based support services, which allow admins to have greater control over how these Chromebooks are deployed within their business. This should help IT admins integrate the Chromebooks into existing Windows environments and manage them through tools like VMware Workspace One.

The Latitude 5400 will have a 14in screen and start at £449, while the 13in 5300 2-in-1 has a starting price of £699. Both can be configured with Intel’s 8th Gen Core i7 processors and up to 32GB of RAM.

Oracle to appeal “unlawful” decision on JEDI contract lawsuit


Dale Walker

27 Aug, 2019

Oracle plans to appeal a recent court decision that dismissed its challenge against the US’ JEDI cloud contract, the company confirmed on Monday.

Oracle has consistently argued that the procurement process of the Joint Enterprise Defence Infrastructure (JEDI) contract, awarded by the US Department of Defence, contravened federal laws and unfairly favoured AWS over other providers.

The company filed a lawsuit against the DoD in December last year, arguing that there were conflicts of interest between former Pentagon and AWS employees. Before a ruling was made on that lawsuit, Oracle was removed from the bidding process in April when it failed to meet the requirement of having three data centres with FedRAMP Moderate ‘Authorised’ support.

The Federal Claims Court ruled in July that because Oracle was unable to qualify for the bid criteria, it lacked the legal standing to challenge the procurement process and therefore dismissed its lawsuit.

Oracle believes the latest dismissal fails to address federal laws that prohibit the awarding of contracts to a single provider.

“Federal procurement laws specifically bar single award procurements such as JEDI absent satisfying specific, mandatory requirements, and the Court in its opinion clearly found DoD did not satisfy these requirements,” said Dorian Daley, general counsel for Oracle.

“The opinion also acknowledges that the procurement suffers from many significant conflicts of interest. These conflicts violate the law and undermine the public trust. As a threshold matter, we believe that the determination of no standing is wrong as a matter of law, and the very analysis in the opinion compels a determination that the procurement was unlawful on several grounds.”

JEDI, a contract said to be worth up to $10 billion, will see the winning bidder take charge of hosting and distributing DoD workloads, including those related to classified military operations. Currently, Microsoft and AWS are the only providers being considered for the contract – Google dropped out of the running early after an employee protest claimed the deal would contravene company ethics.

Earlier this month the DoD announced it would suspend the awarding of the contract while it investigates allegations of bias towards AWS.

How does privileged access security work on AWS and other public clouds?

Bottom line: Amazon’s Identity and Access Management (IAM) centralises identity roles, policies and Config Rules, yet doesn’t go far enough to provide the Zero Trust-based approach to Privileged Access Management (PAM) that enterprises need today.

AWS provides a baseline level of support for Identity and Access Management at no charge as part of their AWS instances, as do other public cloud providers. Designed to provide customers with the essentials to support IAM, the free version often doesn’t go far enough to support PAM at the enterprise level. To AWS’s credit, they continue to invest in IAM features while fine-tuning how Config Rules in their IAM can create alerts using AWS Lambda. AWS’s native IAM can also integrate at the API level to HR systems and corporate directories, and suspend users who violate access privileges.
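For illustration, the least-privilege building block that AWS IAM works with is a JSON policy document. The following is a minimal sketch of constructing one in Python for read-only access to a single S3 bucket; the bucket name and helper function are hypothetical, not part of any AWS SDK:

```python
import json


def read_only_s3_policy(bucket_name: str) -> str:
    """Build a least-privilege IAM policy document granting read-only
    access to one (hypothetical) S3 bucket and its objects."""
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": ["s3:GetObject", "s3:ListBucket"],
                "Resource": [
                    f"arn:aws:s3:::{bucket_name}",        # the bucket itself
                    f"arn:aws:s3:::{bucket_name}/*",      # objects within it
                ],
            }
        ],
    }
    return json.dumps(policy, indent=2)


print(read_only_s3_policy("example-audit-logs"))
```

A document like this would typically be attached to a role or group rather than to individual users, which is the pattern the enterprise PAM tools discussed below build on.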

In short, the native IAM capabilities offered by AWS, Microsoft Azure, Google Cloud and others provide enough functionality to help an organisation get up and running with access control in their respective homogeneous cloud environments. However, they often lack the scale to fully address the more challenging, complex areas of IAM and PAM in hybrid or multi-cloud environments.

The truth about privileged access security on cloud providers like AWS

The essence of the Shared Responsibility Model is assigning responsibility for the security of the cloud itself – including the infrastructure, hardware, software, and facilities – to AWS, and assigning the securing of operating systems, platforms, and data to customers. The AWS version of the Shared Responsibility Model, shown below, illustrates how Amazon has defined securing the data itself, management of the platform, applications and how they’re accessed, and various configurations as the customers’ responsibility:

AWS provides basic IAM support that protects its customers against privileged credential abuse in a homogenous AWS-only environment. Forrester estimates that 80% of data breaches involve compromised privileged credentials, and a recent survey by Centrify found that 74% of all breaches involved privileged access abuse.

The following are the four truths about privileged access security on AWS (and, generally, other public cloud providers):

Customers of AWS and other public cloud providers should not fall for the myth that cloud service providers can completely protect their customised and highly individualised cloud instances

As the Shared Responsibility Model above illustrates, AWS secures the core areas of its cloud platform, including infrastructure and hosting services. AWS customers are responsible for securing operating systems, platforms, and data – and, most importantly, privileged access credentials.

Organisations need to consider the Shared Responsibility Model the starting point for creating an enterprise-wide security strategy, with a Zero Trust Security framework as the long-term goal. AWS’s IAM is an interim solution to the long-term challenge of achieving Zero Trust Privilege across an enterprise ecosystem that is going to become more hybrid or multi-cloud as time goes on.

Despite what many AWS integrators say, adopting a new cloud platform doesn’t require a new Privileged Access Security model

Many organisations that have adopted AWS and other cloud platforms are using the same Privileged Access Security Model they have in place for their existing on-premises systems. The truth is that the same Privileged Access Security Model can be used for both on-premises and IaaS implementations.

Even AWS itself has stated that conventional security and compliance concepts still apply in the cloud. For an overview of the most valuable best practices for securing AWS instances, please see my previous post, 6 Best Practices For Increasing Security In AWS In A Zero Trust World.

Hybrid cloud architectures that include AWS instances don’t need an entirely new identity infrastructure and can rely on advanced technologies, including Multi-Directory Brokering

Creating duplicate identities increases cost, risk, and overhead, and adds the burden of additional licences. Existing directories (such as Active Directory) can be extended through various deployment options, each with its strengths and weaknesses. Centrify, for example, offers Multi-Directory Brokering to use whatever preferred directory already exists in an organisation to authenticate users in hybrid and multi-cloud environments.

And while AWS provides key pairs for access to Amazon Elastic Compute Cloud (Amazon EC2) instances, its security best practices recommend a holistic approach across on-premises and multi-cloud environments, incorporating Active Directory or LDAP into the security architecture.

It’s possible to scale existing Privileged Access Management systems in use for on-premises systems today to hybrid cloud platforms that include AWS, Google Cloud, Microsoft Azure, and other platforms

There’s a tendency on the part of system integrators specialising in cloud security to oversell cloud service providers’ native IAM and PAM capabilities, saying that a hybrid cloud strategy requires separate systems. Look for system integrators and experienced security solutions providers who can use a common security model already in place to move workloads to new AWS instances.

Conclusion

The truth is that Identity and Access Management solutions built into public cloud offerings such as AWS, Microsoft Azure, and Google Cloud are stop-gap solutions to a long-term security challenge many organisations are facing today. Instead of relying only on a public cloud provider’s IAM and security solutions, every organisation’s cloud security goals need to include a holistic approach to identity and access management and not create silos for each cloud environment they are using.

While AWS continues to invest in their IAM solution, organisations need to prioritise protecting their privileged access credentials – the “keys to the kingdom” – that if ever compromised would allow hackers to walk in the front door of the most valuable systems an organisation has. The four truths defined in this article are essential for building a Zero Trust roadmap for any organisation that will scale with them as they grow.

By taking a “never trust, always verify, enforce least privilege” strategy when it comes to their hybrid- and multi-cloud strategies, organisations can alleviate costly breaches that harm the long-term operations of any business.


How to build an effective marketing strategy


Adam Shepherd

27 Aug, 2019

In the internet age, digital marketing is an essential part of any business, but it can be hard to keep up with the ever-changing landscape of social media, email marketing and online ads.

Thankfully, there is a huge range of tools you can call on to help build and execute an effective digital marketing strategy.

First things first – don’t be lured into the trap of thinking that digital marketing is just something that you can pass off to an intern, or that it’s as simple as putting out the odd Facebook post. In reality, it’s a full-time job, and one that’s more complicated than you might think.

The first step is to nail down who your target audience is. This can include demographic data such as age, location, job title and industry, as well as their broad interests. This will inform which marketing channels you focus your energies on, as well as the best marketing tactics to use. Ads for fitness gear, for example, are much more likely to be successful on leisure and lifestyle platforms like Instagram and Facebook, rather than a business network like LinkedIn.

You should also work out what your overall goal is. It may be as simple as increasing general brand awareness, but it could also include things like driving product sales, increasing visits to your website and boosting event attendance. Different outcomes will require different strategies.

Planning your content

Planning marketing activity in advance is essential, whether it be social media posts, email blasts or anything else. Not only does this allow you to structure your activity around business events (such as promotional offers or seasonal trends), it also means you can schedule your activity in bulk, so you don’t have to worry about it. You should map out when you’re going to put out posts and emails, as well as deciding beforehand on the copy and images you’re going to use.

Trello, a project management tool based on the Kanban system, is one of the best tools for planning marketing activity, as it includes a calendar view, as well as high levels of customisation and robust filtering systems. For teams that aren’t planning to put out high volumes of marketing material, Google Calendar can also be a useful asset for tracking when it’s scheduled to go out, although its comparative lack of filtering options makes it less suited to organisations with large needs.

Your content plan should form the basis of your marketing strategy. Although you can respond to trends, current events and spikes in interest on-the-fly, you should take care to follow your content plan as well. Otherwise, your marketing activity runs the risk of becoming unfocused and scattershot, which will make it less effective.

Publishing your posts

Once you’ve planned when you want to do your marketing activity, you’ll need to actually schedule and launch it. There are a variety of tools you can use to do this, each with different advantages and specialisations. For social media, Buffer Publish works well as a one-stop-shop, allowing you to schedule posts on Facebook, Instagram, Twitter, Pinterest and LinkedIn.

Buffer does have some disadvantages, such as the fact that it doesn’t always support each platform’s full feature set. For example, while you can attach an image, video or link to a Facebook post via Buffer, you can’t use it to check into locations, publish polls or tag branded content. Similarly, you can’t use Buffer to publish documents on LinkedIn.

Tweetdeck is worth consideration – once independent but now wholly owned by Twitter, it’s among the most useful and fully-featured tools for managing multiple Twitter accounts and allows you to create custom dashboards with columns for tracking mentions, hashtags, direct messages and more.

Email marketing may be a venerable practice, but its value and importance shouldn’t be overlooked. It can be an excellent way to alert customers to news and offers, and for scheduling and automating emails, Mailchimp is one of the best tools around. You can create a variety of email campaigns, including automated emails for product retargeting, recommendations and abandoned carts. It also includes social media features, although currently only Facebook and Instagram are supported.

These tools can help you execute your content plans quickly and effectively, automating the sending process and freeing you up for other tasks. Your content plan should be used as the basis from which you populate these tools, as it provides a consistent schedule and allows you to build a consistent “brand voice”.

Analysing the impact

A fundamental (and often overlooked) part of digital marketing is the process of periodically analysing and refining the effectiveness of your strategy. It’s no good spending hours crafting social posts and emails if they aren’t offering any real-world benefit, but the only way you’ll know if your strategy is achieving your goals is if you take the time to analyse the results.

The best tools for doing so depend on your primary marketing channels. Facebook, Twitter and LinkedIn all have inbuilt analytics dashboards, which offer the largest and most granular datasets regarding the performance of your social media activity, but they can only show you what actions people have taken on that platform. That’s fine if all you’re interested in is brand awareness, but if your KPIs are based around metrics like conversion rates (such as people completing purchases or signing up for events and newsletters), you’ll need additional ways of measuring them.

If your goal is to drive people to your website, Google Analytics is the gold standard for identifying and categorising the behaviour of visitors to your site. Not only can you track how many people visited your site, what pages they visited and how long they spent on them, you can also see how they arrived on your site.

Google Analytics can show you what proportion of your visitors came from LinkedIn, Twitter, Reddit, Facebook, email newsletters, and so on. This will allow you to focus your social media strategy on the right channels, either focusing your attention on the ones which are most successful, or investing effort in increasing your presence on the channels which have the most room for improvement.

Be careful though: While Google Analytics can segment traffic by source, it can’t identify which traffic has come from posts on your account versus other accounts on the same network. For example, if your Facebook page posts a link to your blog post on cloud storage, and a large number of fans independently post the same link without having seen your original Facebook post, Google Analytics will lump any clickthroughs from any of those posts together under the same banner.
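One common way to work around this limitation is to tag the links you share from your own accounts with UTM campaign parameters, which Google Analytics reports separately from untagged referrals. A rough sketch of building such a link follows; the helper function and parameter values are illustrative, not part of any analytics SDK:

```python
from urllib.parse import urlencode


def utm_tag(url: str, source: str, medium: str, campaign: str) -> str:
    """Append UTM campaign parameters so clicks from our own posts can be
    distinguished from organic shares in Google Analytics reports."""
    params = urlencode({
        "utm_source": source,      # where the link was shared, e.g. "facebook"
        "utm_medium": medium,      # channel type, e.g. "social" or "email"
        "utm_campaign": campaign,  # the campaign this post belongs to
    })
    separator = "&" if "?" in url else "?"
    return f"{url}{separator}{params}"


print(utm_tag("https://example.com/blog/cloud-storage",
              "facebook", "social", "cloud-storage-launch"))
```

Links tagged this way surface in Google Analytics under their campaign dimensions, so clickthroughs from your own scheduled posts can be separated from those on copies of the link shared by others.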

Whichever analysis tools you use, the key is to regularly audit how well your planned social activity has performed, and then use the data from that to inform the planning of the next phase – at which point, the whole process starts all over again.

Automation

All of the tools discussed above work exceedingly well together, and should be used in conjunction with each other to build a marketing tech stack which makes creating, executing and maintaining effective marketing campaigns much simpler than doing everything on-the-fly. However, these tools can be made even more powerful by using workflow automation tools to integrate them all together.

Various products exist to perform this function, but few are as accessible, as powerful or as widely supported as Zapier. This easy-to-use but surprisingly deep tool lets you use rules-based conditional logic to construct powerful integrations between different applications. For instance, you can create a rule that means when an event is added to a specific Google Calendar or a card is added to a specific list in Trello, Buffer will automatically create a social post based on the details within that card or event. You could also create an integration with MailChimp which posts to Buffer any time a new subscriber joins your mailing list.
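Zapier itself is configured through its web interface rather than code, but the underlying pattern it automates – a trigger event matched against a condition, firing an action – can be sketched in a few lines. Every name below is an illustrative stand-in, not Zapier’s, Trello’s or Buffer’s real API:

```python
from dataclasses import dataclass
from typing import Callable


@dataclass
class Rule:
    """A toy trigger→condition→action rule, mimicking the pattern
    Zapier automates between applications."""
    event_type: str                       # e.g. "trello.card_created"
    condition: Callable[[dict], bool]     # filter on the event payload
    action: Callable[[dict], str]         # what to do when it matches


def dispatch(event_type: str, payload: dict, rules: list[Rule]) -> list[str]:
    """Run every matching rule's action and report what was done."""
    return [
        rule.action(payload)
        for rule in rules
        if rule.event_type == event_type and rule.condition(payload)
    ]


# Example rule: when a card lands in the "Ready to post" Trello list,
# queue a social post in Buffer (both integrations simulated here).
rules = [
    Rule(
        event_type="trello.card_created",
        condition=lambda p: p.get("list") == "Ready to post",
        action=lambda p: f"buffer: queued post '{p['title']}'",
    ),
]

print(dispatch("trello.card_created",
               {"list": "Ready to post", "title": "New blog live"},
               rules))
```

Cards added to other lists, or events of other types, simply match no rule and trigger nothing – which is the same filtering behaviour you configure in a Zap.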

When used correctly, automation can be the most effective tool in a digital marketer’s arsenal, helping you to create seamless marketing workflows that semi-autonomously build rich, multi-channel campaigns with minimal oversight required.

CRM, lead generation and paid ads

While it can be used simply for increasing brand awareness, marketing is often intended to directly fuel sales. For businesses that want to deepen the links between their marketing and sales operations, customer relationship management (CRM) software is an excellent way to increase collaboration between the two functions. CRM systems allow you to keep track of data on specific customers, be they companies or individuals, including contact details, recent purchases and what they’re interested in.

Using a CRM system allows you to send previous customers emails and offers for similar products, and gives your sales team a way to track which marketing materials customers have previously engaged with. Numerous cloud-based CRMs are available, including big names like Salesforce and Hubspot, as well as less well-known providers like Zoho and SugarCRM.

CRM systems also work very well with lead generation – a branch of digital marketing which involves providing customers with a resource, such as a free trial of a service, early access to a product or service, or even a technical whitepaper, in exchange for their contact details. These details can then be used to target these customers with pitches or promotions for products they may be interested in, assuming they’ve explicitly consented to being contacted for this purpose.

You can capture these leads in a number of ways – many CRM platforms include tools for building lead capture forms or integrating with third-party form providers. Facebook and LinkedIn, meanwhile, both include built-in lead capture capabilities as part of their advertising toolkits, allowing you to quickly and easily start gathering new leads.

This kind of paid-for advertising can be an excellent marketing method, too. All of the major social media platforms allow businesses of any size to quickly and cheaply set up ad campaigns on their platforms, pushing them out to a wider number of users than their organic posts would reach. This can be a great way to quickly boost visibility of campaigns, particularly ones that align well with specific social network users.

Outside of social networks, Google also offers advertising across many of its products – most notably YouTube and the core search product. You can pre-set your budget, and will only be charged when customers take specific actions such as clicking through to your site.

Judicious use of paid advertising can be used to highlight the most important elements of your marketing campaigns, used in conjunction with the data from your organic marketing activity to maximise the effectiveness of your key messages.

Digital marketing is a complex field, and one that can seem daunting for the uninitiated; there’s a wealth of different tools, strategies and resources out there. You don’t have to use all of the approaches outlined above – it’s perfectly acceptable to start with one or two and then build out your capabilities as you go – but by integrating your planning, publishing, tracking and analysis tools into a unified marketing tech stack, you can increase the ease, efficiency and effectiveness of your digital marketing capabilities, boosting your activity and creating more value for your organisation.

VMware doubles down on Kubernetes and hybrid cloud


Adam Shepherd

26 Aug, 2019

As part of its annual VMworld conference, software giant VMware has today announced a number of new products aimed at strengthening its position within hybrid cloud and Kubernetes environments, as well as a broad range of updates across its Workspace ONE portfolio.

The main focus was a new portfolio of products dubbed VMware Tanzu, designed to help organisations marry their VMware and Kubernetes deployments. As part of the announcement, the company previewed two of the products which will come under the Tanzu umbrella: Tanzu Mission Control and Project Pacific.

Acting as a single control pane for all an organisation’s Kubernetes clusters, Tanzu Mission Control allows IT teams to manage clusters across public clouds, managed services, vSphere and more, using the same tools that they use to manage their VMs. It will enable one-click policy management for operators, and will feature broad integrations with the rest of VMware’s portfolio to provide tools such as visibility and health monitoring.

The other major announcement was Project Pacific – a new product which will be part of Tanzu when it’s officially launched, and which VMware global field and industry CTO Chris Wolf described as “the most significant innovation that’s come out of our VMware vSphere product line in the last 10 years”.

In essence, Project Pacific is an effort to re-architect VMware’s vSphere management software into a Kubernetes-native platform where organisations can manage both VMs and Kubernetes containers via one control plane. Pacific will also introduce a container runtime to the hypervisor, as well as ESXi native pods designed to blend the best elements of VMs and Kubernetes pods.

“Today, I’m excited to announce Project Pacific: an effort to embed Kubernetes directly into vSphere,” VMware CEO Pat Gelsinger told attendees. “Project Pacific unites vSphere and Kubernetes, and thanks to developers and operators, Project Pacific establishes vSphere as the platform for modern applications.”

“We’re building Kubernetes deep into vSphere, and along the way, we’re actually making vSphere a much better place to run Kubernetes,” added Joe Beda, Kubernetes co-creator and VMware principal engineer. “Operations people have one platform where they can manage all of their resources, including virtual machine and Kubernetes resources. And then application teams will get a self service experience built on proven Kubernetes API apps.”

In fact, Beda revealed, workloads running on ESXi with Project Pacific were capable of running up to 8% faster than on bare metal servers.

While Tanzu and Project Pacific are still some way from general availability, with no firm release date in sight, VMware advised customers wanting to prepare for the new capabilities to start by adopting PKS, its pre-existing Kubernetes deployment tool. Beda stated that PKS is acting as the on-ramp for Project Pacific, and Wolf noted that “this is your glimpse of the future of PKS”. As both products are still only technical previews, there is currently no information on when they might be commercially released.

Another major announcement made at the show is the launch of VMware’s CloudHealth Hybrid. Based on last year’s acquisition of CloudHealth Technologies, the new tool expands VMware’s CloudHealth monitoring product to cover not just public cloud deployments but also on-prem and hybrid VMware deployments. Customers will be able to set resource usage policies which trigger automatic alerts when violated, as well as generating robust TCO reports for their full cloud environment. The company’s Wavefront observability tool has seen updates too, with aesthetic tweaks, new alert-driven automations and improved Kubernetes monitoring.

The launch of CloudHealth Hybrid is one of a number of announcements and updates around VMware’s hybrid cloud strategy. The company has unveiled vRealize Operations 8.0, which focuses in large part on applying machine learning algorithms and real-time analytics, in order to detect issues and optimise performance in real-time with minimal human intervention. On top of this, the company is previewing a new cloud-based version of this software, the vRealize Operations Cloud, for its VMware Cloud on AWS customers. Capacity and cost management will also be a major focus, with new planning and optimisation tools across a range of different environments.

Complementary to this is the announcement of another tech preview, codenamed ‘Project Magna’. This is another effort to use AI and machine learning to drive data centre automation, aimed specifically at vSAN. Magna is a self-managing SaaS product which uses real-time data to optimise the read and/or write performance of customers’ vSAN deployments.

In addition to this, VMware’s vRealize Automation 8.0 software also brings improved governance, cloud-agnostic blueprints for service modelling and integrations with a number of DevOps tools. As with vRealize Operations, a SaaS-based version of this software is also being previewed, and the on-premises vRealize Suite 2019 has also been announced, bringing its core components up to the latest version. All of the new vRealize tools, as well as CloudHealth Hybrid, will be available by the end of VMware’s third financial quarter, which falls on 1 November this year. As with Project Pacific, no firm timeline has been given for the availability of Project Magna.

VMware Cloud on Dell EMC, the company’s data centre as a service offering co-engineered with its parent company, has also gone into general availability. Sadly though, this is only for US customers – there’s no firm date for availability in other territories as yet. The company did, however, announce that Equinix will be operating as a hosting partner for the service, allowing customers to take advantage of it through their facilities.

Public cloud was not ignored, however. Following its initial reveal by Gelsinger, Michael Dell and Microsoft CEO Satya Nadella at Dell Technologies World earlier in the year, VMware also announced the availability of native VMware support on Microsoft Azure. The service is available in Western Europe and the US as of today, with territories in Australia and South following by the end of 2019 and additional regions including northern Europe by the end of Q1 next year.

AWS’ status as a VMware ‘preferred partner’ was in little doubt, and the company has announced a raft of new technologies to support customers deploying on Amazon’s cloud. These include new tools courtesy of an expanded partnership with Nvidia. The two companies have worked together to bring Nvidia’s GPU virtualisation technology to VMware’s AWS offering, as well as to its core vSphere product. Designed to power AI and data science workloads, the new tools will allow IT to manage these workloads using the same tools as their other applications, as well as integrating them with other VMware and AWS products.

In addition, the company announced that as of today, VMware Cloud on AWS customers will be able to take advantage of a new Cloud Migration tool. This tool, as well as other cloud workflows, is expected to come to VMware Cloud on Dell EMC and VMware Cloud on AWS Outposts in the future. A disaster recovery as a service offering, built with Dell EMC and AWS S3, will also be introduced for VMware Cloud on AWS by the end of October. These customers can also now take advantage of VMware’s Bitnami-powered Cloud Marketplace, as can the company’s Cloud Provider Partners.

Moving onto VMware’s desktop virtualisation business, the conference has seen a number of product updates and new features announced for Workspace ONE. The most attention-grabbing of these is a new digital assistant, powered by IBM Watson and designed to speed up the onboarding process for new employees, as well as assist with simple level one support tasks like wireless troubleshooting or ticket creation.

It also unveiled a new Workspace ONE Intelligence service called Digital Employee Experience Management, designed to help IT automatically detect and resolve problems with employee endpoints through real-time data analysis, which is currently in preview. In order to aid with remediation efforts, Workspace ONE Assist (previously called Workspace ONE Advanced Remote Management) can now support Windows and macOS devices, in addition to Android, iOS, and Windows CE.

In addition to this, VMware is increasing Workspace ONE’s device support across the board. Support for iOS 13 and iPadOS is planned for later in the autumn, augmented by additional management capabilities for Apple devices; Google Android and Chrome OS devices will benefit from new monitoring and migration tools; and admins can now use a new AirLift migration tool to onboard Windows 10 devices with SCCM collections and GPOs intact.

Finally, VMware Horizon also benefits from new capabilities. VMware Horizon Services for Multi-Cloud brings automatic management tools which allow employees to log into the most suitable virtual workspace, whether it’s hosted in an on-premise or cloud-based environment. New management services will also enable the use of one-to-many package deployment tools, and will surface Horizon data for the purposes of performance management. For customers that are still using a pre-existing perpetual license for Horizon 7, VMware has now introduced the VMware Subscription Upgrade Program for Horizon, which the company promises will allow customers to upgrade to Horizon Universal Licenses “at a price reflecting the original value”.

VMworld 2019: Going big on Kubernetes, Azure availability – and a key ethical message

VMware has kicked off its 2019 VMworld US jamboree in San Francisco with a series of updates, spanning Kubernetes, Azure, security and more.

The virtualisation and end user computing giant issued no fewer than five press releases to the wires today alone, with CEO Pat Gelsinger and COO Sanjay Poonen continuing to emphasise the company's 'any cloud, any application, any device, with intrinsic security' strategy.

Chief among these on the product side was VMware Tanzu, a new product portfolio which aims to enable enterprise-class building, running, and management of software on Kubernetes. Included in this is Project Pacific, an ongoing mission to rearchitect server virtualisation behemoth vSphere around the container orchestration tool.

Except to call Kubernetes a container orchestration tool would be doing it a major disservice, according to Gelsinger. Not since Java and the rise of virtual machines has there been a technology as critical for cloud as Kubernetes, he noted, in connecting developers and IT ops. This is evidently the goal of Project Pacific as well. Tanzu, we were told, is both Swahili for branch – as in a new branch of innovation – and the Japanese word for container.

Other product news included an update to collaboration program Workspace ONE, including an AI-powered virtual assistant, as well as the launch of CloudHealth Hybrid by VMware. The latter, built on cloud cost management tool CloudHealth, aims to help organisations save costs across an entire multi-cloud landscape and will be available by the end of Q3. 

Analysis: Pick a cloud, any cloud

VMware's announcement of an extended partnership with Google Cloud earlier this month led this publication to consider the company's positioning amid the hyperscalers. VMware Cloud on AWS continues to gain traction – Gelsinger said Outposts, the hybrid tool announced at re:Invent last year, is being delivered upon – and the company also has partnerships in place with IBM and Alibaba Cloud.

Today, it was announced that VMware on Microsoft Azure is generally available, with the facility gradually being switched on across Azure data centres. By the first quarter of 2020, the plan is for availability across nine global regions.

The company's decision not to compete, but collaborate with the biggest public clouds is one that has paid off. Yet Gelsinger admitted that the company may have contributed to some confusion over what hybrid cloud and multi-cloud truly meant. The answer (below) was interesting. 

With organisations increasingly opting for different clouds for different workloads as their environments change, Gelsinger homed in on a frequent customer pain point for those nearer the start of their journeys. Do they migrate their applications, or do they modernise? Increasingly, customers want both – the hybrid option. "We believe we have a unique opportunity for both of these," he said. "Moving to the hybrid cloud enables live migration, no downtime, no refactoring… this is the path to deliver cloud migration and cloud modernisation."

As far as multi-cloud was concerned, Gelsinger argued: "We believe technologists who master the multi-cloud generation will own it for the next decade."

Customers, acquisitions, and partnerships

There were interesting customer stories afoot – particularly down to the scale and timeframe of their initiatives. FedEx has its fingers in many pies, from VMware Cloud on AWS, to VMware Cloud Foundation, to Pivotal – on whom more shortly. 

Research firm IHS Markit was what Gelsinger called "a tremendous example of a hybrid cloud customer." The company's goal was to have 80% of its applications in the public cloud, and it has been able to migrate 1,000 applications in six weeks. Poonen asked Freddie Mac how many of its 600 apps the financier was migrating. The answer: all of them, bar a negligible few. The initiative started in February, and the plan was to be finished 'by Thanksgiving.'

On the partnership side, VMware announced a collaboration with NVIDIA for the latter to deliver accelerated GPU services for VMware Cloud on AWS. This was again cited as key to helping enterprise customers make the most of their huge swathes of data, while the move also links in with VMware's acquisition of Bitfusion, enabling the company to efficiently make GPU capabilities available for AI and ML workloads in the enterprise.

Gelsinger made special note to mention VMware's most recent acquisitions, with Pivotal and Carbon Black being name-checked at the front of the keynote and brought on to discuss where they fit in the VMware stack at the back.

Analysis: Gelsinger's irresistible take on tech expansion

Pat Gelsinger is rapidly turning into a must-listen speaker on the future of technology, its convergence, and its ethical effects.

This is not to say previous VMworld talks aren't worth your time, of course. Gelsinger has done seven big US events now – seven and a half if you include 2012, when former CEO Paul Maritz was handing over the reins. Yet today he sits in a fascinating position across cloud, networking and more from which to assess the landscape.

The opening 2019 VMworld keynote had everything one would expect; customer success stories by the cartload and product announcements by the pound, as seen above. Yet underpinning it all was an ethical focus. "As technologists, we can't afford to think of technology as someone else's problem," Gelsinger told attendees, adding VMware puts 'tremendous energy into shaping tech as a force for good.'

Gelsinger cited three benefits of technology which ended up opening Pandora's Box. Free apps and services led to severely altered privacy expectations; ubiquitous online communities led to a crisis in misinformation; while the promise of blockchain has led to illicit uses of cryptocurrencies. "Bitcoin today is not okay, but the underlying technology is extremely powerful," said Gelsinger, who has previously gone on record regarding the detrimental environmental impact of crypto.

This prism of engineering for good, alongside good engineering, can be seen in how emerging technologies are being utilised. With edge, AI and 5G, and cloud as the "foundation… we're about to redefine the application experience," as the VMware CEO put it. 

2018's keynote was where Gelsinger propagated the theme of tech 'superpowers'. Cloud, mobile, AI, and edge were all good on their own, but working in a cycle they could each make the others better. This time, more focus was given to how the edge was developing. Whether it was a thin edge, containing a few devices and an SD-WAN connection, a thick edge of a remote data centre with NFV, or something in between, VMware aims to have it all covered.

"Telcos will play a bigger role in the cloud universe than ever before," said Gelsinger, referring to the rise of 5G. "The shift from hardware to software [in telco] is a great opportunity for US industry to step in and play a great role in the development of 5G." 

Reaction: Collaboration between VMware and the hyperscalers – but for how long?

If VMware was at all concerned about the impact of its AWS partnership, the company needn't have worried. Another piece of news was a research study, put together alongside Forrester, on the total economic impact of VMware Cloud on AWS.

The study saw Forrester put together a composite organisation, based on companies interviewed, and assessed budgets and migration considerations. The average organisation had 80 servers, $2 million in annual software budgets and a 40 to one ratio of VMs to applications. The composite organisation saved 59% of operational costs in the cloud compared with equivalent capacity on-premises.
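As a back-of-the-envelope illustration of what that 59% headline figure implies, the arithmetic below applies it to the composite organisation's quoted $2 million annual software budget. Note the assumption is ours, not Forrester's: we treat that budget as a stand-in for the on-premises baseline cost purely to show how the percentage cashes out.

```python
# Rough sketch of the saving implied by the Forrester study's figures.
# Assumption (ours, not the study's): the composite organisation's $2m
# annual software budget stands in for the on-premises baseline cost.
ONPREM_ANNUAL_COST = 2_000_000  # quoted annual software budget ($)
SAVINGS_RATE = 0.59             # reported operational saving in the cloud

cloud_cost = ONPREM_ANNUAL_COST * (1 - SAVINGS_RATE)
saving = ONPREM_ANNUAL_COST - cloud_cost

print(f"Equivalent-capacity cloud cost: ${cloud_cost:,.0f}")
print(f"Implied annual saving: ${saving:,.0f}")
```

On those assumptions, the composite organisation would spend roughly $820,000 in the cloud for capacity costing $2m on-premises.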

Speaking directly after the keynote, Bruce Milne, CMO at hyperconverged infrastructure provider Pivot3, praised the strategic approach – but with something of a caveat.

"Pat Gelsinger in his keynote illustrated that [VMware] is not content to rest on its laurels and continue to push the boundaries of technology through development, acquisition and partnerships," Milne told CloudTech. "Software is eating the world, as the old maxim goes, and that was evident through his discussion. He challenged customers to look at the economic benefit of refreshing their hardware and saving money through virtualisation with NSX, [and] declared that VMware is serious about integrating Kubernetes into vSphere, which empowers app developers who may be developing for the cloud or on-prem.

"There's an obvious strategic tension in VMware's collaboration with the hyperscale cloud providers, but for now it appears they've agreed to a collaborative detente," added Milne. "Watch this space because that friction is sure to generate sparks eventually. VMware wants to be the facilitating platform for apps on 'any cloud' – clearly a space that the hyperscale vendors covet as well."

You can take a look at the full list of VMworld 2019 announcements here.

Postscript: A keenly raised eyebrow from this reporter came when Gelsinger referred to William Hill, an NSX customer, as an 'online gaming company'. UK readers in particular will note that this isn't quite telling the whole truth – but let's just say it emphasises the difference between holding your events in San Francisco and Las Vegas.

Interested in hearing industry leaders discuss subjects like this and sharing their experiences and use-cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.

Cloud performance and change management cited in latest DORA DevOps analysis

The performance figures for those at the sharp end of DevOps implementation keep going up – and cloud usage is increasingly becoming a key differentiator.

That’s the primary finding from the DORA Accelerate State of DevOps 2019 report. The study, put together alongside Google Cloud, is a major piece of work, covering more than 31,000 survey responses across six years of research.

The proportion of highest performing teams has almost tripled according to the research, now comprising almost 20% of all teams. What’s more, the highest performing teams were 24 times more likely than lower performers to be cloud specialists. This was based on executing across the five essential characteristics of cloud computing defined by NIST: on-demand self-service, broad network access, resource pooling, rapid elasticity, and measured service.

One aspect which needed to be considered was change management. Perhaps not surprisingly, formal change management processes – for instance, requiring approval from an external Change Approval Board – have been found to lower software delivery performance.

“Heavyweight change approval processes negatively impact speed and stability,” the report notes. “In contrast, having a clearly understood process for changes drives speed and stability, as well as reductions in burnout.”

The report further argues that formal change approval processes – industry norms for years – do not aid stability. Research investigated whether a more formal approval process – in other words, fewer releases – was associated with lower change fail rates. Organisations naturally want to introduce additional processes if problems are encountered with software releases; yet this is a red herring, the report argues.

“Instead, organisations should shift left to peer review-based approval during the development process,” the report explains. “In addition to peer review, automation can be leveraged to detect, prevent, and correct bad changes much earlier in the delivery lifecycle. Techniques such as continuous testing, continuous integration, and comprehensive monitoring and observability provide early and automated detection, visibility, and fast feedback.”

Ultimately, the research gives a simple conclusion: slow and steady wins the race. “One thing is clear: a DevOps transformation is not a passive phenomenon,” the report notes. “No profiles report strong use of a ‘big bang’ strategy – though low performers use this the most often – and that’s probably for the best.

“In our experience, this is an incredibly difficult model to execute and should only be attempted in the most dire of situations, when a ‘full reset’ is needed,” the report adds. “In the ‘big bang’, everyone needs to be on board for the long haul, with resources dedicated to a multi-year journey. This may explain why this method is seen most often among our low performers.”

You can read the full report here (email required).


VMware stokes VMworld fires with Pivotal and Carbon Black acquisitions

Next week is going to be a busy one for VMware as the company’s VMworld event kicks off in San Francisco. There will be plenty of talking points – and the virtualisation and end user computing provider has opted to add two more with the acquisitions of Pivotal Software and Carbon Black for a combined $4.8 billion (£3.93bn).

Pivotal, at $2.7bn, edged out endpoint security provider Carbon Black ($2.1bn) in terms of price, but the rationale for both deals makes sense. Pivotal will fit into the company’s vision around application development, Kubernetes and multi-cloud, as Paul Fazzone, VMware’s SVP and GM of cloud native apps, explained.

“We know Pivotal, and Pivotal knows VMware. VMware and Pivotal share a commitment to the community – collectively we’re a force across Kubernetes, Cloud Foundry and developer technologies,” Fazzone wrote in a blog post. “We share vision – we offer the most complete approach to application modernisation on any cloud.”

This was a view echoed by Pivotal CEO Rob Mee in a letter to employees. “We’ll expand the opportunities for both companies by unifying our software story and deepening our relationships with even more customers,” wrote Mee. “Together we will form an organisation that combines Pivotal’s expertise modernising organisations with VMware’s capabilities and experience operating at scale.”

As far as Carbon Black is concerned, VMware sees the security provider as a more-than-useful foil for its overall goal to be the ‘digital foundation for any cloud, app and device’ as the company puts it. The two firms had previously been partners around application security product AppDefense.

“VMware believes that security is in dire need of transformation,” wrote Tom Corn, SVP/GM security products in a blog. “It needs to shift from a bolted-on model with thousands of point products, agents and appliances, all focused on different points of infrastructure – to a built-in model where the technology is embedded into the cloud and mobile fabric, and security becomes a distributed service rather than point tools.”

The company’s biggest acquisition of 2019 thus far was that of Bitnami, provider of application packaging and delivery services, back in May. VMware has otherwise been focusing a fair part of its strategy on being a facilitator for the hyperscale cloud providers. VMware Cloud on AWS is well-noted, but earlier this month the company extended its deal with Google Cloud.

VMworld US takes place from August 25-29. Take a look at CloudTech’s coverage of the event by visiting the VMware page here.


What to expect from VMworld US 2019


Adam Shepherd

23 Aug, 2019

Time, as the saying goes, makes fools of us all. When I first sat down to write this column on Wednesday afternoon, it had been less than a week since VMware announced its intention to purchase fellow Dell Technologies brand Pivotal. While the news would doubtless be on everyone’s mind, I assumed we wouldn’t actually hear anything about it at this year’s VMworld, and I structured my predictions accordingly. After all, I thought, surely they’re not likely to close such a big acquisition in less than a fortnight.

Well, you know what they say about assumptions.

In what I suspect is a move designed specifically to make my life more difficult, CEO Pat Gelsinger has just confirmed that not only is VMware purchasing Pivotal (for the princely sum of $2.7 billion, no less), it’s also snapping up security firm Carbon Black.

In light of these developments, it doesn’t take a genius to figure out what the big themes of this year’s show are going to be. While we probably won’t learn too much specifically about how Gelsinger intends to integrate his new toys into the company – the deals won’t close until the end of January next year – expect an update on VMware’s strategy and roadmap with security and cloud at the centre.

In terms of specific product updates, public cloud is probably going to feature heavily. Last year’s VMworld events (both European and US variants) saw the company announce deeper and more powerful integrations with Amazon Web Services across multiple products and use-cases, which is likely to continue this year.

Dell Technologies World in April also saw VMware and Microsoft surprise delegates by jointly announcing native VMware support on Azure. Expect more news on that front too, as the two companies deepen their partnership. If I had to guess, I’d bet that part of this will be an integration of some kind between Azure Networking and VMware NSX – a product that will probably be the focus of a significant portion of VMware’s show.

To round out the ‘Big Three’ public cloud providers, I wouldn’t be surprised to hear more news about support for VMware on Google Cloud Platform, either. First announced late last month, the combination of VMware Cloud Foundation with GCP gives VMware pretty much comprehensive coverage of the public cloud landscape.

Moving on from public cloud, the general availability of VMware Cloud on Dell EMC – the company’s subscription-based on-premise data centre as a service offering – is likely to be announced at the event, too. Also announced at Dell Technologies World this year, the service is scheduled for the second half of this year, but has yet to be formally rolled out to customers.

Regardless, the stage is set for a packed-to-bursting show. VMware has gone from strength to strength in recent years, cementing itself as the go-to virtualisation provider and proving its technical nous through smart investments in areas like software-defined networking and containerisation. With two more substantial companies being brought into the fold, we’re interested to see where the company goes from here.