All posts by Adam Shepherd

Microsoft, not Amazon, is going to win the cloud wars

Adam Shepherd

12 Dec, 2019

Brace yourselves, because I’m about to share a theory that may be a little unpopular: I believe it’s only a matter of time before Microsoft Azure overtakes AWS as the dominant force in the world of public cloud. 

I know that may sound crazy, and many of you are probably already reaching for the ‘close tab’ button, but hear me out. 

It’s no secret that Bezos’ cloud computing division is currently sitting pretty as market leader, having capitalised incredibly effectively on its first-mover advantage while its rivals’ initial efforts stalled. By cementing its reputation as the biggest force in the cloud industry, it has attracted a number of high-profile customers, but it has struggled to make a major splash within large, established enterprises.

You know who hasn’t, though? Microsoft.

While AWS has always been a favourite of startups and developers, Microsoft has concentrated firmly on the enterprise and met with remarkable success. To sweeten the deal, Microsoft has also been busily releasing a number of business-friendly features, such as its Azure Arc platform, which is designed to make it easier to consume and deploy its services across a large enterprise estate. In fact, any time I’ve spoken to a CIO who hasn’t yet moved to the cloud but is planning to, Azure has been a key part of their roadmap.

The stated reason for this is usually “well, it works with all of our existing systems”, which is a simple yet compelling point; if your on-prem servers are primarily running workloads like Active Directory, SQL Server and Exchange Server instances, opting for Microsoft’s cloud platform is sort of a no-brainer. Add in the fact that most large businesses are likely to be using Microsoft’s Office and Windows software (and even potentially Windows Server) and the logic becomes apparent.

More importantly, however, Microsoft has learned how to play nicely with others. Azure has always been a more open platform than most have given it credit for, but the addition in recent years of full native support for the likes of Linux and VMware show just how far it’s come. It’s making a real effort to be as flexible as possible, allowing customers to run the workloads that they want in the way they want to run them. 

This includes multi-cloud environments, which is the new hotness for businesses that want to avoid vendor lock-in and increase redundancy protection. Microsoft is more than happy to support multi-cloud deployments, if that’s what the customer wants. 

Amazon? Not so much. As we discussed on a recent episode of the IT Pro Podcast, recent reports suggest that AWS partners are banned from even using the term multi-cloud, presumably on the basis that – as the current top of the pile – AWS has little to gain from giving customers the option of using multiple providers, which only increases the risk that they’ll ditch it for a better option. Note that in that scenario, the emphasis is not so much on giving customers the best possible option but on trying to hide from them the fact that other providers exist.

Amazon is undoubtedly on the cutting edge as far as tech development goes; its pioneering work on machine learning, serverless computing and function-as-a-service tools is evidence enough of that. It’s enterprise support that will determine the true winner of the cloud wars, however, and in this area, AWS is leagues behind Microsoft.

Android gets new security sandboxing features

Adam Shepherd

18 Oct, 2019

Google has brought new security features to web users on Android, with the integration of browser sandboxing capabilities to its Chrome app.

As of Chrome version 77, Android users are now protected by ‘Site Isolation’. This sandboxing feature ensures that web pages from different sites are run as separate processes, isolating them from one another and reducing the risk of side-channel attacks like the Spectre flaw.

This feature has been active on desktop instances of Chrome for some time, and the Android version is somewhat slimmed-down by comparison; in order to reduce performance overheads, Site Isolation is only enabled for password-protected sites, where users may be at risk of having their credentials stolen. This will help lessen the impact of the feature on smartphone speeds, particularly for cheaper devices with less RAM.

On desktop platforms, meanwhile, the existing sandboxing features have also been strengthened. In addition to side-channel attacks, Chrome can now defend against attacks involving a fully-compromised renderer process.

To coincide with this, the company is temporarily expanding its bug bounty programme to offer greater rewards for bugs involving Site Isolation, including cross-site data disclosure attacks that involve compromised renderers.

Sandboxing is a common security measure, and refers to the process of isolating an environment from neighbouring systems in order to prevent the spread of harmful activity. Sandboxed environments are commonly used by researchers to analyse malware activity, as they allow the malware to be studied without risking the security of the rest of the network or operating system.

Pure Storage beefs up cloud support

Adam Shepherd

17 Sep, 2019

Pure Storage has today announced the general availability of new data management tools for Azure and AWS as part of its annual Accelerate conference in Austin, Texas, improving its public cloud support and further strengthening its position in the multi-cloud space.

Starting with AWS, the company has announced that its Cloud Block Store for AWS product, first revealed last year, is now generally available for all customers. The product is a wholly software-based offering, allowing customers to use the company’s Purity management software to manage their AWS storage.

The initial beta version of Cloud Block Store used EC2 compute instances with EBS as a storage layer, but the configuration has since changed. As Pure Storage vice president of strategy Matt Kixmoeller explained, the conclusion was that EBS was not reliable enough for the product’s requirements.

“As we worked closely with Amazon, what we found was that EBS didn’t have the reliability characteristics that a Tier 1 storage array needs,” he said. “In particular, there are challenges around coordinated failures, where multiple volumes can fail at once. And so we completely re-architected the backend layer to run natively on S3. S3 is Amazon’s most durable, most reliable storage tier by far – 11 nines of durability.”

“And so we use EBS as a cache to deliver high performance, but persist data on S3. And if you look at most customers, they really treat S3 as their cloud storage. So this solution becomes a way for us to bring a Tier 1 block experience to use in the Amazon cloud storage S3, that customers are most familiar with, and most trust.”
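Kixmoeller’s “11 nines” figure is easier to appreciate with a quick back-of-envelope calculation. Treating it, as AWS does, as an annual per-object durability of 99.999999999%, the expected losses at a given scale work out as follows (the object count below is purely illustrative):

```python
# Back-of-envelope: what "11 nines" (99.999999999%) of annual per-object
# durability implies. The object count is illustrative, not from the article.
durability = 0.99999999999          # 11 nines
annual_loss_prob = 1 - durability   # roughly 1e-11 per object, per year

objects_stored = 10_000_000
expected_losses_per_year = objects_stored * annual_loss_prob  # ~0.0001

# At this scale, that works out to one lost object every ~10,000 years.
years_per_loss = 1 / expected_losses_per_year
print(round(years_per_loss))
```

In other words, even a customer storing ten million objects would, on average, expect to lose a single object once every ten millennia – the basis for treating S3 as the durable persistence layer behind the EBS cache.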

Part of the goal with the new service is to enable workloads to move seamlessly in both directions; from the cloud to the data centre, as well as from the data centre to the cloud. It uses the same management tools and APIs as Pure’s on-prem management software, as well as featuring the ability to run across two availability zones in active/active configuration.

Cloud Block Store for AWS will be available via the AWS Marketplace on either a month-to-month or a one-year contract. Customers who want something more long term can get contracts ranging from one to three years by purchasing through ‘Pure-as-a-Service’, which is a rebranded version of the company’s Evergreen Storage Service, now effectively acting as a subscription-based consumption program.

The other major cloud announcement was the availability of CloudSnap for Azure, a built-in backup mechanism for FlashArray products which lets the Purity management software seamlessly and transparently move snapshots to the public cloud. CloudSnap was initially launched last year with AWS support, but has now been expanded to Azure as well. This, Kixmoeller said, was an excellent example of Pure’s intentions to extend its tools to a multitude of different cloud providers.

“Our strategy at Pure is to absolutely deliver these services as multi-cloud,” he said. “So Cloud Block Store, we started with Amazon – that’s the natural place to start. But as we see more and more adoption, and that gets more mature, and we will of course proliferate to other clouds.”

“It’s not an easy thing for us to snap our fingers and have it available on all three clouds, because we’re doing the hard work of integrating it deeply. And so this is our first example of bringing something to a second cloud.”

As part of the show, the company also announced a capacity-driven flash-based secondary storage appliance with quad-layer cell memory, as well as a new plug-in DirectMemory module for FlashArray//X appliances offering an instant performance boost.

Now is the time to embrace remote working

Adam Shepherd

6 Sep, 2019

I’ll be honest; it’s been a little hard to concentrate on writing this month’s column. As I write, Boris Johnson and the Conservative party have lost their parliamentary majority, somehow plunging the Brexit situation into even more chaos.

This latest phase of the debacle has got me thinking about what will happen to businesses in the event of a no-deal Brexit. The potential negative impacts have been well-documented, from a shutdown of data transfers with the EU to severe delays on international shipments, but the issue that keeps playing on my mind is the strong possibility of a resulting skills crunch.

A clampdown on immigration from EU countries has been high on the list of hardcore Brexiteers’ priorities, which will likely reduce the pool of skilled tech workers entering the country. Even if European developers and specialists aren’t barred from entering the country following Brexit, and the ones already here not compelled to return home, one could hardly blame them for choosing to take their talents to a more welcoming and less chaotic nation.

A sudden lack of locally-based technology talent is a real possibility that businesses have to confront, but there are ways around it. One is to focus on upskilling or cross-skilling existing staff, but that takes time – time that organisations may not have if the impact of no deal is as sudden as some are predicting.

A better option is to embrace remote working. The fact is that, when it comes to technical roles, there’s very little need for all your staff to work out of a corporate office. Cloud infrastructure platforms and SaaS tools allow companies to manage and administer the vast majority of their IT remotely if they so choose, and even when it comes to physical infrastructure or hands-on IT support, you only really need a small in-house team to effect physical changes, while off-site employees handle the configuration. This is even more true when it comes to developers and software engineers, who can be based anywhere in the world and still be just as effective at their jobs.

For many businesses, the biggest worry with moving to remote working is making sure staff remain connected with colleagues and managers, and continue to be engaged with the business. It’s easy for remote workers to feel isolated or ostracised if efforts aren’t made to include them, but collaboration platforms like Slack, Microsoft Teams, Dropbox, Skype and Google Hangouts are all great tools for ensuring they still feel like part of the team.

Rolling out these tools can have benefits for employees outside of IT as well, increasing productivity and efficiency, as well as allowing office-based staff to work flexibly if they want. Ensuring new systems are adopted can be a challenge, of course, but the long-term benefits are worth it.

By making use of these technologies, organisations can make sure they can recruit and retain European tech staff in the event of a no-deal Brexit, but time is of the essence. If the walls go up on 31 October and you don’t already have wheels in motion to implement remote working within your business, you’ll be on the back foot compared to rivals that do. You may be tempted to wait and see how things pan out, but let’s be honest – it’s far better to be prepared.

View From the Airport: VMworld US 2019

Adam Shepherd

30 Aug, 2019

I think it’s fairly safe to say that I picked a good year to visit VMworld US for the first time. While I’ve been to its European equivalent, this was the first year I went to the main event and it was something of a doozy.

Not only did we get a nice bit of pre-conference sizzle with the news that VMware is acquiring Carbon Black and Pivotal, but the entire show was also a festival of product updates and previews. More than anything else, it felt like a statement of intent from CEO Pat Gelsinger and his comrades, setting out the company’s stall for the future.

The big focus of the show – and of VMware’s main announcements – was Kubernetes. The company is betting big on the container technology as the future of application development, with plans to weave it into vSphere with Project Pacific, and use Pivotal and Bitnami’s technology to make VMware even more attractive to Kubernetes developers. Virtually every main-stage announcement featured Kubernetes in some capacity, and VMware veteran Ray O’Farrell is being put in charge of getting that side of the business (including the forthcoming Pivotal integrations) running smoothly.

All the new Kubernetes-based products – Project Pacific, Tanzu and the like – are still in tech preview with no release date in sight and, honestly, that’s probably a good thing. I’m really not sure how many of VMware’s customers are ready to start deploying containers at scale. Mind you, making Kubernetes management a core part of VMware’s capabilities may well go a long way towards encouraging adoption.

It feels like a future-proofing measure more than anything else. Gelsinger is a sharp guy and when he says that containers are the future, he’s not wrong. It may not have reached mass adoption yet, but it’s growing fast, which isn’t surprising given the technology’s proven benefits. This isn’t a pivot though; VMs aren’t going anywhere, as Gelsinger himself has been quick to point out. He notes that all the companies operating Kubernetes at scale – Google, Microsoft, Amazon, et cetera – operate them inside VMs. More to the point, it’ll be a long time yet before Kubernetes gets anywhere close to rivalling VMs in terms of the number of production workloads.

Between the new possibilities promised by Project Pacific, the increasing focus on multi-cloud infrastructures and the forthcoming integration of Carbon Black’s technology into the product line, VMware looks like a company at the absolute top of its game, cementing its dominance of the virtualisation market and paving the way for that dominance to continue long into the future. If Gelsinger, O’Farrell and the rest of the team can pull off everything they’ve promised, then customers and admins have a lot to look forward to.

Carbon Black execs reveal post-acquisition plans

Adam Shepherd

29 Aug, 2019

Last week, VMware threw the tech industry a curveball when CEO Pat Gelsinger announced that not only would it be acquiring Pivotal, as had been announced the previous week, it was also snapping up security firm Carbon Black. While the deal isn’t expected to close until the end of January next year, the company has devoted a substantial chunk of this year’s VMworld conference to discussing the acquisition, and what it means for the future of both companies.

For Carbon Black CEO Patrick Morley, the acquisition presents a huge opportunity for the company to expand its capabilities, and he sees a number of areas where being part of VMware can help it protect its customers in new ways.

“Pending close, I think there’s a number of opportunities,” he tells Cloud Pro. “The biggest one’s end user computing. Management and security go hand in hand, so end user computing is a huge opportunity for us.”

A substantial part of this is integration with Workspace ONE, VMware’s desktop virtualisation product. It’s one of four key integrations with its existing portfolio that VMware has already identified as priorities once the deal goes through. Not only does it make sense from a customer use-case perspective, but, as VMware COO Sanjay Poonen pointed out, many Workspace ONE customers are also Carbon Black customers – a fact which supposedly influenced the decision to acquire the company.

While both VMware and Carbon Black executives have indicated that the company intends to keep the Carbon Black brand alive once the deal closes, and there are no immediate plans to shutter any of its services, Gelsinger told Cloud Pro that the goal is eventually to weave Carbon Black’s technology into VMware’s platform rather than offering it via standalone applications.

“The plans are to bring these integrated solutions together,” he says. “You could imagine you’re going to buy your Workspace ONE with Carbon Black. And these just end up being features. The thing is, we don’t want customers to be ‘buying point products for point use cases’ – buy a platform that gives you lots of those benefits.”

“Customers today will have a Tanium agent, Right? And they’ll have a McAfee agent, and they’ll have a Qualys agent. They’ll also have a Workspace ONE agent for management. So I’ve got four agents on the client. I have customers literally, who have 17 agents on every PC. 17 agents. What are you talking about? One was our goal, as we collapse all of those use cases into one underlying agent.”

Don’t wait, integrate

Being owned by VMware will make Carbon Black a de facto part of the Dell Technologies family, which also opens up other avenues for expanding its endpoint protection.

“Obviously, the Dell family is another capability, because Dell increasingly is providing security to its customers, as part of the laptops and other hardware that they’re providing. And so if we can build security right into that, it’s hugely advantageous too,” Morley says. “You will certainly see us work with Dell – again, pending close – to actually give customers the option to be able to put security right onto the machine, if they so choose.”

If Dell’s business laptops come preloaded with a free subscription to Carbon Black’s endpoint detection and response (EDR) service, this could be hugely beneficial for organisations. The more exciting prospect, however, is the potential impact Carbon Black’s technology can have on application security for VMware customers.

“The second piece is the work that’s already been done around app defence, which is actually building security hooks right into vSphere,” Morley explains.

This integration would enable agentless protection of applications running in vSphere, improving both application performance and detection rates. This would be groundbreaking and, if successfully integrated, has the potential to radically improve the security of organisations running vSphere.

Elsewhere, VMware is planning to integrate Carbon Black’s technology into its NSX platform to provide more in-depth network security analytics, as well as partnering it with another recent acquisition – Secure State – to address security configuration challenges. However, while the acquisition will allow Carbon Black to expand into new kinds of protection, the company executives are also extremely excited about its potential to supercharge its existing services.

One of the linchpins of Carbon Black’s technology is the collection and analysis of security data from all of the endpoints that are running its agent. At the moment, that consists of 15 million endpoints, but if Carbon Black’s agent is incorporated into vSphere or Workspace ONE, that total significantly increases overnight.

Room for growth

“We’re super excited to be able to leverage the reach that Dell EMC and VMware bring to the equation here. I mean, there’s 70,000 partners that we’re going to be able to tap into,” says Carbon Black’s senior vice president of corporate and business development Tom Barsi. “That’s really where you’re talking about adding a zero to the number of customers we’re touching.”

In addition to improving its protection capabilities, this increase in footprint and telemetry will give more fuel than ever to Carbon Black’s Threat Analysis Unit (TAU), which conducts research into security trends as well as analysing new and emerging threat actors and attack methodologies. This research, Morley promises, will most certainly continue and will in fact likely expand once the company joins VMware.

Carbon Black’s executives seem to be exceedingly positive about the prospect of joining the VMware family, which should come as no surprise. Carbon Black has been a technically-focused company since its inception – Morley notes that the company was founded by a team of actual hackers – and this emphasis on technology and engineering is at the core of its new owner’s values.

“I’m really excited,” Carbon Black CTO Scott Lundgren told Cloud Pro. “As CTO, it’s pretty amazing to have an opportunity to work with a highly technical leadership team. It starts with Pat. As ex-CTO of Intel, he’s got a great reputation, and he deserves it. He’s fantastically technical, so he understands the problem. He knows what it takes to actually address it, he can act with confidence, because he knows what’s going on under the hood. But it isn’t just Pat, the whole team is deeply technical [and has] a lot of expertise in a wide variety of technical fields across the board. It’s really great to see.”

“We’re going to have some work to do, obviously, to scale up but it’s very tractable, as long as you’ve got the right mindset at the top – and Pat has that.”

How to build an effective marketing strategy

Adam Shepherd

27 Aug, 2019

In the internet age, digital marketing is an essential part of any business, but it can be hard to keep up with the ever-changing landscape of social media, email marketing and online ads.

Thankfully, there is a huge range of tools you can call on to help build and execute an effective digital marketing strategy.

First things first – don’t be lured into the trap of thinking that digital marketing is just something that you can pass off to an intern, or that it’s as simple as putting out the odd Facebook post. In reality, it’s a full-time job, and one that’s more complicated than you might think.

The first step is to nail down who your target audience is. This can include demographic data such as age, location, job title and industry, as well as their broad interests. This will inform which marketing channels you focus your energies on, as well as the best marketing tactics to use. Ads for fitness gear, for example, are much more likely to be successful on leisure and lifestyle platforms like Instagram and Facebook, rather than a business network like LinkedIn.

You should also work out what your overall goal is. It may be as simple as increasing general brand awareness, but it could also include things like driving product sales, increasing visits to your website and boosting event attendance. Different outcomes will require different strategies.

Planning your content

Planning marketing activity in advance is essential, whether it be social media posts, email blasts or anything else. Not only does this allow you to structure your activity around business events (such as promotional offers or seasonal trends), it also means you can schedule your activity in bulk, so you don’t have to worry about it. You should map out when you’re going to put out posts and emails, as well as deciding beforehand on which copy and images you’re going to use.

Trello, a project management tool based on the Kanban system, is one of the best tools for planning marketing activity, as it includes a calendar view, as well as high levels of customisation and robust filtering systems. For teams that aren’t planning to put out high volumes of marketing material, Google Calendar can also be a useful asset for tracking when it’s scheduled to go out, although its comparative lack of filtering options makes it less suited to organisations with large needs.

Your content plan should form the basis of your marketing strategy. Although you can respond to trends, current events and spikes in interest on-the-fly, you should take care to follow your content plan as well. Otherwise, your marketing activity runs the risk of becoming unfocused and scattershot, which will make it less effective.

Publishing your posts

Once you’ve planned when you want to do your marketing activity, you’ll need to actually schedule and launch it. There are a variety of tools you can use to do this, each with different advantages and specialisations. For social media, Buffer Publish works well as a one-stop-shop, allowing you to schedule posts on Facebook, Instagram, Twitter, Pinterest and LinkedIn.

Buffer does have some disadvantages, such as the fact that it doesn’t always support each platform’s full feature set. For example, while you can attach an image, video or link to a Facebook post via Buffer, you can’t use it to check into locations, publish polls or tag branded content. Similarly, you can’t use Buffer to publish documents on LinkedIn.

Tweetdeck is also worth considering – once independent but now wholly owned by Twitter, it’s among the most useful and fully-featured tools for managing multiple Twitter accounts, and allows you to create custom dashboards with columns for tracking mentions, hashtags, direct messages and more.

Email marketing may be a venerable practice, but its value and importance shouldn’t be overlooked. It can be an excellent way to alert customers to news and offers, and for scheduling and automating emails, Mailchimp is one of the best tools around. You can create a variety of email campaigns, including automated emails for product retargeting, recommendations and abandoned carts. It also includes social media features, although currently only Facebook and Instagram are supported.

These tools can help you execute your content plans quickly and effectively, automating the sending process and freeing you up for other tasks. Your content plan should be used as the basis from which you populate these tools, as it provides a consistent schedule and allows you to build a consistent “brand voice”.

Analysing the impact

A fundamental (and often overlooked) part of digital marketing is the process of periodically analysing and refining the effectiveness of your strategy. It’s no good spending hours crafting social posts and emails if they aren’t offering any real-world benefit, but the only way you’ll know if your strategy is achieving your goals is if you take the time to analyse the results.

The best tools for doing so depend on your primary marketing channels. Facebook, Twitter and LinkedIn all have inbuilt analytics dashboards, which offer the largest and most granular datasets regarding the performance of your social media activity, but they can only show you what actions people have taken on that platform. That’s fine if all you’re interested in is brand awareness, but if your KPIs are based around metrics like conversion rates (such as people completing purchases or signing up for events and newsletters), you’ll need additional ways of measuring them.

If your goal is to drive people to your website, Google Analytics is the gold standard for identifying and categorising the behaviour of visitors to your site. Not only can you track how many people visited your site, what pages they visited and how long they spent on them, you can also see how they arrived on your site.

Google Analytics can show you what proportion of your visitors came from LinkedIn, Twitter, Reddit, Facebook, email newsletters, and so on. This will allow you to focus your social media strategy on the right channels, either focusing your attention on the ones which are most successful, or investing effort in increasing your presence on the channels which have the most room for improvement.

Be careful, though: while Google Analytics can segment traffic by source, it can’t identify which traffic has come from posts on your account versus other accounts on the same network. For example, if your Facebook page posts a link to your blog post on cloud storage, and a large number of fans independently post the same link without having seen your original Facebook post, Google Analytics will lump any clickthroughs from any of those posts together under the same banner.
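One common way to mitigate this attribution problem – standard Google Analytics practice, though not something the tools above do for you automatically – is to tag the links you post from your own accounts with UTM campaign parameters, which Analytics reads from the URL. A minimal sketch in Python; the domain and campaign names are hypothetical:

```python
from urllib.parse import urlencode, urlsplit, urlunsplit

def tag_link(url: str, source: str, medium: str, campaign: str) -> str:
    """Append Google Analytics UTM parameters to a URL.

    Links shared from your own accounts carry the tags; organic shares of
    the bare URL won't, so the two streams can be told apart in Analytics.
    """
    parts = urlsplit(url)
    params = urlencode({
        "utm_source": source,      # the network, e.g. facebook, twitter
        "utm_medium": medium,      # the channel type, e.g. social, email
        "utm_campaign": campaign,  # your own campaign label
    })
    query = f"{parts.query}&{params}" if parts.query else params
    return urlunsplit((parts.scheme, parts.netloc, parts.path, query, parts.fragment))

# Hypothetical example URL and campaign name:
print(tag_link("https://example.com/blog/cloud-storage",
               "facebook", "social", "q3-cloud-push"))
# https://example.com/blog/cloud-storage?utm_source=facebook&utm_medium=social&utm_campaign=q3-cloud-push
```

Most scheduling tools, Buffer included, will happily post tagged URLs, so this slots into the publishing workflow described earlier without extra tooling.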

Whichever analysis tools you use, the key is to regularly audit how well your planned social activity has performed, and then use the data from that to inform the planning of the next phase – at which point, the whole process starts all over again.


All of the tools discussed above work exceedingly well together, and should be used in conjunction with each other to build a marketing tech stack which makes creating, executing and maintaining effective marketing campaigns much simpler than doing everything on-the-fly. However, these tools can be made even more powerful by using workflow automation tools to integrate them all together.

Various products exist to perform this function, but few are as accessible, as powerful or as widely supported as Zapier. This easy-to-use but surprisingly deep tool lets you use rules-based conditional logic to construct powerful integrations between different applications. For instance, you can create a rule that means when an event is added to a specific Google Calendar or a card is added to a specific list in Trello, Buffer will automatically create a social post based on the details within that card or event. You could also create an integration with MailChimp which posts to Buffer any time a new subscriber joins your mailing list.
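To make the trigger→action pattern concrete, here is a toy sketch of the kind of rules-based logic these tools automate. The event names and actions below are hypothetical stand-ins, not Zapier’s actual API:

```python
# A toy trigger -> action rule engine, illustrating the pattern a tool like
# Zapier automates. Event names and actions are hypothetical examples.

rules = {}  # maps a trigger name to the list of actions it fires

def on(trigger):
    """Register the decorated function as an action for the given trigger."""
    def register(action):
        rules.setdefault(trigger, []).append(action)
        return action
    return register

def fire(trigger, payload):
    """Run every action registered for a trigger, collecting the results."""
    return [action(payload) for action in rules.get(trigger, [])]

@on("trello.card_added")
def queue_social_post(card):
    # In a real integration this step would call the scheduler's API.
    return f"Queued post: {card['title']}"

@on("mailchimp.new_subscriber")
def announce_subscriber(sub):
    return f"Queued post: welcome aboard, {sub['name']}!"

print(fire("trello.card_added", {"title": "Launch-day announcement"}))
# ['Queued post: Launch-day announcement']
```

The value of a hosted service, of course, is that it maintains the authenticated connections to Trello, Buffer, Mailchimp and the rest for you – the rules are the only part you write.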

When used correctly, automation can be the most effective tool in a digital marketer’s arsenal, helping you to create seamless marketing workflows that semi-autonomously build rich, multi-channel campaigns with minimal oversight required.

CRM, lead generation and paid ads

While it can be used simply for increasing brand awareness, marketing is often intended to directly fuel sales. For businesses that want to deepen the links between their marketing and sales operations, customer relationship management (CRM) software is an excellent way to increase collaboration between the two functions. CRM systems allow you to keep track of data on specific customers, be they companies or individuals, including contact details, recent purchases and what they’re interested in.

Using a CRM system allows you to send previous customers emails and offers for similar products, and gives your sales team a way to track which marketing materials customers have previously engaged with. Numerous cloud-based CRMs are available, including big names like Salesforce and Hubspot, as well as less well-known providers like Zoho and SugarCRM.

CRM systems also work very well with lead generation – a branch of digital marketing which involves providing customers with a resource, such as a free trial of a service, early access to a product or service, or even a technical whitepaper, in exchange for their contact details. These details can then be used to target these customers with pitches or promotions for products they may be interested in, assuming they’ve explicitly consented to being contacted for this purpose.

You can capture these leads in a number of ways – many CRM platforms include tools for building lead capture forms or integrating with third-party form providers. Facebook and LinkedIn, meanwhile, both include built-in lead capture capabilities as part of their advertising toolkits, allowing you to quickly and easily start gathering new leads.

This kind of paid-for advertising can be an excellent marketing method, too. All of the major social media platforms allow businesses of any size to quickly and cheaply set up ad campaigns on their platforms, pushing them out to a wider number of users than their organic posts would reach. This can be a great way to quickly boost the visibility of campaigns, particularly those whose target audience aligns well with a specific network’s user base.

Outside of social networks, Google also offers advertising across many of its products – most notably YouTube and the core search product. You can pre-set your budget, and will only be charged when customers take specific actions such as clicking through to your site.

Judicious use of paid advertising can be used to highlight the most important elements of your marketing campaigns, used in conjunction with the data from your organic marketing activity to maximise the effectiveness of your key messages.

Digital marketing is a complex field, and one that can seem daunting for the uninitiated; there’s a wealth of different tools, strategies and resources out there. You don’t have to use all of the approaches outlined above – it’s perfectly acceptable to start with one or two and then build out your capabilities as you go – but by integrating your planning, publishing, tracking and analysis tools into a unified marketing tech stack, you can increase the ease, efficiency and effectiveness of your digital marketing capabilities, boosting your activity and creating more value for your organisation.

VMware doubles down on Kubernetes and hybrid cloud

Adam Shepherd

26 Aug, 2019

As part of its annual VMworld conference, software giant VMware has today announced a number of new products aimed at strengthening its position within hybrid cloud and Kubernetes environments, as well as a broad range of updates across its Workspace ONE portfolio.

The main focus was a new portfolio of products dubbed VMware Tanzu, designed to help organisations marry their VMware and Kubernetes deployments. As part of the announcement, the company previewed two of the products which will come under the Tanzu umbrella: Tanzu Mission Control and Project Pacific.

Acting as a single control plane for all an organisation's Kubernetes clusters, Tanzu Mission Control allows IT teams to manage clusters across public clouds, managed services, vSphere and more, using the same tools that they use to manage their VMs. It will enable one-click policy management for operators, and will feature broad integrations with the rest of VMware's portfolio to provide capabilities such as visibility and health monitoring.
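VMware hasn't published API details for Mission Control yet, so as a purely hypothetical sketch of the "one-click policy" idea, the pattern is a fan-out: one policy definition rendered into a per-cluster manifest for every cluster under management, wherever it runs. None of the names below come from VMware's actual product.

```python
def fan_out_policy(policy, clusters):
    """Render a single policy definition into one manifest per managed
    cluster, so operators define it once and apply it everywhere."""
    return [
        {"cluster": name, "kind": "NetworkPolicy", "spec": dict(policy)}
        for name in clusters
    ]

# Hypothetical fleet spanning on-prem vSphere and two public clouds.
clusters = ["vsphere-onprem", "aws-managed", "azure-managed"]
manifests = fan_out_policy({"denyExternalIngress": True}, clusters)
print(len(manifests))  # one manifest per cluster: 3
```

The value of a tool like Mission Control is precisely that this fan-out, plus drift detection and reporting, is handled centrally rather than scripted per cluster.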

The other major announcement was Project Pacific – a new product which will be part of Tanzu when it’s officially launched, and which VMware global field and industry CTO Chris Wolf described as “the most significant innovation that’s come out of our VMware vSphere product line in the last 10 years”.

In essence, Project Pacific is an effort to re-architect VMware’s vSphere management software into a Kubernetes-native platform where organisations can manage both VMs and Kubernetes containers via one control plane. Pacific will also introduce a container runtime to the hypervisor, as well as ESXi native pods designed to blend the best elements of VMs and Kubernetes pods.

“Today, I’m excited to announce Project Pacific: an effort to embed Kubernetes directly into vSphere,” VMware CEO Pat Gelsinger told attendees. “Project Pacific unites vSphere and Kubernetes, and thanks to developers and operators, Project Pacific establishes vSphere as the platform for modern applications.”

“We’re building Kubernetes deep into vSphere, and along the way, we’re actually making vSphere a much better place to run Kubernetes,” added Joe Beda, Kubernetes co-creator and VMware principal engineer. “Operations people have one platform where they can manage all of their resources, including virtual machine and Kubernetes resources. And then application teams will get a self service experience built on proven Kubernetes API apps.”

In fact, Beda revealed, workloads running on ESXi with Project Pacific were capable of running up to 8% faster than on bare metal servers.

Tanzu and Project Pacific are still only technical previews, with no information on when they might be commercially released, but VMware advised customers wanting to prepare for the new capabilities to start by adopting PKS, its pre-existing Kubernetes deployment tool. Beda stated that PKS is acting as the on-ramp for Project Pacific, and Wolf noted that "this is your glimpse of the future of PKS".

Another major announcement made at the show is the launch of VMware’s CloudHealth Hybrid. Based on last year’s acquisition of CloudHealth Technologies, the new tool expands VMware’s CloudHealth monitoring product to cover not just public cloud deployments but also on-prem and hybrid VMware deployments. Customers will be able to set resource usage policies which trigger automatic alerts when violated, as well as generating robust TCO reports for their full cloud environment. The company’s Wavefront observability tool has seen updates too, with aesthetic tweaks, new alert-driven automations and improved Kubernetes monitoring.

The launch of CloudHealth Hybrid is one of a number of announcements and updates around VMware's hybrid cloud strategy. The company has unveiled vRealize Operations 8.0, which applies machine learning algorithms and real-time analytics to detect issues and optimise performance with minimal human intervention. On top of this, the company is previewing a new cloud-based version of this software, vRealize Operations Cloud, for its VMware Cloud on AWS customers. Capacity and cost management will also be a major focus, with new planning and optimisation tools across a range of different environments.

Complementary to this is the announcement of another tech preview, codenamed ‘Project Magna’. This is another effort to use AI and machine learning to drive data centre automation, aimed specifically at vSAN. Magna is a self-managing SaaS product which uses real-time data to optimise the read and/or write performance of customers’ vSAN deployments.

In addition to this, VMware's vRealize Automation 8.0 software brings improved governance, cloud-agnostic blueprints for service modelling and integrations with a number of DevOps tools. As with vRealize Operations, a SaaS-based version of this software is also being previewed, and the on-premise vRealize Suite 2019 has also been announced, bringing its core components up to the latest version. All of the new vRealize tools, as well as CloudHealth Hybrid, will be available by the end of VMware's third financial quarter, which falls on 1 November this year. As with Project Pacific, no firm timeline has been given for the availability of Project Magna.

VMware Cloud on Dell EMC, the company’s data centre as a service offering co-engineered with its parent company, has also gone into general availability. Sadly though, this is only for US customers – there’s no firm date for availability in other territories as yet. The company did, however, announce that Equinix will be operating as a hosting partner for the service, allowing customers to take advantage of it through their facilities.

Public cloud was not ignored, however. Following its initial reveal by Gelsinger, Michael Dell and Microsoft CEO Satya Nadella at Dell Technologies World earlier in the year, VMware also announced the availability of native VMware support on Microsoft Azure. The service is available in Western Europe and the US as of today, with further territories including Australia following by the end of 2019, and additional regions including northern Europe by the end of Q1 next year.

AWS’ status as a VMware ‘preferred partner’ was in little doubt, and the company has announced a raft of new technologies to support customers deploying on Amazon’s cloud. These include new tools courtesy of an expanded partnership with Nvidia. The two companies have worked together to bring Nvidia’s GPU virtualisation technology to VMware’s AWS offering, as well as to its core vSphere product. Designed to power AI and data science workloads, the new tools will allow IT to manage these workloads using the same tools as their other applications, as well as integrating them with other VMware and AWS products.

In addition, the company announced that as of today, VMware Cloud on AWS customers will be able to take advantage of a new Cloud Migration tool. This tool, as well as other cloud workflows, are expected to come to VMware Cloud on Dell EMC and VMware Cloud on AWS Outposts in the future. A disaster recovery as a service offering, built with Dell EMC and AWS S3, will also be introduced for VMware Cloud on AWS by the end of October. These customers can also now take advantage of VMware’s Bitnami-powered Cloud Marketplace, as can the company’s Cloud Provider Partners.

Moving onto VMware’s desktop virtualisation business, the conference has seen a number of product updates and new features announced for Workspace ONE. The most attention-grabbing of these is a new digital assistant, powered by IBM Watson and designed to speed up the onboarding process for new employees, as well as assist with simple level one support tasks like wireless troubleshooting or ticket creation.

It also unveiled a new Workspace ONE Intelligence service called Digital Employee Experience Management, designed to help IT automatically detect and resolve problems with employee endpoints through real-time data analysis, which is currently in preview. In order to aid with remediation efforts, Workspace ONE Assist (previously called Workspace ONE Advanced Remote Management) can now support Windows and macOS devices, in addition to Android, iOS, and Windows CE.

In addition to this, VMware is increasing Workspace ONE's device support across the board: support for iOS 13 and iPadOS is planned for later in the autumn, augmented by additional management capabilities for Apple devices; Google Android and ChromeOS devices will benefit from new monitoring and migration tools; and admins can now use a new AirLift migration tool to onboard Windows 10 devices with SCCM collections and GPOs intact.

Finally, VMware Horizon also benefits from new capabilities. VMware Horizon Services for Multi-Cloud brings automatic management tools which allow employees to log into the most suitable virtual workspace, whether it’s hosted in an on-premise or cloud-based environment. New management services will also enable the use of one-to-many package deployment tools, and will surface Horizon data for the purposes of performance management. For customers that are still using a pre-existing perpetual license for Horizon 7, VMware has now introduced the VMware Subscription Upgrade Program for Horizon, which the company promises will allow customers to upgrade to Horizon Universal Licenses “at a price reflecting the original value”.

What to expect from VMworld US 2019

Adam Shepherd

23 Aug, 2019

Time, as the saying goes, makes fools of us all. When I first sat down to write this column on Wednesday afternoon, it had been less than a week since VMware announced its intention to purchase fellow Dell Technologies brand Pivotal. While the news would doubtless be on everyone’s mind, I assumed we wouldn’t actually hear anything about it at this year’s VMworld, and I structured my predictions accordingly. After all, I thought, surely they’re not likely to close such a big acquisition in less than a fortnight.

Well, you know what they say about assumptions.

In what I suspect is a move designed specifically to make my life more difficult, CEO Pat Gelsinger has just confirmed that not only is VMware purchasing Pivotal (for the princely sum of $2.7 billion, no less), it's also snapping up security firm Carbon Black.

In light of these developments, it doesn’t take a genius to figure out what the big themes of this year’s show are going to be. While we probably won’t learn too much specifically about how Gelsinger intends to integrate his new toys into the company – the deals won’t close until the end of January next year – expect an update on VMware’s strategy and roadmap with security and cloud at the centre.

In terms of specific product updates, public cloud is probably going to feature heavily. Last year’s VMworld events (both European and US variants) saw the company announce deeper and more powerful integrations with Amazon Web Services across multiple products and use-cases, which is likely to continue this year.

Dell Technologies World in April also saw VMware and Microsoft surprise delegates by jointly announcing native VMware support on Azure. Expect more news on that front too, as the two companies deepen their partnership. If I had to guess, I'd bet that part of this will be an integration of some kind between Azure Networking and VMware NSX – a product that will probably be the focus of a significant portion of VMware's show.

To round out the ‘Big Three’ public cloud providers, I wouldn’t be surprised to hear more news about support for VMware on Google Cloud Platform, either. First announced late last month, the combination of VMware Cloud Foundation with GCP gives VMware pretty much comprehensive coverage of the public cloud landscape.

Moving on from public cloud, the general availability of VMware Cloud on Dell EMC – the company's subscription-based on-premise data centre as a service offering – is likely to be announced at the event, too. Also unveiled at Dell Technologies World in April, the service is scheduled for the second half of the year, but has yet to be formally rolled out to customers.

Regardless, the stage is set for a packed-to-bursting show. VMware has gone from strength to strength in recent years, cementing itself as the go-to virtualisation provider and proving its technical nous through smart investments in areas like software-defined networking and containerisation. With two more substantial companies being brought into the fold, we’re interested to see where the company goes from here.

Druva launches intelligent storage tiering for AWS

Adam Shepherd

21 Aug, 2019

Cloud-based data protection firm Druva has today announced a new storage tiering system for AWS, with the aim of helping customers optimise their storage spending across hot and cold storage.

The new system supports AWS’ S3, Glacier and Glacier Deep Archive offerings, and Druva claims that customers can benefit from a potential reduction of up to 50% in total cost of ownership. Clients can either let Druva automatically handle the tiering of their data for a minimum of hassle, or manually specify the tiering system they want to use for closer oversight.
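Druva's tiering engine itself is proprietary, but the AWS primitive underneath this kind of product is an S3 lifecycle configuration that transitions objects between storage classes over time. The sketch below builds one as a plain dictionary; the bucket prefix and day thresholds are illustrative assumptions, not Druva's actual defaults.

```python
# An S3 lifecycle rule moving backup objects from standard S3 (hot)
# to Glacier (cold) after 30 days, then Glacier Deep Archive after 180.
lifecycle_config = {
    "Rules": [
        {
            "ID": "tier-backups",
            "Status": "Enabled",
            "Filter": {"Prefix": "backups/"},
            "Transitions": [
                {"Days": 30, "StorageClass": "GLACIER"},
                {"Days": 180, "StorageClass": "DEEP_ARCHIVE"},
            ],
        }
    ]
}

# Applying it would use boto3 (not run here, as it needs AWS credentials):
# import boto3
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="my-backup-bucket", LifecycleConfiguration=lifecycle_config)

days = [t["Days"] for t in lifecycle_config["Rules"][0]["Transitions"]]
print(days)  # [30, 180]
```

What Druva layers on top is choosing those thresholds automatically from access patterns, rather than leaving the customer to hard-code them.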

The intelligent storage system also includes a central data management dashboard, machine learning-powered data protection, and one-click policy management actions.

“IDC estimates approximately 60% of corporate data is ‘cold,’ about 30% ‘warm’ and 10% ‘hot,'” said Phil Goodwin, director of research at IDC. “Organisations have typically faced a tradeoff between the cost of storing ever increasing amounts of data and the speed at which they can access the data. Druva’s collaboration with AWS will allow organisations to tier data in order to optimise both cost and speed of access. Customers can now choose higher speed for the portion of data that needs it and opt for lower costs for the rest of the data that does not.”
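The economics behind Goodwin's point are easy to sanity-check with IDC's 60/30/10 cold/warm/hot split. The per-GB-month prices below are illustrative assumptions loosely in line with 2019 US list prices, and the calculation ignores retrieval charges, which eat into savings for frequently restored data.

```python
# Illustrative $/GB-month prices (assumed, not quoted from AWS or Druva).
PRICES = {"hot": 0.023, "warm": 0.004, "cold": 0.00099}
SPLIT = {"hot": 0.10, "warm": 0.30, "cold": 0.60}  # IDC's estimated mix

all_hot = PRICES["hot"]  # cost per GB if everything stays in hot storage
tiered = sum(SPLIT[tier] * PRICES[tier] for tier in SPLIT)
saving = 1 - tiered / all_hot
print(f"storage-cost saving: {saving:.0%}")  # comfortably above 50%
```

Even with pessimistic assumptions about retrieval fees, the raw storage-cost gap leaves plenty of headroom for the "up to 50% TCO" claim.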

“Enterprises are constantly searching for ways to shift budget to innovation projects,” said Druva’s chief product officer, Mike Palmer. “Driving down the cost of storage and administration is seen by the enterprise as the best opportunity to move money from legacy. Beyond cost-savings, the ability to see multiple tiers of data in a single pane of glass increases control for governance and compliance and eventually analytics, and shows customers that the public cloud architecture decreases risk, cost and enables them to deliver on the promise of data.”

The company also announced the general availability of its disaster recovery as a service product for AWS. Like its storage tiering system, it also claims a potential TCO reduction of up to 50%, as well as faster recovery times, easier management and improved reporting functions.