All posts by Adam Shepherd

Pure Storage beefs up cloud support


Adam Shepherd

17 Sep, 2019

Pure Storage has today announced the general availability of new data management tools for Azure and AWS as part of its annual Accelerate conference in Austin, Texas, improving its public cloud support and further strengthening its position in the multi-cloud space.

Starting with AWS, the company has announced that its Cloud Block Store for AWS product, first revealed last year, is now generally available for all customers. The product is a wholly software-based offering, allowing customers to use the company’s Purity management software to manage their AWS storage.

The initial beta version of Cloud Block Store used EC2 compute instances with EBS as a storage layer, but the configuration has since changed. As Pure Storage vice president of strategy Matt Kixmoeller explained, the conclusion was that EBS was not reliable enough for the product’s requirements.

“As we worked closely with Amazon, what we found was that EBS didn’t have the reliability characteristics that a Tier 1 storage array needs,” he said. “In particular, there are challenges around coordinated failures, where multiple volumes can fail at once. And so we completely re-architected the backend layer to run natively on S3. S3 is Amazon’s most durable, most reliable storage tier by far – 11 nines of durability.”

“And so we use EBS as a cache to deliver high performance, but persist data on S3. And if you look at most customers, they really treat S3 as their cloud storage. So this solution becomes a way for us to bring a Tier 1 block experience to use in the Amazon cloud storage S3, that customers are most familiar with, and most trust.”
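The tiering Kixmoeller describes – a fast block cache in front of a durable object store – follows a classic write-through pattern. The sketch below is a generic illustration of that pattern, not Pure’s implementation; the class name and the in-memory dicts are hypothetical stand-ins for the EBS cache tier and the S3 persistence tier.

```python
from collections import OrderedDict

class TieredBlockStore:
    """Illustrative write-through cache over a durable store.

    The dicts stand in for a fast, limited-capacity block cache (EBS
    in the article's description) and a durable object tier (S3).
    This is a sketch of the pattern, not Pure Storage's product.
    """

    def __init__(self, cache_capacity=2):
        self.cache = OrderedDict()   # fast tier, bounded, LRU-evicted
        self.durable = {}            # durable tier: every write lands here
        self.cache_capacity = cache_capacity

    def write(self, block_id, data):
        # Write-through: persist to the durable tier first, so losing
        # the cache never loses acknowledged data.
        self.durable[block_id] = data
        self._cache_put(block_id, data)

    def read(self, block_id):
        if block_id in self.cache:
            self.cache.move_to_end(block_id)   # refresh LRU position
            return self.cache[block_id]
        data = self.durable[block_id]          # cache miss: go to durable tier
        self._cache_put(block_id, data)
        return data

    def _cache_put(self, block_id, data):
        self.cache[block_id] = data
        self.cache.move_to_end(block_id)
        if len(self.cache) > self.cache_capacity:
            self.cache.popitem(last=False)     # evict least recently used
```

The key property is the one Kixmoeller highlights: the cache only affects performance, while correctness rests entirely on the durable tier.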

Part of the goal with the new service is to enable workloads to move seamlessly in both directions: from the cloud to the data centre, as well as from the data centre to the cloud. It uses the same management tools and APIs as Pure’s on-prem management software, and can run across two availability zones in an active/active configuration.

Cloud Block Store for AWS will be available via the AWS Marketplace on either a month-to-month or a one-year contract. Customers who want something more long term can get contracts ranging from one to three years by purchasing through ‘Pure-as-a-Service’, which is a rebranded version of the company’s Evergreen Storage Service, now effectively acting as a subscription-based consumption program.

The other major cloud announcement was the availability of CloudSnap for Azure, a built-in backup mechanism for FlashArray products which lets the Purity management software seamlessly and transparently move snapshots to the public cloud. CloudSnap was initially launched last year with AWS support, but has now been expanded to Azure as well. This, Kixmoeller said, was an excellent example of Pure’s intentions to extend its tools to a multitude of different cloud providers.

“Our strategy at Pure is to absolutely deliver these services as multi-cloud,” he said. “So Cloud Block Store, we started with Amazon – that’s the natural place to start. But as we see more and more adoption, and that gets more mature, and we will of course proliferate to other clouds.”

“It’s not an easy thing for us to snap our fingers and have it available on all three clouds, because we’re doing the hard work of integrating it deeply. And so this is our first example of bringing something to a second cloud.”

As part of the show, the company also announced a capacity-driven flash-based secondary storage appliance with quad-layer cell memory, as well as a new plug-in DirectMemory module for FlashArray//X appliances offering an instant performance boost.

Now is the time to embrace remote working


Adam Shepherd

6 Sep, 2019

I’ll be honest; it’s been a little hard to concentrate on writing this month’s column. As I write, Boris Johnson and the Conservative party have lost their parliamentary majority, somehow plunging the Brexit situation into even more chaos.

This latest phase of the debacle has got me thinking about what will happen to businesses in the event of a no-deal Brexit. The potential negative impacts have been well-documented, from a shutdown of data transfers with the EU to severe delays on international shipments, but the issue that keeps playing on my mind is the strong possibility of a resulting skills crunch.

A clampdown on immigration from EU countries has been high on the list of hardcore Brexiteers’ priorities, which will likely shrink the pool of skilled tech workers entering the country. Even if European developers and specialists aren’t barred from entering the country following Brexit, and those already here aren’t compelled to return home, one could hardly blame them for choosing to take their talents to a more welcoming and less chaotic nation.

A sudden lack of locally-based technology talent is a real possibility that businesses have to confront, but there are ways around it. One is to focus on upskilling or cross-skilling existing staff, but that takes time – time that organisations may not have if the impact of no deal is as sudden as some are predicting.

A better option is to embrace remote working. The fact is that, when it comes to technical roles, there’s very little need for all your staff to work out of a corporate office. Cloud infrastructure platforms and SaaS tools allow companies to manage and administer the vast majority of their IT remotely if they so choose, and even when it comes to physical infrastructure or hands-on IT support, you only really need a small in-house team to effect physical changes, while off-site employees handle the configuration. This is even more true of developers and software engineers, who can be based anywhere in the world and still be just as effective at their jobs.

For many businesses, the biggest worry with moving to remote working is making sure staff remain connected with colleagues and managers, and continue to be engaged with the business. It’s easy for remote workers to feel isolated or ostracised if efforts aren’t made to include them, but collaboration platforms like Slack, Microsoft Teams, Dropbox, Skype and Google Hangouts are all great tools for ensuring they still feel like part of the team.

Rolling out these tools can have benefits for employees outside of IT as well, increasing productivity and efficiency, as well as allowing office-based staff to work flexibly if they want. Ensuring new systems are adopted can be a challenge, of course, but the long-term benefits are worth it.

By making use of these technologies, organisations can make sure they can recruit and retain European tech staff in the event of a no-deal Brexit, but time is of the essence. If the walls go up on 31 October and you don’t already have wheels in motion to implement remote working within your business, you’ll be on the back foot compared to rivals that do. You may be tempted to wait and see how things pan out, but let’s be honest – it’s far better to be prepared.

View From the Airport: VMworld US 2019


Adam Shepherd

30 Aug, 2019

I think it’s fairly safe to say that I picked a good year to visit VMworld US for the first time. While I’ve been to its European equivalent, this was the first year I went to the main event and it was something of a doozy.

Not only did we get a nice bit of pre-conference sizzle with the news that VMware is acquiring Carbon Black and Pivotal, but the entire show was also a festival of product updates and previews. More than anything else, it felt like a statement of intent from Gelsinger and his comrades, setting out the company’s stall for the future.

The big focus of the show – and of VMware’s main announcements – was Kubernetes. The company is betting big on the container technology as the future of application development, with plans to weave it into vSphere with Project Pacific, and use Pivotal and Bitnami’s technology to make VMware even more attractive to Kubernetes developers. Virtually every main-stage announcement featured Kubernetes in some capacity, and VMware veteran Ray O’Farrell is being put in charge of getting that side of the business (including the forthcoming Pivotal integrations) running smoothly.

All the new Kubernetes-based products – Project Pacific, Tanzu and the like – are still in tech preview with no release date in sight and, honestly, that’s probably a good thing. I’m really not sure how many of VMware’s customers are ready to start deploying containers at scale. Mind you, making Kubernetes management a core part of VMware’s capabilities may well go a long way towards encouraging adoption.

It feels like a future-proofing measure more than anything else. Gelsinger is a sharp guy and when he says that containers are the future, he’s not wrong. It may not have reached mass adoption yet, but it’s growing fast, which isn’t surprising given the technology’s proven benefits. This isn’t a pivot though; VMs aren’t going anywhere, as Gelsinger himself has been quick to point out. He notes that all the companies operating Kubernetes at scale – Google, Microsoft, Amazon, et cetera – operate them inside VMs. More to the point, it’ll be a long time yet before Kubernetes gets anywhere close to rivalling VMs in terms of the number of production workloads.

Between the new possibilities promised by Project Pacific, the increasing focus on multi-cloud infrastructures and the forthcoming integration of Carbon Black’s technology into the product line, VMware looks like a company at the absolute top of its game, cementing its dominance of the virtualisation market and paving the way for that dominance to continue long into the future. If Gelsinger, O’Farrell and the rest of the team can pull off everything they’ve promised, then customers and admins have a lot to look forward to.

Carbon Black execs reveal post-acquisition plans


Adam Shepherd

29 Aug, 2019

Last week, VMware threw the tech industry a curveball when CEO Pat Gelsinger announced that not only would it be acquiring Pivotal, as had been announced the previous week, it was also snapping up security firm Carbon Black. While the deal isn’t expected to close until the end of January next year, the company has devoted a substantial chunk of this year’s VMworld conference to discussing the acquisition, and what it means for the future of both companies.

For Carbon Black CEO Patrick Morley, the acquisition presents a huge opportunity for the company to expand its capabilities, and he sees a number of areas where being part of VMware can help it protect its customers in new ways.

“Pending close, I think there’s a number of opportunities,” he tells Cloud Pro. “The biggest one’s end user computing. Management and security go hand in hand, so end user computing is a huge opportunity for us.”

A substantial part of this is integration with Workspace ONE, VMware’s desktop virtualisation product. It’s one of four key integrations with its existing portfolio that VMware has already identified as priorities once the deal goes through. Not only does it make sense from a customer use-case perspective, VMware COO Sanjay Poonen pointed out that many Workspace ONE customers are also Carbon Black customers, a fact which supposedly influenced the decision to acquire the company.

While both VMware and Carbon Black executives have indicated that the company intends to keep the Carbon Black brand alive once the deal closes, and there are no immediate plans to shutter any of its services, Gelsinger told Cloud Pro that the goal is eventually to weave Carbon Black’s technology into VMware’s platform rather than offering it via standalone applications.

“The plans are to bring these integrated solutions together,” he says. “You could imagine you’re going to buy your Workspace ONE with Carbon Black. And these just end up being features. The thing is, we don’t want customers to be ‘buying point products for point use cases’ – buy a platform that gives you lots of those benefits.”

“Customers today will have a Tanium agent, right? And they’ll have a McAfee agent, and they’ll have a Qualys agent. They’ll also have a Workspace ONE agent for management. So I’ve got four agents on the client. I have customers, literally, who have 17 agents on every PC. 17 agents. What are you talking about? One was our goal, as we collapse all of those use cases into one underlying agent.”

Don’t wait, integrate

Being owned by VMware will make Carbon Black a de facto part of the Dell Technologies family, which also opens up other avenues for expanding its endpoint protection.

“Obviously, the Dell family is another capability, because Dell increasingly is providing security to its customers, as part of the laptops and other hardware that they’re providing. And so if we can build security right into that, it’s hugely advantageous too,” Morley says. “You will certainly see us work with Dell – again, pending close – to actually give customers the option to be able to put security right onto the machine, if they so choose.”

If Dell’s business laptops come preloaded with a free subscription to Carbon Black’s endpoint detection and response (EDR) service, this could be hugely beneficial for organisations. The more exciting prospect, however, is the potential impact Carbon Black’s technology can have on application security for VMware customers.

“The second piece is the work that’s already been done around app defence, which is actually building security hooks right into vSphere,” Morley explains.

This integration would enable agentless protection of applications running in vSphere, improving both application performance and detection rates. This would be groundbreaking and, if successfully integrated, has the potential to radically improve the security of organisations running vSphere.

Elsewhere, VMware is planning to integrate Carbon Black’s technology into its NSX platform to provide more in-depth network security analytics, as well as partnering it with another recent acquisition – Secure State – to address security configuration challenges. However, while the acquisition will allow Carbon Black to expand into new kinds of protection, the company executives are also extremely excited about its potential to supercharge its existing services.

One of the linchpins of Carbon Black’s technology is the collection and analysis of security data from all of the endpoints that are running its agent. At the moment, that consists of 15 million endpoints, but if Carbon Black’s agent is incorporated into vSphere or Workspace ONE, that total significantly increases overnight.

Room for growth

“We’re super excited to be able to leverage the reach that Dell EMC and VMware bring to the equation here. I mean, there’s 70,000 partners that we’re going to be able to tap into,” says Carbon Black’s senior vice president of corporate and business development Tom Barsi. “That’s really where you’re talking about adding a zero to the number of customers we’re touching.”

In addition to improving its protection capabilities, this increase in footprint and telemetry will give more fuel than ever to Carbon Black’s Threat Analysis Unit (TAU), which conducts research into security trends as well as analysing new and emerging threat actors and attack methodologies. This research, Morley promises, will most certainly continue and will in fact likely expand once the company joins VMware.

Carbon Black’s executives seem to be exceedingly positive about the prospect of joining the VMware family, which should come as no surprise. Carbon Black has been a technically-focused company since its inception – Morley notes that the company was founded by a team of actual hackers – and this emphasis on technology and engineering is at the core of its new owner’s values.

“I’m really excited,” Carbon Black CTO Scott Lundgren told Cloud Pro. “As CTO, it’s pretty amazing to have an opportunity to work with a highly technical leadership team. It starts with Pat. As ex-CTO of Intel, he’s got a great reputation, and he deserves it. He’s fantastically technical, so he understands the problem. He knows what it takes to actually address it, he can act with confidence, because he knows what’s going on under the hood. But it isn’t just Pat, the whole team is deeply technical [and has] a lot of expertise in a wide variety of technical fields across the board. It’s really great to see.”

“We’re going to have some work to do, obviously, to scale up but it’s very tractable, as long as you’ve got the right mindset at the top – and Pat has that.”

How to build an effective marketing strategy


Adam Shepherd

27 Aug, 2019

In the internet age, digital marketing is an essential part of any business, but it can be hard to keep up with the ever-changing landscape of social media, email marketing and online ads.

Thankfully, there is a huge range of tools you can call on to help build and execute an effective digital marketing strategy.

First things first – don’t be lured into the trap of thinking that digital marketing is just something that you can pass off to an intern, or that it’s as simple as putting out the odd Facebook post. In reality, it’s a full-time job, and one that’s more complicated than you might think.

The first step is to nail down who your target audience is. This can include demographic data such as age, location, job title and industry, as well as their broad interests. This will inform which marketing channels you focus your energies on, as well as the best marketing tactics to use. Ads for fitness gear, for example, are much more likely to be successful on leisure and lifestyle platforms like Instagram and Facebook, rather than a business network like LinkedIn.

You should also work out what your overall goal is. It may be as simple as increasing general brand awareness, but it could also include things like driving product sales, increasing visits to your website and boosting event attendance. Different outcomes will require different strategies.

Planning your content

Planning marketing activity in advance is essential, whether it be social media posts, email blasts or anything else. Not only does this allow you to structure your activity around business events (such as promotional offers or seasonal trends), it also means you can schedule your activity in bulk, so you don’t have to worry about it. You should map out when you’re going to put out posts and emails, as well as deciding on which copy and images you’re going to use beforehand.

Trello, a project management tool based on the Kanban system, is one of the best tools for planning marketing activity, as it includes a calendar view, as well as high levels of customisation and robust filtering systems. For teams that aren’t planning to put out high volumes of marketing material, Google Calendar can also be a useful asset for tracking when it’s scheduled to go out, although its comparative lack of filtering options makes it less suited to organisations with larger needs.

Your content plan should form the basis of your marketing strategy. Although you can respond to trends, current events and spikes in interest on-the-fly, you should take care to follow your content plan as well. Otherwise, your marketing activity runs the risk of becoming unfocused and scattershot, which will make it less effective.

Publishing your posts

Once you’ve planned when you want to do your marketing activity, you’ll need to actually schedule and launch it. There are a variety of tools you can use to do this, each with different advantages and specialisations. For social media, Buffer Publish works well as a one-stop-shop, allowing you to schedule posts on Facebook, Instagram, Twitter, Pinterest and LinkedIn.

Buffer does have some disadvantages, such as the fact that it doesn’t always support each platform’s full feature set. For example, while you can attach an image, video or link to a Facebook post via Buffer, you can’t use it to check into locations, publish polls or tag branded content. Similarly, you can’t use Buffer to publish documents on LinkedIn.

TweetDeck is also worth considering – once independent but now wholly owned by Twitter, it’s among the most useful and fully-featured tools for managing multiple Twitter accounts, and allows you to create custom dashboards with columns for tracking mentions, hashtags, direct messages and more.

Email marketing may be a venerable practice, but its value and importance shouldn’t be overlooked. It can be an excellent way to alert customers to news and offers, and for scheduling and automating emails, Mailchimp is one of the best tools around. You can create a variety of email campaigns, including automated emails for product retargeting, recommendations and abandoned carts. It also includes social media features, although currently only Facebook and Instagram are supported.

These tools can help you execute your content plans quickly and effectively, automating the sending process and freeing you up for other tasks. Your content plan should be used as the basis from which you populate these tools, as it provides a consistent schedule and allows you to build a consistent “brand voice”.

Analysing the impact

A fundamental (and often overlooked) part of digital marketing is the process of periodically analysing and refining the effectiveness of your strategy. It’s no good spending hours crafting social posts and emails if they aren’t offering any real-world benefit, but the only way you’ll know if your strategy is achieving your goals is if you take the time to analyse the results.

The best tools for doing so depend on your primary marketing channels. Facebook, Twitter and LinkedIn all have inbuilt analytics dashboards, which offer the largest and most granular datasets regarding the performance of your social media activity, but they can only show you what actions people have taken on that platform. That’s fine if all you’re interested in is brand awareness, but if your KPIs are based around metrics like conversion rates (such as people completing purchases or signing up for events and newsletters), you’ll need additional ways of measuring them.

If your goal is to drive people to your website, Google Analytics is the gold standard for identifying and categorising the behaviour of visitors to your site. Not only can you track how many people visited your site, what pages they visited and how long they spent on them, you can also see how they arrived on your site.

Google Analytics can show you what proportion of your visitors came from LinkedIn, Twitter, Reddit, Facebook, email newsletters, and so on. This will allow you to focus your social media strategy on the right channels, either focusing your attention on the ones which are most successful, or investing effort in increasing your presence on the channels which have the most room for improvement.

Be careful though: while Google Analytics can segment traffic by source, it can’t distinguish clicks on links posted by your account from clicks on the same link shared by other accounts on the same network. For example, if your Facebook page posts a link to your blog post on cloud storage, and a large number of fans independently post the same link without having seen your original Facebook post, Google Analytics will lump the clickthroughs from all of those posts together.
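One standard workaround (not covered in the piece, but widely used) is to tag the links you post yourself with UTM query parameters, which Google Analytics reads as campaign data; untagged shares of the same URL then fall back to plain referral attribution. A minimal sketch, with hypothetical source and campaign names:

```python
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

def tag_url(url, source, medium="social", campaign=None):
    """Append UTM parameters so analytics can attribute clicks
    to your own posts rather than generic referrals."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))   # preserve any existing parameters
    query["utm_source"] = source
    query["utm_medium"] = medium
    if campaign:
        query["utm_campaign"] = campaign
    return urlunparse(parts._replace(query=urlencode(query)))

# Hypothetical example: the link your Facebook page posts carries the tags,
# while copies shared by fans do not.
tagged = tag_url("https://example.com/blog/cloud-storage",
                 source="facebook", campaign="q3-launch")
```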

Whichever analysis tools you use, the key is to regularly audit how well your planned social activity has performed, and then use the data from that to inform the planning of the next phase – at which point, the whole process starts all over again.

Automation

All of the tools discussed above work exceedingly well together, and should be used in conjunction with each other to build a marketing tech stack which makes creating, executing and maintaining effective marketing campaigns much simpler than doing everything on-the-fly. However, these tools can be made even more powerful by using workflow automation tools to integrate them all together.

Various products exist to perform this function, but few are as accessible, as powerful or as widely supported as Zapier. This easy-to-use but surprisingly deep tool lets you use rules-based conditional logic to construct powerful integrations between different applications. For instance, you can create a rule that means when an event is added to a specific Google Calendar or a card is added to a specific list in Trello, Buffer will automatically create a social post based on the details within that card or event. You could also create an integration with MailChimp which posts to Buffer any time a new subscriber joins your mailing list.
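The trigger-and-action model these rules follow can be sketched in a few lines. This is a generic illustration of rules-based automation, not Zapier’s actual API; the event fields and the Trello-style rule are hypothetical examples.

```python
class RuleEngine:
    """Minimal trigger -> action dispatcher, illustrating the pattern
    tools like Zapier implement (not their real interface)."""

    def __init__(self):
        self.rules = []   # list of (predicate, action) pairs

    def when(self, predicate, action):
        """Register a rule: if predicate(event) is true, run action(event)."""
        self.rules.append((predicate, action))

    def handle(self, event):
        """Run every matching rule and return the actions' results."""
        return [action(event)
                for predicate, action in self.rules
                if predicate(event)]

engine = RuleEngine()
# Hypothetical rule: when a card lands on a "Scheduled" Trello list,
# draft a social post from the card's title.
engine.when(
    lambda e: e.get("type") == "card_added" and e.get("list") == "Scheduled",
    lambda e: f"New post queued: {e['title']}",
)
```

Real automation platforms add persistence, authentication and per-app connectors on top, but the core loop is this simple dispatch.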

When used correctly, automation can be the most effective tool in a digital marketer’s arsenal, helping you to create seamless marketing workflows that semi-autonomously build rich, multi-channel campaigns with minimal oversight required.

CRM, lead generation and paid ads

While it can be used simply for increasing brand awareness, marketing is often intended to directly fuel sales. For businesses that want to deepen the links between their marketing and sales operations, customer relationship management (CRM) software is an excellent way to increase collaboration between the two functions. CRM systems allow you to keep track of data on specific customers, be they companies or individuals, including contact details, recent purchases and what they’re interested in.

Using a CRM system allows you to send previous customers emails and offers for similar products, and gives your sales team a way to track which marketing materials customers have previously engaged with. Numerous cloud-based CRMs are available, including big names like Salesforce and Hubspot, as well as less well-known providers like Zoho and SugarCRM.

CRM systems also work very well with lead generation – a branch of digital marketing which involves providing customers with a resource, such as a free trial of a service, early access to a product or service, or even a technical whitepaper, in exchange for their contact details. These details can then be used to target these customers with pitches or promotions for products they may be interested in, assuming they’ve explicitly consented to being contacted for this purpose.

You can capture these leads in a number of ways – many CRM platforms include tools for building lead capture forms or integrating with third-party form providers. Facebook and LinkedIn, meanwhile, both include built-in lead capture capabilities as part of their advertising toolkits, allowing you to quickly and easily start gathering new leads.

This kind of paid-for advertising can be an excellent marketing method, too. All of the major social media platforms allow businesses of any size to quickly and cheaply set up ad campaigns on their platforms, pushing them out to a wider number of users than their organic posts would reach. This can be a great way to quickly boost visibility of campaigns, particularly ones that align well with specific social network users.

Outside of social networks, Google also offers advertising across many of its products – most notably YouTube and the core search product. You can pre-set your budget, and will only be charged when customers take specific actions such as clicking through to your site.

Used judiciously, paid advertising can highlight the most important elements of your marketing campaigns, working in conjunction with the data from your organic marketing activity to maximise the effectiveness of your key messages.

Digital marketing is a complex field, and one that can seem daunting for the uninitiated; there’s a wealth of different tools, strategies and resources out there. You don’t have to use all of the approaches outlined above – it’s perfectly acceptable to start with one or two and then build out your capabilities as you go – but by integrating your planning, publishing, tracking and analysis tools into a unified marketing tech stack, you can increase the ease, efficiency and effectiveness of your digital marketing capabilities, boosting your activity and creating more value for your organisation.

VMware doubles down on Kubernetes and hybrid cloud


Adam Shepherd

26 Aug, 2019

As part of its annual VMworld conference, software giant VMware has today announced a number of new products aimed at strengthening its position within hybrid cloud and Kubernetes environments, as well as a broad range of updates across its Workspace ONE portfolio.

The main focus was a new portfolio of products dubbed VMware Tanzu, designed to help organisations marry their VMware and Kubernetes deployments. As part of the announcement, the company previewed two of the products which will come under the Tanzu umbrella: Tanzu Mission Control and Project Pacific.

Acting as a single control plane for all an organisation’s Kubernetes clusters, Tanzu Mission Control allows IT teams to manage clusters across public clouds, managed services, vSphere and more, using the same tools that they use to manage their VMs. It will enable one-click policy management for operators, and will feature broad integrations with the rest of VMware’s portfolio to provide tools such as visibility and health monitoring.

The other major announcement was Project Pacific – a new product which will be part of Tanzu when it’s officially launched, and which VMware global field and industry CTO Chris Wolf described as “the most significant innovation that’s come out of our VMware vSphere product line in the last 10 years”.

In essence, Project Pacific is an effort to re-architect VMware’s vSphere management software into a Kubernetes-native platform where organisations can manage both VMs and Kubernetes containers via one control plane. Pacific will also introduce a container runtime to the hypervisor, as well as ESXi native pods designed to blend the best elements of VMs and Kubernetes pods.

“Today, I’m excited to announce Project Pacific: an effort to embed Kubernetes directly into vSphere,” VMware CEO Pat Gelsinger told attendees. “Project Pacific unites vSphere and Kubernetes, and thanks to developers and operators, Project Pacific establishes vSphere as the platform for modern applications.”

“We’re building Kubernetes deep into vSphere, and along the way, we’re actually making vSphere a much better place to run Kubernetes,” added Joe Beda, Kubernetes co-creator and VMware principal engineer. “Operations people have one platform where they can manage all of their resources, including virtual machine and Kubernetes resources. And then application teams will get a self service experience built on proven Kubernetes API apps.”

In fact, Beda revealed, workloads running on ESXi with Project Pacific were capable of running up to 8% faster than on bare metal servers.

While Tanzu and Project Pacific are still technical previews, with no firm release date in sight, VMware advised customers wanting to prepare themselves for the new capabilities to start by adopting PKS, its pre-existing Kubernetes deployment tool. Beda stated that PKS is acting as the on-ramp for Project Pacific, and Wolf noted that “this is your glimpse of the future of PKS”.

Another major announcement made at the show is the launch of VMware’s CloudHealth Hybrid. Based on last year’s acquisition of CloudHealth Technologies, the new tool expands VMware’s CloudHealth monitoring product to cover not just public cloud deployments but also on-prem and hybrid VMware deployments. Customers will be able to set resource usage policies which trigger automatic alerts when violated, as well as generating robust TCO reports for their full cloud environment. The company’s Wavefront observability tool has seen updates too, with aesthetic tweaks, new alert-driven automations and improved Kubernetes monitoring.

The launch of CloudHealth Hybrid is one of a number of announcements and updates around VMware’s hybrid cloud strategy. The company has unveiled vRealize Operations 8.0, which focuses in large part on applying machine learning algorithms and real-time analytics to detect issues and optimise performance with minimal human intervention. On top of this, the company is previewing a new cloud-based version of this software, vRealize Operations Cloud, for its VMware Cloud on AWS customers. Capacity and cost management will also be a major focus, with new planning and optimisation tools across a range of different environments.

Complementary to this is the announcement of another tech preview, codenamed ‘Project Magna’. This is another effort to use AI and machine learning to drive data centre automation, aimed specifically at vSAN. Magna is a self-managing SaaS product which uses real-time data to optimise the read and/or write performance of customers’ vSAN deployments.

In addition to this, VMware’s vRealize Automation 8.0 software also brings improved governance, cloud-agnostic blueprints for service modelling and integrations with a number of DevOps tools. As with vRealize Operations, a SaaS-based version of this software is also being previewed, and the on-premise vRealize Suite 2019 has also been announced, bringing its core components up to the latest version. All of the new vRealize tools, as well as CloudHealth Hybrid, will be available by the end of VMware’s third financial quarter, which falls on 1 November this year. As with Project Pacific, no firm timeline has been given for the availability of Project Magna.

VMware Cloud on Dell EMC, the company’s data centre as a service offering co-engineered with its parent company, has also gone into general availability. Sadly though, this is only for US customers – there’s no firm date for availability in other territories as yet. The company did, however, announce that Equinix will be operating as a hosting partner for the service, allowing customers to take advantage of it through their facilities.

Public cloud was not ignored, however. Following its initial reveal by Gelsinger, Michael Dell and Microsoft CEO Satya Nadella at Dell Technologies World earlier in the year, VMware also announced the availability of native VMware support on Microsoft Azure. The service is available in Western Europe and the US as of today, with territories in Australia and South following by the end of 2019 and additional regions including northern Europe by the end of Q1 next year.

AWS’ status as a VMware ‘preferred partner’ was in little doubt, and the company has announced a raft of new technologies to support customers deploying on Amazon’s cloud. These include new tools courtesy of an expanded partnership with Nvidia. The two companies have worked together to bring Nvidia’s GPU virtualisation technology to VMware’s AWS offering, as well as to its core vSphere product. Designed to power AI and data science workloads, the new tools will allow IT to manage these workloads using the same tools as their other applications, as well as integrating them with other VMware and AWS products.

In addition, the company announced that as of today, VMware Cloud on AWS customers will be able to take advantage of a new Cloud Migration tool. This tool, as well as other cloud workflows, is expected to come to VMware Cloud on Dell EMC and VMware Cloud on AWS Outposts in the future. A disaster recovery as a service offering, built with Dell EMC and AWS S3, will also be introduced for VMware Cloud on AWS by the end of October. These customers can also now take advantage of VMware’s Bitnami-powered Cloud Marketplace, as can the company’s Cloud Provider Partners.

Moving onto VMware’s desktop virtualisation business, the conference has seen a number of product updates and new features announced for Workspace ONE. The most attention-grabbing of these is a new digital assistant, powered by IBM Watson and designed to speed up the onboarding process for new employees, as well as assist with simple level one support tasks like wireless troubleshooting or ticket creation.

It also unveiled a new Workspace ONE Intelligence service called Digital Employee Experience Management, currently in preview, which is designed to help IT automatically detect and resolve problems with employee endpoints through real-time data analysis. In order to aid with remediation efforts, Workspace ONE Assist (previously called Workspace ONE Advanced Remote Management) can now support Windows and macOS devices, in addition to Android, iOS, and Windows CE.

In addition to this, VMware is increasing Workspace ONE’s device support across the board. Support for iOS 13 and iPadOS is planned for later in the autumn, augmented by additional management capabilities for Apple devices; Google Android and Chrome OS devices will benefit from new monitoring and migration tools; and admins can now use a new AirLift migration tool to onboard Windows 10 devices with SCCM collections and GPOs intact.

Finally, VMware Horizon also benefits from new capabilities. VMware Horizon Services for Multi-Cloud brings automatic management tools which allow employees to log into the most suitable virtual workspace, whether it’s hosted in an on-premise or cloud-based environment. New management services will also enable the use of one-to-many package deployment tools, and will surface Horizon data for the purposes of performance management. For customers that are still using a pre-existing perpetual license for Horizon 7, VMware has now introduced the VMware Subscription Upgrade Program for Horizon, which the company promises will allow customers to upgrade to Horizon Universal Licenses “at a price reflecting the original value”.

What to expect from VMworld US 2019


Adam Shepherd

23 Aug, 2019

Time, as the saying goes, makes fools of us all. When I first sat down to write this column on Wednesday afternoon, it had been less than a week since VMware announced its intention to purchase fellow Dell Technologies brand Pivotal. While the news would doubtless be on everyone’s mind, I assumed we wouldn’t actually hear anything about it at this year’s VMworld, and I structured my predictions accordingly. After all, I thought, surely they’re not likely to close such a big acquisition in less than a fortnight.

Well, you know what they say about assumptions.

In what I suspect is a move designed specifically to make my life more difficult, CEO Pat Gelsinger has just confirmed that not only is VMware purchasing Pivotal (for the princely sum of $2.7 billion, no less), it’s also snapping up security firm Carbon Black.

In light of these developments, it doesn’t take a genius to figure out what the big themes of this year’s show are going to be. While we probably won’t learn too much specifically about how Gelsinger intends to integrate his new toys into the company – the deals won’t close until the end of January next year – expect an update on VMware’s strategy and roadmap with security and cloud at the centre.

In terms of specific product updates, public cloud is probably going to feature heavily. Last year’s VMworld events (both European and US variants) saw the company announce deeper and more powerful integrations with Amazon Web Services across multiple products and use-cases, which is likely to continue this year.

Dell Technologies World in April also saw VMware and Microsoft surprise delegates by jointly announcing native VMware support on Azure. Expect more news on that front too, as the two companies deepen their partnership. If I had to guess, I’d bet that part of this will be an integration of some kind between Azure Networking and VMware NSX – a product that will probably be the focus of a significant portion of VMware’s show.

To round out the ‘Big Three’ public cloud providers, I wouldn’t be surprised to hear more news about support for VMware on Google Cloud Platform, either. First announced late last month, the combination of VMware Cloud Foundation with GCP gives VMware pretty much comprehensive coverage of the public cloud landscape.

Moving on from public cloud, the general availability of VMware Cloud on Dell EMC – the company’s subscription-based on-premise data centre as a service offering – is likely to be announced at the event, too. Also announced at Dell Technologies World this year, the service is scheduled for the second half of this year, but has yet to be formally rolled out to customers.

Regardless, the stage is set for a packed-to-bursting show. VMware has gone from strength to strength in recent years, cementing itself as the go-to virtualisation provider and proving its technical nous through smart investments in areas like software-defined networking and containerisation. With two more substantial companies being brought into the fold, I’m interested to see where the company goes from here.

Druva launches intelligent storage tiering for AWS


Adam Shepherd

21 Aug, 2019

Cloud-based data protection firm Druva has today announced a new storage tiering system for AWS, with the aim of helping customers optimise their storage spending across hot and cold storage.

The new system supports AWS’ S3, Glacier and Glacier Deep Archive offerings, and Druva claims that customers can benefit from a potential reduction of up to 50% in total cost of ownership. Clients can either let Druva automatically handle the tiering of their data for a minimum of hassle, or manually specify the tiering system they want to use for closer oversight.

The intelligent storage system also includes a central data management dashboard, machine learning-powered data protection, and one-click policy management actions.
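Druva hasn’t published the mechanics behind its automatic tiering, but the underlying AWS building block for stepping data down from hot to cold storage is an S3 lifecycle configuration. The sketch below builds one; the age thresholds and bucket name are purely illustrative assumptions, not Druva’s actual policy.

```python
import json

def build_tiering_policy(warm_days=30, cold_days=90, frozen_days=365):
    """Build an S3 lifecycle configuration that steps objects down through
    Standard-IA, Glacier and Glacier Deep Archive as they age.
    The day thresholds are illustrative defaults, not vendor-recommended values."""
    return {
        "Rules": [
            {
                "ID": "tier-down-backups",
                "Status": "Enabled",
                "Filter": {"Prefix": ""},  # apply to every object in the bucket
                "Transitions": [
                    {"Days": warm_days, "StorageClass": "STANDARD_IA"},
                    {"Days": cold_days, "StorageClass": "GLACIER"},
                    {"Days": frozen_days, "StorageClass": "DEEP_ARCHIVE"},
                ],
            }
        ]
    }

if __name__ == "__main__":
    policy = build_tiering_policy()
    print(json.dumps(policy, indent=2))
    # Applying it requires AWS credentials and boto3:
    # import boto3
    # boto3.client("s3").put_bucket_lifecycle_configuration(
    #     Bucket="my-backup-bucket", LifecycleConfiguration=policy)
```

The “manual tiering” option described above would amount to the customer picking their own thresholds rather than letting the service choose them.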

“IDC estimates approximately 60% of corporate data is ‘cold,’ about 30% ‘warm’ and 10% ‘hot,'” said Phil Goodwin, director of research at IDC. “Organisations have typically faced a tradeoff between the cost of storing ever increasing amounts of data and the speed at which they can access the data. Druva’s collaboration with AWS will allow organisations to tier data in order to optimise both cost and speed of access. Customers can now choose higher speed for the portion of data that needs it and opt for lower costs for the rest of the data that does not.”

“Enterprises are constantly searching for ways to shift budget to innovation projects,” said Druva’s chief product officer, Mike Palmer. “Driving down the cost of storage and administration is seen by the enterprise as the best opportunity to move money from legacy. Beyond cost-savings, the ability to see multiple tiers of data in a single pane of glass increases control for governance and compliance and eventually analytics, and shows customers that the public cloud architecture decreases risk, cost and enables them to deliver on the promise of data.”

The company also announced the general availability of its disaster recovery as a service product for AWS. Like its storage tiering system, it claims a potential TCO reduction of up to 50%, as well as faster recovery times, easier management and improved reporting functions.

Box: We’re in ‘wait-and-see mode’ with blockchain


Adam Shepherd

25 Jun, 2019

Box CEO Aaron Levie has confirmed that the company has no plans to integrate blockchain technology into its product portfolio, citing the fact that it’s still too early to have a meaningful impact for its customers.

The cloud collaboration company’s co-founder began his keynote at Box’s annual CIO Summit today by joking that he was teaching his new son – who was born late last month – ‘blockchain for babies’ so he is prepared for the future. Joking aside, however, Levie admitted that the company is not actively exploring blockchain technology.

“We have no specific products that we are working on at the moment, however, we do have people within the organisation that are either researching or always evaluating what might make sense,” he said.

“I think, frankly, we’re a little bit in wait and see mode to see where the trends are going. And we would certainly be there from a product standpoint, when we think it’s very meaningful for our customers… In general, it’s still probably early from a market standpoint, and relative to our technology.”

One branch of technology that Box is actively integrating into its portfolio, though, is artificial intelligence (AI) and machine learning (ML). The company has already begun deploying this technology on a limited basis via its Box Skills Kit feature, which allows customers to build their own integrations with other services, but Levie said that the company is planning to weave AI into its products more widely.

In particular, he said, data classification and security are areas in which the company could implement machine learning in order to benefit customers.

“You take something that used to be an unstructured blob of data, that we didn’t really know what was inside of it,” he said. “Now we can structure it, we can better help our customers manage it, and security and governance.”

In future years, the company is also looking into the possibility of adding complex AI to Box Relay to predict things like which action a person should take next as part of a workflow based on contextual information.

“It’s one thing to streamline a business process and describe that business process and software – it’s a whole other thing if you can actually go and automate that business process predictively or intelligently using AI or machine learning. And that’s the holy grail, frankly, for the entire industry,” Levie added. 

“But it’s something that we’re going to be investing quite a bit in, over the three to five year period.”

Welcome to the age of the platform


Adam Shepherd

17 Jun, 2019

It’s hard to argue against the idea that Dropbox has been a hugely influential company. It’s synonymous with consumer cloud storage and was one of the first companies to popularise the concept. In many ways, it was instrumental to the growth of cloud computing as a mainstream technology.

Now, however, it’s changing tack and reinventing itself. No longer content to merely be the place where users and teams store their documents, the company wants to become the connective tissue that links all the digital elements of your working life. As founder and CEO Drew Houston puts it, he wants Dropbox to move from the filing cabinet to the boardroom.

In Houston’s view, work revolves around content; whether it’s contracts, proposals, timesheets, web pages or blueprints, files are the one constant within every business. With that in mind, your filesystem should be at the heart of your workflow. This is fundamentally what the newly-redesigned Dropbox is all about: using your files as a jumping-off point for collaboration within the organisation.

“We see no shortage of opportunity to help kind of build this workspace that organises itself, that lets you use any of the tools you want to use. But instead of being organised around the concept of messaging, we think the starting point is really around the content,” Houston tells me.

“What are people talking about when they’re in Slack? It’s usually content that lives in Dropbox. Or like Salesforce; the salesperson’s day revolves around content that lives in Dropbox, because they’re getting a proposal together, or they’re getting contracts signed. They’re round-tripping with Dropbox all day. And if, instead of having to jump back and forth, we can smooth that over, that’s really valuable.”

The app now boarding at Platform One

It’s fundamentally a platform play. Houston wants Dropbox to be the first thing workers open when they get to their desk and the one they come back to most regularly throughout the day, maximising the amount of time spent in the app and minimising time spent outside it. In order to do that, the company is pursuing a number of integrations with other services such as Zoom, Slack and Jira, which will allow users to take actions within those services without actually leaving the Dropbox app.

“A lot of this fragmentation in the end user experience, IT is also experiencing,” Houston says. “We see a big opportunity to help them kind of wrangle all the different tools that their employees are bringing in and help them get some semblance of control, and visibility back.”

“We want to occupy a little bit of different space and people’s minds, from just being a passive content repository to being the living workspace where you get things done. So moving from the filing cabinet to the conference room, where the difference is, you can still have stuff and content, but then you see people and you can have conversations and you can be up on the whiteboard.”

Dropbox is far from the only company doing this; virtually every other business cloud company is pursuing a similar strategy. Salesforce, Box and even Slack itself have been opening up app stores and bolting on integrations left and right in an effort to make their applications as much of a one-stop-shop for their customers’ needs as possible.

Collaboration has also been a key focus for these companies, incorporating messaging, sharing and other communication functionalities into their products. For some, like Microsoft and Google, this takes the form of integrating their storage platforms with full-blown unified comms solutions or collaboration suites like Teams and Hangouts. For others (including Dropbox), it’s comments and activity tracking.

All this is a clear indication that ‘just’ doing cloud-based file storage really isn’t enough of a differentiator any more. Businesses now require a level of added service: whether it’s curating and cataloguing files or seamlessly integrating those files into broader workflows, customers expect their storage platform to take some of the hassle out of modern business.

One big happy family

The benefit of this shift is a potential increase in customer choice and flexibility; you can use Office 365 for its excellent productivity software, but you can also use Dropbox to store your content without being locked into OneDrive, and use Slack for messaging without having to stick to Teams. As these platforms grow in maturity and their integrations grow deeper and more plentiful, businesses will end up with the ability to combine them in whatever way works for them.

On the other hand, this proliferation also runs the risk of increased fragmentation and creep. Without careful management, organisations could easily find themselves with multiple different collaboration and filesharing platforms, all performing nearly identical roles and with content scattered between them. Cost models and licensing also have to be carefully monitored, lest organisations find themselves paying for multiple platforms that fulfil the same role.

These problems can be solved, though. The answer is deep and genuine collaboration between tech companies, such as using machine learning to detect when a file is shared in a Slack channel and automatically uploading it to the relevant Dropbox folder, or Microsoft and Dropbox jointly offering co-branded packages of Office 365 software and Dropbox storage. These ideas might seem unrealistic, but the growing trend towards cooperation and partnership within the tech industry means they’re not as outlandish as they might once have been.
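To make the Slack-to-Dropbox idea above concrete, here is a minimal sketch of a handler for Slack’s `file_shared` event that copies the shared file into a Dropbox folder. The tokens, folder naming scheme and routing rule are all hypothetical, and real code would need error handling, retries and event deduplication; this only illustrates the shape of the plumbing.

```python
import json
import urllib.request

# Hypothetical credentials -- replace with real bot/app tokens.
SLACK_TOKEN = "xoxb-example"
DROPBOX_TOKEN = "dropbox-example"

def dropbox_upload_request(path, data):
    """Build the Dropbox /2/files/upload HTTP request for a target path."""
    return urllib.request.Request(
        "https://content.dropboxapi.com/2/files/upload",
        data=data,
        headers={
            "Authorization": f"Bearer {DROPBOX_TOKEN}",
            # Dropbox passes upload parameters in this JSON header.
            "Dropbox-API-Arg": json.dumps(
                {"path": path, "mode": "add", "autorename": True}
            ),
            "Content-Type": "application/octet-stream",
        },
        method="POST",
    )

def handle_file_shared(event):
    """React to a Slack `file_shared` event by copying the file to Dropbox."""
    # 1. Look up the file's metadata (including its private download URL).
    info_req = urllib.request.Request(
        f"https://slack.com/api/files.info?file={event['file_id']}",
        headers={"Authorization": f"Bearer {SLACK_TOKEN}"},
    )
    info = json.load(urllib.request.urlopen(info_req))["file"]
    # 2. Download the content; url_private also requires the bearer token.
    dl_req = urllib.request.Request(
        info["url_private"], headers={"Authorization": f"Bearer {SLACK_TOKEN}"}
    )
    content = urllib.request.urlopen(dl_req).read()
    # 3. Upload into a per-channel folder (the naming scheme is an assumption).
    path = f"/slack-shares/{event['channel_id']}/{info['name']}"
    urllib.request.urlopen(dropbox_upload_request(path, content))
```

The interesting design question is step 3: deciding *which* Dropbox folder a file belongs in is where the machine learning mentioned above would come in, replacing the hard-coded path with a learned classifier.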

“We’re thinking excitedly about these opportunities,” Dropbox’s group product manager John Hrvatin tells me. “When you’re talking about productivity tools, that data is housed in these silos, and that’s why you end up having to have 10 search boxes. But we’re going through the effort of creating these partnerships with these key partners, because we want to break down those silos. Just the user experience of joining a meeting, or sharing a file, or managing a Trello board [from Dropbox]; just that is user benefit. But search is the other thing that we can fix with these partnerships. And there really isn’t a good way of doing that – unless you build [them].”

Once tech companies figure out a way not just to co-exist but to genuinely collaborate, a world of possibilities opens up. Imagine a system where every single tool or piece of software you use was genuinely and deeply connected; where one search bar allowed you to simultaneously search your cloud storage, emails, hard drive, Slack workspace, Github repos and every other tool besides. Imagine if machine learning and OCR could work together to automatically title, tag and organise files – no more vague titles like ‘proposalV3.doc’ or searching through 18 sub-folders because someone’s mis-filed a crucial document.

These sound like utopian pipe dreams, but it’s all technically feasible. All tech companies have to do is stop trying to beat each other to the finish line and start helping each other – and if Dropbox’s latest pivot is anything to go by, that might happen sooner than we think.

“We’re really at the beginning of that journey,” Dropbox CTO Quentin Clarke says. “But we’re actually uniquely positioned here, because of our neutrality, because we’re so committed to being a system player, we’re going to have access to information – because of these integrations with Zoom, Slack, etc – that no other players will even have. So our ability to make something good of that using machine learning is actually super high. And that’s something I think you should keep an eye on over the coming months and years as we continue to roll out on top of this foundation.”