All posts by James

More organisations moving from proof of concept to initial SD-WAN projects

One in five global companies has implemented an initial software-defined wide area networking (SD-WAN) project, while many more are at the proof of concept stage.

This is according to survey results from Teneo, an ‘as a service’ technology provider. The study, conducted alongside Sapio Research, polled 200 senior IT and networking managers in the US and UK and found that increasing pressure on company resources and budgets is pushing companies to examine the potential of SD-WAN, with growing network complexity also cited.

More than a third of organisations’ IT budget is spent on upkeep tasks, according to the report, while another third of respondents said they were using ‘as a service’ models to keep on top of maintenance. What’s more, companies are ‘shrewdly blending connectivity options’ to help beef up their network performance, with 38% of respondents wanting more MPLS, 22% wanting more internet connectivity, and 20% wanting internet and MPLS combined.

SD-WAN is therefore being seen as a viable option. 39% of companies polled said they were looking at global networking vendors for their implementations, while 24% apiece are looking at telecoms providers and management consultancies. Only 8% of those polled said they were looking for a specialist SD-WAN vendor.

“Network managers are looking at SD-WAN strategies to run multiple networking environments in standardised ways – whether the underlying motivation is greater simplicity, cost efficiency or transforming critical applications’ performance across their company’s operations,” said Marc Sollars, CTO of Teneo.

“Many firms are clearly putting a toe in the water on SD-WAN, or doing a proof of concept, but it’s still very hard to say when this test phase will start to translate into enterprise-level implementations,” added Sollars. “In many ways, the broad range of choice that SD-WAN brings is what’s causing companies to hesitate over their decisions.”

According to a study from IDC earlier this month, the overall SD-WAN infrastructure market will be worth $4.5 billion by 2022, with the analyst firm describing it as ‘one of the fastest industry transformations seen in years.’

Cloudian raises $94 million in series E round, says object storage ‘an idea whose time truly has come’

San Mateo-headquartered cloud storage provider Cloudian has announced it has raised $94 million (£73m) in a series E funding round, hailing it as a validation that object storage is ‘an idea whose time truly has come.’

Cloudian’s business case is built around the benefits of object storage, which provides greater scalability than block storage and is better suited to unstructured data, with the metadata attached to each object making it easier to identify and classify.
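To make the metadata point concrete, here is a minimal sketch using the S3 API, which (as noted later in this piece) has become a de facto industry standard that Cloudian’s platform is compatible with. The endpoint, bucket and keys below are illustrative assumptions rather than anything specific to Cloudian’s product.

```python
import boto3

# Illustrative only: the endpoint URL and bucket name are hypothetical placeholders.
s3 = boto3.client("s3", endpoint_url="https://object-store.example.com")

# Each object carries user-defined metadata alongside the payload, which is
# what makes large pools of unstructured data easier to identify and classify.
s3.put_object(
    Bucket="telemetry-archive",
    Key="race-42/car-7/lap-003.json",
    Body=b'{"speed_kph": 312, "tyre_temp_c": 101}',
    Metadata={"team": "example-f1", "session": "practice", "sensor": "telemetry"},
)

# The metadata can be read back without downloading the object itself.
head = s3.head_object(Bucket="telemetry-archive", Key="race-42/car-7/lap-003.json")
print(head["Metadata"])  # {'team': 'example-f1', 'session': 'practice', 'sensor': 'telemetry'}
```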

The funding, which for this round included input from Goldman Sachs and NTT DOCOMO Ventures among others, is not quite as simple as x raises y. Back in March, the big headline – as this publication duly reported – was that Cloudian had raised $125 million. Indeed it had – but $100m of that was ‘consumption-based financing’ – a financial buffer, if you will – with $25m as equity. This round can be seen as additional to the $25m already raised. Either way, the company’s funding now stands at $173 million.

As the technological landscape matures, with more enterprises looking to move data into the cloud, the benefits of object storage become even clearer. “We’ve seen this space grow significantly,” Jon Toor, Cloudian’s CMO, tells CloudTech, citing recent IDC figures which showed the global enterprise storage market grew 34% during the first quarter of 2018. “The significant growth in enterprise storage year over year indicates there’s a lot of data moving into data centres. People are looking for new solutions.”

The theory is simple but compelling: because Cloudian is built on cloud technologies, the storage can live anywhere. This comes in especially handy when data is collected in a variety of places.

Michael Tso, CEO of Cloudian, explains that the concept of ‘data gravity’ is key. If data has been created somewhere, for instance in a factory or by a security camera, it is difficult to move over large distances. “The need of having cloud-like storage technology, not in some centralised cloud but near where you’re creating this data, because it’s hard to move a lot of it, is one concept why object storage is really needed,” Tso tells CloudTech.

Case in point: Cloudian has recently picked up two leading Formula 1 teams as customers, which exemplifies this approach well. “They keep everything,” explains Tso. “All the data from the sensors in the car, all the practice runs, from all the races, all the video – that data will never get deleted, they can go back for different things, they can simulate and so forth.

“That’s a very good application for what we do – all the data eventually, no matter where it’s stored or where it comes from, is going to end up on a Cloudian platform because that is the final stop.”

From building a grand total of one car per year – well, two if you include both drivers – to millions: another recent Cloudian customer is a leading automotive manufacturer, with the deployment in the process of being rolled out worldwide. “They are using us to store all of the data for their structure automation, sensors, all the designs – it’s a one stop shop for all their data storage,” says Tso.

Perhaps surprisingly for a US-based cloud software provider, Cloudian has been seeing significant traction in Europe – to the extent that the company now has more customers on the continent than in the US, with revenues split approximately 50/50. Tso explains that his vision was ‘the world is flat [and] everything is connected’, and so growth was always intended to be organic. “When we started the company, we put one sales side in Europe, one in the US and one in Asia [and told them] – go for it,” he says. “Whoever made the biggest numbers got to hire the next guy.”

The increased importance of data security in Europe, chiefly thanks to the likes of GDPR, has been another boon. “That’s really I think what is driving the forces behind increased adoption of hybrid IT, where we are a key player,” says Tso. “Being able to bridge this cloud and on-prem, being able to control data flow and data access at a very fine granularity.

“Rather than in the public cloud, which has this approach of one size fits all, we’re seeing data security and privacy moving in different directions,” Tso adds. “The leadership of the European industry and government in that area is really driving our expansion into Europe.”

Earlier this year, the company identified three trends driving the rise of object storage in 2018: the growth of artificial intelligence and the Internet of Things, massive data growth, and the rise of Amazon S3’s API as an industry standard. It would appear investors agree. “We believe Cloudian is well positioned to dominate the next generation of enterprise storage with its elegantly simple design that integrates both the data centre and cloud environments,” said Edouard Hervey, managing director at Goldman Sachs, in a statement.

Read more: Cloudian: On acquiring Infinity Storage, multi-cloud and machine learning

VMworld 2018: Multi-cloud strategies, AWS partnership blossoms, vSAN and NSX updates, and more

At VMworld in Las Vegas, VMware CEO Pat Gelsinger, with the help of some of the best and brightest in the cloud industry, expanded on partnerships and products, as well as the evolution of the multi-cloud landscape.

For the second year running, Andy Jassy, CEO of Amazon Web Services (AWS), took to the stage to give an update on AWS’ growing partnership with VMware, alongside new features. The announcement of an expansion of VMware Cloud on AWS to Asia-Pacific was good – Jassy told the audience the service would ‘largely’ be available across all regions, including GovCloud, by late 2019 – but even better was the announcement of Amazon Relational Database Service (RDS) on VMware.

“You’ll be able to provision databases, you’ll be able to scale the compute, or the memory, or the storage for those database instances, you’ll be able to patch the operating system or the database engines,” said Jassy. “I think it’s very exciting for our customers and I think it’s also a good example of where we’re continuing to deepen the partnership and listen to what customers want, and then innovate on their behalf.”

The service, which will be available in a few months, aims to bring the capabilities of setting up relational databases in the cloud to databases managed in VMware’s on-premises environment. If users decide these databases would be better off longer-term on AWS, there is a smooth migration path in place.
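The on-premises variant was not generally available at the time and its tooling had not been made public, so as a rough illustration of the ‘capabilities of setting up relational databases in the cloud’ Jassy describes, this is approximately what provisioning and scaling a standard RDS instance looks like with the boto3 SDK. All identifiers and sizes here are hypothetical, and this is not the RDS on VMware API itself.

```python
import boto3

rds = boto3.client("rds", region_name="us-east-1")

# Provision a small managed MySQL instance; RDS handles patching,
# backups and storage behind this single API call.
rds.create_db_instance(
    DBInstanceIdentifier="demo-db",          # hypothetical name
    DBInstanceClass="db.t3.micro",           # compute/memory can be scaled later
    Engine="mysql",
    AllocatedStorage=20,                     # GiB, also scalable after creation
    MasterUsername="admin",
    MasterUserPassword="change-me-please-123",
)

# Scaling compute or storage later is a modify call rather than a rebuild.
rds.modify_db_instance(
    DBInstanceIdentifier="demo-db",
    DBInstanceClass="db.t3.small",
    ApplyImmediately=True,
)
```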

Jassy cited MIT as a key customer for the primary use case of VMware Cloud on AWS – migrating on-premises applications to the cloud. The university has been able to migrate 3,000 VMs from its data centres to the two companies’ joint solution, taking only three months to do so.

The partnership between VMware and AWS, first announced two years ago and updated last year, is evidently blossoming. So much so that VMware half-borrowed a concept from a previous AWS keynote for its own. Whereas Jassy framed his 2016 re:Invent speech on superpowers – supersonic speed, immortality, x-ray vision – and how AWS seemingly enables them, Gelsinger focused on tech superpowers – cloud, mobile, AI/ML, and edge/IoT.

More importantly, while these technologies are changing the way we live and work, they work even better in tandem. “We really see that each one of them is a superpower in their own right, but they’re making each other more powerful,” said Gelsinger. “Cloud enables mobile connectivity, mobile creates more data, more data makes the AI better, AI enables more edge use cases, and more edge requires more cloud to store the data and do the computing.

“They’re reinforcing each other – these superpowers are reshaping every aspect of society.”

VMware’s long-standing vision, therefore, of ‘any device, any application, any cloud’, with intrinsic security, plays into this. Much was said about VMware Cloud Foundation, which the company sees as “the simplest path to the hybrid cloud”, as Gelsinger put it. The quickest way to get there is through hyperconverged infrastructure, and to that end VMware announced an update to vSAN aimed at easing adoption through simplified operations and more efficient infrastructure. The figures touted at the keynote – more than 15,000 customers and 50% of the Global 2000 – are impressive; Gelsinger said vSAN was “clearly becoming the standard for how hyperconverged is done in the industry.”

Another product announcement, this time multi-cloud flavoured, came in the form of upgrades to the VMware NSX networking and security portfolio. According to the company’s earnings call last week, more than four in five of the Fortune 100 have now adopted NSX. NSX-T Data Center 2.3, which is expected to be available before November, will again aim to give customers greater ease of deployment, as well as extend multi-cloud networking and security to AWS and Microsoft Azure, ‘empowering customers that operate across multiple public clouds to take advantage of local availability zones and the unique services of different cloud providers’, as the press materials put it.

One more multi-cloud themed piece of news was that VMware had acquired Boston-based cloud service management provider CloudHealth Technologies. CloudHealth offers a variety of capabilities in its platform, from streamlined billing to scaling out, enabling organisations to manage their workloads across AWS, Microsoft Azure, Google Cloud Platform and VMware.

The company said last June, when it raised its series D funding, that the IPO market was something it would watch as it aimed to ‘become the anchor company at the centre of the Boston software technology ecosystem for decades to come.’ Earlier this year, CloudHealth announced ‘significant investments’ in Europe. Writing in a blog post, founder and CTO Joe Kinsella said he ‘was gratified to learn early on in discussions that VMware and CloudHealth Technologies share a most important strand of corporate DNA: customer-first.’

“As part of VMware, we will be able to serve you better and offer you a richer set of choices to support your business transformation in the cloud,” Kinsella added.

Ultimately, Gelsinger sees multi-cloud as ‘the next act’ in VMware’s 20-year history, after the server era, BYOD, the network, and cloud migration. The VMware CEO cited a survey from Deloitte which stated the average business today was using eight public clouds.

“As you’re managing different tools, different teams, different architectures – how do you bridge across?” he asked. “This is what we will do in the multi-cloud era – we will help our community to bridge across and take advantage of these powerful cycles of innovation that are going on, but be able to use them across a consistent infrastructure and operational environment.”

You can check out the full list of VMworld news here.

Picture credits: VMware/Screenshot

Cloud hyperscaler capex broke $53 billion for the first half of 2018, says Synergy Research

The capital expenditure of the largest cloud infrastructure players continues to rise – and according to Synergy Research, the first half of 2018 has seen record figures being published.

Total capex for the first half of this year among the hyperscale operators hit $53 billion (£41.1bn), compared with $31bn this time last year. Q2’s figures did not quite match Q1’s, but this is down to an anomaly, Synergy argues: Google completed its $2.4bn purchase of Manhattan’s Chelsea Market building in March, inflating the first quarter’s total.

The top five spenders remain Google, Microsoft, Facebook, Apple and Amazon – and have been for the past 10 quarters. Between them, these five companies account for more than 70% of hyperscale capex. Quarterly spending only broke $15bn for the first time in the third quarter of 2016, with a particular ramp over the past 12 months.

It is worth noting too that the list of hyperscale players tracked has been trimmed from 24 to 20. Synergy explains that this is down to a variety of factors, from some companies being subsumed into others, such as LinkedIn, to others not spending enough on capex to justify inclusion. In some cases, this is because they are moving more of their workloads onto AWS and Azure, to the detriment of their own data centre footprint.

Regardless, this is yet another indicator that the largest players in cloud infrastructure are not resting on their laurels. As this publication has reported, Google has looked to expansion in Finland and Singapore in the past three months, while Microsoft, in an experimental move, put a data centre underwater off the Orkney Islands.

“Hyperscale capex is one of the clearest indicators of the growth in cloud computing, digital enterprise and online lifestyles,” said John Dinsdale, a chief analyst at Synergy. “Capex has reached levels that were previously unthinkable for these massive data centre operators and it continues to climb.

“The largest of these hyperscale operators are building economic moats that smaller competitors have no chance of replicating,” Dinsdale added.

Cloud adoption in EMEA continues healthy growth – but security not on the same page

Cloud adoption in EMEA continues to grow significantly, but security awareness is not growing with it, according to new research from Bitglass.

The cloud access security broker (CASB), in its latest report, used company domains to identify cloud apps deployed in 20,000 organisations across Europe.

Less than half (47%) of organisations analysed had a single sign-on (SSO) tool in use, according to the data; while this is higher than the one in four companies globally that use one, it still leaves serious room for improvement, according to Bitglass. SSO was most widely adopted in education (64% of organisations), followed by biotech (54%), healthcare (53.7%) and finance (53.5%).

Practically every EMEA organisation analysed had deployed more than one cloud app, with many having at least a productivity app, file sync and share and cloud messaging platforms, alongside infrastructure as a service (IaaS).

When it comes to AWS, adoption in EMEA far exceeds the global usage rate: worldwide, only 13.8% of organisations use AWS, compared with 21.8% in EMEA. “Many firms in EMEA are also early adopters, willing to try new methods of custom app deployment like AWS that seem promising and are growing rapidly,” the report notes. In terms of software, Office 365 continues to outpace G Suite, with 65% and 19.2% adoption respectively in 2018 – a marked change from 2016 (43% and 22%).

“The results of this survey reinforce what we found in our 2016 study,” said Rich Campagna, CMO at Bitglass. “Organisations in EMEA are embracing cloud productivity apps but still lack the security tools necessary to protect data.

“In cloud-first environments, security must evolve to protect data on many more endpoints and in many more applications,” added Campagna.

According to another piece of research issued by CenturyLink this week, the cloud computing market is forecast to reach $411 billion by 2020. The cloud services segment in Germany alone – defined as SaaS, PaaS and IaaS – is predicted to hit more than $20bn.

Diversity, transferable skills and upgrading skill sets: The keys to cloud employability in 2018

There are plenty of new jobs being created in the cloud sector – but organisations need to think long-term around securing the cloud skills their business needs.

That is according to a new report from UK technical recruiter Experis. The company, whose latest Tech Cities Job Watch focuses on the cloud sector, explores the skills required as well as the employers who may be looking for them.

According to the report, Google offers the most attractive salary out of the four biggest cloud providers from the Q2 data, with £71,701. This is just ahead of AWS (£70,090), followed by IBM (£68,251) and Microsoft Azure (£64,647). For contractors, AWS, with £502 per day, has the best rates, ahead of IBM (£492), Azure (£471) and Google (£445).

The vast majority of roles advertised in the UK, however, relate to the big two. AWS, with more than half (54%) of all jobs advertised, and Azure (41%) together made up 95% of postings, with Google (4%) and IBM (1%) trailing far behind.

The report gives five takeaways for businesses looking to secure the right cloud skills: be willing to offer attractive remuneration; offer the opportunity to upgrade existing skillsets; cross-train staff, with SQL and MySQL cited as transferable skills; consider outsourcing; and address the diversity imbalance.

This publication has long explored the problem of the cloud skills gap, with various reasons blamed for demand outstripping supply. According to a study from Skytap published in June, organisations can be ‘their own worst enemy’ in this regard, with many wanting to migrate to the cloud by refactoring or rewriting existing applications – the approach requiring the highest level of IT skill.

According to Firebrand Training technical writer Alex Bennett, security, machine learning and artificial intelligence (AI), and serverless architectures are among the key skills employees need in 2018 to get noticed – a view that echoes Experis’ emphasis on transferable, cross-platform skill sets. “The key to employability in today’s cloud jobs market is to gain cross-platform skills,” Bennett wrote. “By transferring your knowledge between cloud platforms, you’ll diversify your skillset and boost your employability in 2018.”

“In spite of the difficulties, business models are becoming increasingly reliant on the cloud to function,” wrote Dave Hannah, Experis brand leader, in his foreword to the report. “As a result, demand for cloud talent is soaring, as employers recognise that they need to recruit specialists to lead adoption and integration.

“Since cloud adoption is only set to increase in the years ahead, we can expect the war for talent to intensify,” added Hannah. “In a market like this, it’s important that organisations take a long-term view of how they will stay ahead of the competition and secure the skills their business needs.”

You can read the full Experis report here (email required).

Google Cloud gets up to speed with AWS and Azure with launch of HSM crypto tool

Google Cloud has announced the launch of a managed cloud-hosted hardware security module (HSM) service – joining Amazon Web Services and Microsoft Azure in this security benchmark.

The Cloud HSM will enable customers to host encryption keys and perform cryptographic operations in FIPS 140-2 Level 3 certified HSMs, according to a company blog post.

To put this in perspective, the highest level of the FIPS 140-2 standard is Level 4, which aims to “provide a complete envelope of protection around the cryptographic module with the intent of detecting and responding to all unauthorised attempts at physical access.” Level 3, by contrast, requires “a high probability of detecting and responding to attempts at physical access, use or modification of the cryptographic module.”

Cloud HSM is tightly integrated with Google’s Cloud Key Management Service (KMS), enabling data in services such as BigQuery, Google Compute Engine, Google Cloud Storage and Cloud Dataproc to be protected with a hardware-protected key.
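As a rough illustration of how that integration surfaces to developers, the sketch below creates a symmetric KMS key whose versions are backed by HSMs, using the google-cloud-kms Python client. The project, location and key ring names are placeholders, and the exact client surface shown is an assumption based on the current library rather than anything specified in Google’s announcement.

```python
from google.cloud import kms

client = kms.KeyManagementServiceClient()

# Placeholder identifiers - substitute real project/location/key ring values.
parent = client.key_ring_path("my-project", "europe-west2", "my-key-ring")

# Requesting ProtectionLevel.HSM is what distinguishes a Cloud HSM-backed key
# from a standard software-protected Cloud KMS key.
key = {
    "purpose": kms.CryptoKey.CryptoKeyPurpose.ENCRYPT_DECRYPT,
    "version_template": {
        "algorithm": kms.CryptoKeyVersion.CryptoKeyVersionAlgorithm.GOOGLE_SYMMETRIC_ENCRYPTION,
        "protection_level": kms.ProtectionLevel.HSM,
    },
}

created = client.create_crypto_key(
    request={"parent": parent, "crypto_key_id": "hsm-backed-key", "crypto_key": key}
)
print(created.name)
```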

The move came about, according to product manager Il-Sung Lee, because customers wanted more options to protect sensitive information and meet compliance mandates. This is despite Google claiming to be the only cloud provider that encrypts all customer data at rest.

“For those of you managing compliance requirements, Cloud HSM can help you meet regulatory mandates that require keys and crypto operations be performed within a hardware environment,” wrote Lee. “In addition to using FIPS 140-2 certified devices, Cloud HSM will allow you to verifiably attest that your cryptographic keys were created within the hardware boundary.”

Some may consider that this has been a long time coming for Google; Microsoft announced Azure Key Vault, a cloud-hosted HSM-backed service for managing cryptographic keys, as far back as the start of 2015. AWS’ CloudHSM tool is also widely documented.

Yet Google’s cloud operations have certainly been innovative elsewhere of late. Earlier this month the company announced the launch of pre-packaged AI services, around contact centres and talent acquisition, as well as supporting NVIDIA’s Tesla P4 GPUs, for graphics-intensive and machine learning applications.

Find out more about Google Cloud HSM beta here.

Quarterly SaaS spending hits $20 billion – with Microsoft continuing to hold a solid lead

Microsoft continues to hold a solid lead in the enterprise SaaS market ahead of Salesforce – but even though the market can be considered mature there is plenty more potential to come.

That is the key finding from the latest note by Synergy Research. The analysis, which comes from Q2 data, notes that the enterprise SaaS market is now generating $20 billion in quarterly revenues for vendors. Microsoft holds 17% of the overall market, with Salesforce the nearest contender and the only other company to hold more than 10% share.

Microsoft is also the fastest growing company with 45% annual growth. Oracle, with just over 5% overall share, saw 43% growth, with SAP (36%), Adobe (32%) and Salesforce (25%) all solid.

While everyone is seeing solid growth, this also reflects the fact that the market is somewhat fragmented, with different segments offering different opportunities. Per the company’s figures from this time last year, Microsoft, Cisco and Google dominate collaboration, with Salesforce, Microsoft and Zendesk leading in CRM, and Oracle, SAP and Infor in ERP.

“There is a fascinating battle for SaaS playing out, with traditional enterprise software vendors slugging it out with born in the cloud vendors like Workday, Zendesk, ServiceNow and Dropbox,” said John Dinsdale, a chief analyst at Synergy. “The latter group are helping to rapidly transform the market, but the more traditional players like Microsoft, SAP, Oracle and IBM still have a huge base of on-premise software customers that they can convert to a SaaS-based consumption model.

“Meanwhile Cisco and Google too are making ever-bigger inroads into the SaaS market, via Cisco’s collaboration apps and software vendor acquisitions and Google’s G Suite,” added Dinsdale.

Looking at the most recent figures, IBM last month announced a third consecutive quarter of revenue growth, with cloud revenue up 20% and representing almost a quarter of the company’s total revenue. From SAP’s perspective, the company raised its 2020 outlook, having previously praised ‘stellar cloud bookings’ earlier this year.

DigiPlex aims to reuse waste data centre heat in Oslo apartments with new partnership

As technology continues to advance, so too does the responsibility of infrastructure providers to ensure an environmentally friendly future. Nordic data centre firm DigiPlex has announced a scheme whereby waste heat from its facilities will be reused to warm residential apartments across Oslo.

The company signed a letter of intent with Fortum Oslo Varme, Norway’s largest district heating supplier, to serve the heating needs of approximately 5,000 apartments across the Norwegian capital.

DigiPlex insists, in its own words, that ‘a progressive data centre industry must do what it can to reduce its environmental footprint.’ “We are proud to reinforce our leading role in our industry regarding climate change, using renewable power and the waste heat from our data centre at Ulven to keep the citizens of Oslo warm,” said Gisle M. Eckoff, DigiPlex CEO.

“Digitisation must move towards a greener world, and our cooperation with Fortum Oslo Varme is an important step in that direction.”

This is by no means the only initiative being undertaken for a greener industry. Eyebrows may have been raised in June when Microsoft unveiled Project Natick, whereby a data centre was placed underwater, off the Orkney Islands, to provide naturally cooler temperatures. Yet while the project is still at the experimental stage, some struggle to see its value.

Writing for this publication, Joseph Denne, founder and CEO of DADI, argued: “It’s hard to believe that this is a realistic option for the future. Being surrounded by seawater might keep the temperature of the hardware under control without requiring the specialist cooling systems used in conventional server farms, but it also makes servicing a faulty node pretty much impossible, and a lot of energy has to go into making the thing in the first place.

“Surely it makes much more sense to maximise the potential of the devices we already have at our disposal, which would otherwise be idle for around three-quarters of their lifetime.”

Of course, much of the innovation is happening in the Nordic region, where naturally cooler temperatures can be taken advantage of without greater energy output. Alongside the Norwegian partnership, DigiPlex also has initiatives in place in Sweden, with district heating provider Stockholm Exergi, and in Denmark. Heatwaves aside, the temperate UK can also benefit from this approach, with Rackspace’s UK data centres among those with such features built in.

Regardless, the past year has seen environmental efforts being stepped up across the industry. Back in April, Google announced it had hit its 100% renewable energy targets, claiming to be the first public cloud provider to do so.

Read more: A data centre with no centre: Why the cloud of the future will live in our homes

The future of enterprise software: Big data and AI rules okay – and the ‘decentralisation of SaaS’

Machine learning, cloud-native and containers are going to be key growth drivers of the future enterprise software stack – but it could be the end of the road for software as a service (SaaS).

That’s the verdict from an extensive new report by venture capital fund Work-Bench. The full 121-slide analysis, hosted on Scribd and titled ‘The Enterprise Almanac: 2018 Edition’, aims to dissect a ‘once in a decade tectonic shift of infrastructure’, focusing on the new wave of services that will power the cloud from the end of this decade onwards.

“Our primary aim is to help founders see the forest from the trees,” wrote Michael Yamnitsky, report author and VC at Work-Bench. “For Fortune 1000 executives and other players in the ecosystem, it will help cut through the noise and marketing hype to see what really matters. It’s wishful thinking, but we also hope new talent gets excited about enterprise.”

If this analysis is anything to go by, there will be plenty to get excited about in the coming years.

Machine learning

Large technology companies are winning at AI, Work-Bench asserts. And why not? This publication has devoted plenty of column inches in recent months to how the hyperscalers are using artificial intelligence and machine learning as a differentiator – indeed, Google Cloud this week launched pre-packaged AI services to try and stay one step ahead of the competition.

It’s not so much of a differentiator if everyone’s getting in on the act, though. And this is where others are struggling. “Despite hopeful promise, startups racing to democratise AI are finding themselves stuck between open source and a cloud place,” the report notes.

It’s a data-driven world, of course – but the disconnect between the ever-increasing amounts of data being crunched and the data scientists available to crunch it is clear. And this is where the Googles, Facebooks, Microsofts and Amazons of this world are again at an advantage – by hoovering up most of the AI talent.

Those making strides outside the behemoths, however, are startups focusing on automated machine learning (AutoML). The key, instead of beating Amazon and Google at their own games with SageMaker, TensorFlow et al, is to focus their products and messaging on BI analysts. Companies such as Tableau have data visualisation nailed – but what about getting reports in natural language, or deriving even deeper insights? Indeed, Tableau acquired Empirical Systems, an MIT-originated AI startup, in June for this very reason.

“Expect all modern BI vendors to release an AutoML product or buy a startup by [the] end of next year,” Work-Bench concludes.
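For readers unfamiliar with the category, the snippet below is a minimal sketch of what AutoML tooling does: automating model selection and tuning so that a non-specialist can get from a dataset to a usable model. It uses the open source auto-sklearn library purely as an illustration; it is not one of the startups or products Work-Bench has in mind.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
import autosklearn.classification

# A stock dataset stands in for whatever a BI analyst might export.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# AutoML: the library searches over models and hyperparameters itself,
# within a fixed time budget, instead of the analyst hand-tuning anything.
automl = autosklearn.classification.AutoSklearnClassifier(
    time_left_for_this_task=120,  # seconds for the whole search
    per_run_time_limit=30,        # seconds per candidate model
)
automl.fit(X_train, y_train)

print(accuracy_score(y_test, automl.predict(X_test)))
```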

Cloud-native

Writing for this publication earlier this week, Jimmy Chang, director of products at Workspot, discussed the frustrations of terms such as ‘cloud-native’ and ‘cloud-enabled’ being used interchangeably. Being in the virtual desktop business, Chang used an example from his own industry: only two of the VDI players in the market have genuinely cloud-native products.

It’s important, therefore, to determine what’s what and avoid the risk of cloud washing. For Work-Bench, this prompts an exploration of cloud infrastructure and software from Amazon Web Services, Microsoft Azure and Google Cloud Platform – a subject this publication revisits at the end of each quarter, as regular CloudTech readers will testify.

The Work-Bench analysis certainly makes sense from here. AWS is entrenched as #1, Microsoft at #2 for now, and Google at #3, in spite of the latter two’s continued momentum. ‘Killer products… but where’s the enterprise love?’, the report asks of Google.

The majority of organisations continue to struggle with containerising applications, the report notes, and tend to pursue one of three key cloud strategies. The first is ‘monocloud’ – think Ryanair or GoDaddy – where companies go all-in on a provider of choice. The second is a price broker model, with workloads run wherever they are cheapest – Kubernetes is seen as a key tool here for those who have got to grips with it. The third is a function broker model, with different clouds for different workloads: remember the brouhaha when it was revealed that long-time AWS house Netflix was running disaster recovery workloads on Google – an arrangement the company stressed had been going on for a while? That model is on its way – and makes good business sense when applicable.

The report also bows to Kubernetes as the king of container orchestration; despite those struggles it has a clear market lead, with half of enterprises using containers in some capacity, according to 451 Research. But Work-Bench asserts the puck is heading towards the service mesh – a configurable infrastructure layer for microservices applications offering load balancing, encryption, authentication and more – with security the killer use case going forward. “Service meshes are like broccoli… you know you need them but only adopt when you feel the pain of not having them,” the report says.

The decentralisation of SaaS

This is arguably the most interesting punt in the report: just as software as a service (SaaS) ate infrastructure, infrastructure will now come back and eat SaaS.

According to IDC’s most recent figures, global software as a service spending stood at $74.8bn, almost three times the size of infrastructure as a service ($24.9bn). By 2022, IDC predicts SaaS spending will reach $163bn, remaining ahead of IaaS and PaaS combined.

But the biggest players could get too big for their boots, as the report explains. “SaaS vendors are becoming mighty and taking advantage of it – using aggressive tactics to expand dollar share within existing accounts, often by shoving excessive features and extensive contract terms down customers’ throats,” the report notes. “Customers have no choice but to succumb to these closed-ecosystem tactics.”

The reasoning goes back several years: SaaS made good economic sense when running infrastructure was expensive and configuration was difficult, but with cloud computing the pendulum has swung.

The report adds that there is one solution: containers. If enterprises are struggling with them today then they will need to act fast, as in Work-Bench’s view the closed SaaS model does not sit well with the customisation containers make possible. “In a world where services written in different languages can easily communicate, proprietary languages that require hiring ‘experts’ will be obsolete.”

The empire strikes back

The report opens with the return of the big traditional enterprise software players – a theme that can also be read as an overarching sentiment of the industry today.

Tellingly, two of the largest software acquisitions over the past six years were closed in the last six months. This is not so much in terms of the amount of money spent – although $7.5bn and $6.5bn were shelled out for GitHub and MuleSoft by Microsoft and Salesforce respectively – but in terms of enterprise value divided by trailing 12-month revenue.

As venture capitalist Tomasz Tunguz points out, comparing the Microsoft/GitHub deal (24.5x EV/TTM revenue) and Salesforce/MuleSoft (21.2x) with, for instance, Microsoft’s acquisition of LinkedIn (6.8x) and Cisco’s buy of BroadSoft (5.9x) shows much richer multiples for this year’s buys.
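As a quick back-of-the-envelope check using only the figures quoted above, the implied trailing revenues fall out of a one-line division. The numbers below are approximations derived from the deal values and multiples in this article (treating the deal price as enterprise value), not reported financials.

```python
# Implied trailing 12-month revenue = deal value / (EV / TTM revenue) multiple.
deals = {
    "Microsoft/GitHub":    (7.5e9, 24.5),
    "Salesforce/MuleSoft": (6.5e9, 21.2),
    "Microsoft/LinkedIn":  (None, 6.8),   # deal value not quoted in this piece
    "Cisco/BroadSoft":     (None, 5.9),
}

for name, (enterprise_value, multiple) in deals.items():
    if enterprise_value is None:
        continue  # skip deals whose price isn't given above
    implied_ttm_revenue = enterprise_value / multiple
    print(f"{name}: ~${implied_ttm_revenue / 1e6:.0f}m implied TTM revenue")
    # Both 2018 deals imply targets with roughly $300m in trailing revenue,
    # which is what makes the multiples stand out against LinkedIn and BroadSoft.
```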

“I expect substantially more acquisitions of the scale and at these multiples through 2018,” Tunguz wrote back in June when disclosing these figures. “The growing sizes of the software market. The desire for continuing growth. The pace of innovation within software. The increasing competition amongst incumbents. A vibrant public market that is continuing to price companies aggressively.

“It’s a great time to sell a fast growing billion-dollar company.”

You can look at the full slides here.

Main pictures credit: Work-Bench