Cloud services and infrastructure spending breaks $150bn in six months, says Synergy

While spending across cloud infrastructure may be suffering something of a minor blip, cloud services spending appears to be shoring things up.

The latest analysis from industry analyst Synergy Research shows that, for the first half of 2019, operator and vendor revenues passed $150 billion, up 24% year on year.

Infrastructure as a service (IaaS) and platform as a service (PaaS), led naturally by the hyperscalers of Amazon Web Services, Microsoft Azure and Google Cloud, formed the fastest growing segment at more than 40%, while hosted private cloud, led by IBM, Rackspace and NTT, grew at over 20% year on year. When it came to cloud-based software, such as enterprise SaaS and unified comms as a service (UCaaS), annual growth was in the 25% range.

In aggregate, Synergy noted, spending on cloud services was now ‘far greater’ than spending on the supporting data centre infrastructure. By contrast, areas such as public and private cloud infrastructure hardware and software, from the likes of Dell EMC and HPE, grew at just over 10% year on year.

“Cloud is increasingly dominating the IT landscape,” said John Dinsdale, a chief analyst at Synergy. “Cloud has opened up a range of opportunities for new market entrants and for disruptive technologies and business models. Amazon and Microsoft have led the charge in terms of driving changes and aggressively growing cloud revenue streams, but many other tech companies are also benefiting.

“The flip side is that some traditional IT players are having a hard time balancing protection of legacy businesses with the need to fully embrace cloud,” Dinsdale added.

Synergy issued a note last month which found hyperscaler capex was down 2% year on year. While the most recent quarter saw more than $28 billion in spending, a primary cause of the decline was China’s expenditure falling by 37% year on year.


IBM’s Quantum Cloud offers access to the ‘single largest quantum computer system’


Bobby Hellard

19 Sep, 2019

IBM has announced the opening of a Quantum Computer Centre in New York that will provide quantum computing over its cloud network.

The centre will be home to the tech giant’s 14th quantum computer, a 53-quantum bit, or qubit, model that will form the data-processing element of the service.

IBM said this will be the single largest quantum computer system available for external access. For context, Google has a 72-qubit computer, but, so far, hasn’t let outsiders run programs on it.

Despite the technology still being largely experimental, IBM has already worked on a number of potential case studies with major clients. According to Dario Gil, director of IBM Research, the firm’s strategy is to move quantum computing beyond isolated lab experiments and into the hands of tens of thousands of users.

“In order to empower an emerging quantum community of educators, researchers, and software developers that share a passion for revolutionising computing, we have built multiple generations of quantum processor platforms that we integrate into high-availability quantum systems,” he said.

“We iterate and improve the performance of our systems multiple times per year and this new 53-qubit system now incorporates the next family of processors on our roadmap.”

To start, ten quantum computer systems have been put online through IBM’s Quantum Computer Centre. Its fleet is now composed of five 20-qubit systems, one 14-qubit system and four 5-qubit systems. Five of these systems now have a Quantum Volume of 16 – a measure of the power of a quantum computer – demonstrating a new sustained performance milestone.

In the next month, this portfolio of quantum computers will grow to 14 systems including the new 53-qubit quantum computer.
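For developers, access to these systems is typically through IBM’s open-source Qiskit SDK and an IBM Q Experience account. The snippet below is a minimal sketch, assuming Qiskit is installed and an API token has already been saved locally; the backend name is illustrative and availability varies by account:

```python
# Minimal sketch: listing IBM Q cloud backends and running a small job via Qiskit.
# Assumes `pip install qiskit` and that an IBM Q Experience API token has already
# been stored with IBMQ.save_account(); the backend name below is illustrative.
from qiskit import QuantumCircuit, execute, IBMQ

provider = IBMQ.load_account()                  # authenticate against the IBM Q cloud
for backend in provider.backends():             # list reachable quantum systems and simulators
    print(backend.name(), backend.configuration().n_qubits, "qubits")

# Build a two-qubit Bell-state circuit and run it on a cloud-hosted backend.
qc = QuantumCircuit(2, 2)
qc.h(0)
qc.cx(0, 1)
qc.measure([0, 1], [0, 1])

backend = provider.get_backend("ibmq_qasm_simulator")
job = execute(qc, backend, shots=1024)
print(job.result().get_counts())                # e.g. roughly equal counts of '00' and '11'
```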

Earlier this month, IBM announced a partnership with applied research organisation Fraunhofer-Gesellschaft to study quantum computing in Germany. The tech giant hopes to establish a quantum computing hub in the country as the technology accelerates.

What’s more, IBM is already working on potential use cases with partners, such as bank J.P. Morgan Chase, which has proposed a quadratic speedup algorithm that could allow financial analysts to perform option pricing and risk analysis in near real-time.
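To see what a ‘quadratic speedup’ would mean in practice, it helps to look at the classical baseline it targets: Monte Carlo option pricing, where the estimation error shrinks roughly as 1/√N with N simulated paths, versus roughly 1/N for quantum amplitude estimation. The sketch below is only that classical baseline, using illustrative Black-Scholes parameters; it is not J.P. Morgan’s or IBM’s actual algorithm:

```python
# Classical Monte Carlo pricing of a European call option: the baseline whose
# 1/sqrt(N) error scaling quantum amplitude estimation aims to beat (~1/N).
# All parameters are illustrative only.
import numpy as np

S0, K, r, sigma, T = 100.0, 105.0, 0.02, 0.2, 1.0   # spot, strike, rate, volatility, maturity

def mc_call_price(n_paths, seed=0):
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n_paths)
    # Terminal asset price under geometric Brownian motion
    ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)
    payoff = np.maximum(ST - K, 0.0)
    return np.exp(-r * T) * payoff.mean()

for n in (1_000, 100_000, 10_000_000):
    print(f"{n:>10,} paths -> price estimate {mc_call_price(n):.4f}")

# Each 100x increase in paths only shrinks the Monte Carlo error ~10x; a quadratic
# speedup would deliver the same accuracy gain from roughly 10x more work instead.
```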

The tech giant is also working with Mitsubishi Chemical to develop a quantum computing process to understand the reaction between lithium and oxygen in lithium-air batteries, with the hope that it could lead to more efficient batteries for mobile devices and cars.

No Mickey Mouse Microsoft migration: Walt Disney Studios utilising Azure for content workflows

Walt Disney Studios is looking to the cloud for new ways to create, produce and distribute its content – and the media behemoth has chosen Microsoft to help.

The companies have signed a five-year ‘innovation partnership’ which will see Disney utilise Microsoft’s Azure cloud platform to ‘help accelerate innovation at The Walt Disney Studios for production and post-production processes’ – or ‘scene to screen’, as the companies put it.

The partnership will be concentrated around Disney’s StudioLAB, a technology hub focused on ‘the art of storytelling with cutting-edge tools and methods’, including virtual reality (VR) and artificial intelligence (AI).

There is a third partner here in the shape of media technology firm Avid, with whom Microsoft already has a cloud alliance focused on putting together cloud-based media workflows around active backup, collaborative editing, and content archiving. The companies are ‘demonstrating that the kinds of demanding, high-performance workflows the media and entertainment industry requires can be deployed and operated with the security offered by the cloud’.

The latter is a particularly important use case for media providers; NASCAR’s move to Amazon Web Services (AWS) back in June enabled it to launch an online archive feature, while Boston-based TV station WGBH utilised object storage provider Cloudian last year to dramatically reduce the time required to access its previously tape and disk drive-oriented archive.

“By moving many of our production and post-production workflows to the cloud, we’re optimistic that we can create content more quickly and efficiently around the world,” said Jamie Voris, Walt Disney Studios CTO. “Through this innovation partnership with Microsoft, we’re able to streamline many of our processes so our talented filmmakers can focus on what they do best.”

Kate Johnson, president of Microsoft US, said cloud usage has ‘reached a tipping point’ for the media industry. “With Azure as the platform cloud for content, we’re excited to work with the team at StudioLAB to continue to drive innovation across Disney’s broad portfolio of studios,” said Johnson.

This is by no means the first cloud initiative from the wider company; back in 2017, just in time for re:Invent, it was announced that The Walt Disney Company was utilising AWS as its preferred public cloud infrastructure provider.


Oracle wants to say goodbye to shared responsibility by ramping up autonomous next-gen cloud approach

The concept of shared responsibility in cloud computing continues to be, if not quite a point of rancour, at least an ongoing concern. In February, for instance, research from Check Point Software found that more than a quarter of respondents believed cloud security was the provider’s responsibility.

All clouds differ to a degree, but a safe bet is that the provider will take charge of security of the cloud – making sure zones remain available, the infrastructure works and so on – while the customer takes care of security in the cloud. This ranges from applications to network and firewall configuration, but most importantly, it also means security of the customer’s data.
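In practical terms, ‘security in the cloud’ means checks like the one sketched below are the customer’s job, not the provider’s. This is a minimal illustration assuming an AWS account and the boto3 SDK; it simply flags S3 buckets that do not have a full public-access block in place:

```python
# Minimal sketch of a customer-side configuration check: flag S3 buckets that do
# not have all public-access blocks enabled. Assumes AWS credentials are already
# configured locally and that `pip install boto3` has been run.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    try:
        config = s3.get_public_access_block(Bucket=name)["PublicAccessBlockConfiguration"]
        if not all(config.values()):
            print(f"{name}: public access not fully blocked -> {config}")
    except ClientError:
        # A bucket with no public-access-block configuration at all is also worth flagging.
        print(f"{name}: no public access block configured")
```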

Oracle is looking to make this a thing of the past with its autonomous database, as the company outlined at OpenWorld in San Francisco this week. While the autonomous database has long been discussed, the implications for cloud security were interesting, as CTO Larry Ellison explained.

“Amazon takes what I think is a very reasonable position,” Ellison told delegates. “You misconfigured the system, that’s your mistake, we at Amazon can’t be held responsible. If you spend the night drinking and then get into your Ford F-150 and crash it, that’s not Ford’s problem.”

For regular viewers of Ellison’s keynotes, it won’t surprise that this was about the only positive word said about the Seattle giant all hour. And as Ellison continued the Ford analogy, an autonomous Tesla – where the Oracle CTO sits on the board as of the start of this year – would drive you home safely.

“Amazon’s support policy is very clear,” said Ellison. “As a customer, you maintain full control of your content and responsibility for configuring access to AWS services. That’s on you. In the AWS cloud, if you make an error, and it leads to catastrophic data loss, that’s on you. In the Oracle cloud, the database automatically provisions itself, it automatically encrypts itself, backs itself up, all the security systems are automatic.

“The generation two cloud, the autonomous database is responsible for preventing user errors; the system is responsible for preventing data loss, not you,” Ellison added. “Us – or more precisely, our automated systems.”

More comparisons with Amazon arrived when Ellison touted the ‘convergence’ features of Oracle’s autonomous database. Whereas the smartphone became an all-in-one device covering cameras and calendars among dozens of other functions, can’t there be one database to rule them all? Oracle thinks so: its offering can be a relational database, run in-memory, support JSON, and even support machine learning and blockchain, Ellison claimed.

It’s autonomous almost-everything at Oracle towers these days, with the launch of Autonomous Linux high on the priority list – if a long time in coming. Oracle’s version of Linux, which Ellison said the company has been working on for two decades, is now being claimed as the first autonomous operating system in the world. Ellison compared this with IBM and Red Hat’s offering, noting that in 15 years not one single Red Hat incompatibility bug has been filed with Oracle’s Linux offering. The press materials described this as ‘a major milestone in the company’s autonomous strategy’.

Some aspects of the keynote went against the grain. Ellison gave an update on Oracle’s cloudy partnership with Microsoft, saying the latter had ‘a lot of good technology’. The collaboration, first announced in June as one of the more surprising stories of the year, sees the two companies connect Azure and Oracle Cloud data centres seamlessly. As this publication noted at the time, two of the three customers mentioned were Albertsons and Gap Inc – retail being an area where Microsoft and Oracle can team up against a common enemy.

Another for the eyebrow-raising category was the launch of Oracle Cloud Free Tier, where organisations of any size, developers, and students can ‘build, learn and explore the full functionality of Oracle Autonomous Database and Oracle cloud infrastructure’, as the company put it. Ellison frequently noted that there was little to learn in Oracle’s autonomous database as, well, it does everything itself, but the ‘always free’ element is intriguing.

In total, users can utilise two Oracle autonomous databases of their choice, with 1 OCPU and 20 GB of storage capacity, two block storage volumes at 100 GB total, 10 GB of object and archive storage, and two virtual machines with 1/8 OCPU and 1 GB of memory each, alongside a smattering of extras.

Oracle also promised to launch 20 new cloud regions by the end of 2020, making a total of 36. The press materials indicated that at least part of this number will be driven by the Microsoft interconnect project, which is also being expanded, with a grand total of 14 countries cited as expansion zones. These are, in alphabetical order, Australia, Brazil, Chile, India, Israel, Japan, Netherlands, Saudi Arabia, Singapore, South Africa, South Korea, the United Arab Emirates, UK and US.


Disney takes to the cloud with five-year Microsoft Azure deal


Bobby Hellard

17 Sep, 2019

Microsoft and The Walt Disney Studios have announced a five-year partnership to pilot new ways to create, produce and distribute content on the Azure cloud platform.

The aim of the partnership is to help accelerate production and post-production processes and bring more Disney content from “scene to screen”.

The current landscape of the film industry sits heavily in Disney’s shadow, with the company relentlessly releasing Marvel movies and live-action remakes of its own back catalogue.

What’s more, the company is launching a streaming service, Disney+, to take on the likes of Netflix, which creates greater demand for it to produce more of its own TV shows.

“By moving many of our production and postproduction workflows to the cloud, we’re optimistic that we can create content more quickly and efficiently around the world,” said Jamie Voris, CTO, The Walt Disney Studios.

“Through this innovation partnership with Microsoft, we’re able to streamline many of our processes so our talented filmmakers can focus on what they do best.”

Disney’s StudioLAB, a technology hub designed to create and advance the future of storytelling with cutting-edge tools and methods, will run on Azure to help speed up content workflows.

However, the partnership will also have a third party supporting it, with global media tech firm Avid working closely with both companies. Avid already has a “strategic alliance” with Microsoft, working on media workflows in Azure, including collaborative editing, content archiving, active backup and production continuity.

“The cloud has reached a tipping point for the media industry, and it’s not surprising that The Walt Disney Studios, which has its heritage based on a passion for innovation and technology, is at the forefront of this transformation,” said Kate Johnson, president of Microsoft US.

“With Azure as the platform cloud for content, we’re excited to work with the team at StudioLAB to continue to drive innovation across Disney’s broad portfolio of studios.”

Salesforce launches Manufacturing Cloud and Consumer Goods Cloud


Bobby Hellard

17 Sep, 2019

Salesforce has launched cloud services targeted at manufacturing and consumer product goods companies as part of its ongoing efforts to take on SAP and Oracle.

The SaaS specialist is aiming to bring ground-level teams, sales and operations, closer together for the benefit of the customer.

In order for manufacturers to provide a seamless customer experience, they need something that helps them better understand customer needs, according to Cindy Bolt, SVP and GM at Salesforce Manufacturing. What’s more, they need to do so while improving visibility across the entire business, from logistics to marketing. 

“In the manufacturing industry, changing customer and market demands can have a devastating effect on the bottom line, so being able to understand what is happening on the ground is imperative for success,” she said.

“Manufacturing Cloud bridges the gap between sales and operations teams while ensuring more predictive and transparent business, so they can build deeper and more trusted relationships with their customers.”

The aim of Manufacturing Cloud is to address challenges around predicting demand and managing warehouse costs. It does this by collating sales agreements and forecasting tools, potentially enabling sales, operations and accounts teams to generate stronger projections.

The Consumer Goods Cloud, meanwhile, is aimed at field reps working with brick-and-mortar businesses. For this, it is equipped with tools intended to streamline store operations by keeping stock, pricing and promotional information aligned with business expectations.

“Retail execution remains one of the most important pieces of a consumer goods brand’s strategy, but so much opportunity is wasted if the field rep doesn’t have the data and technology needed to make smart decisions,” said Salesforce retail and consumer goods GM and SVP John Strain. “Consumer Goods Cloud provides these field reps with the tools they need to be successful on the ground while helping build both business opportunities and stronger relationships with their retail partners.”

Oracle takes the wraps off world’s first autonomous operating system


Maggie Holland

17 Sep, 2019

Oracle has trumped its Autonomous Database concept by unveiling the world’s first fully autonomous operating system, taking heavy aim at rivals Amazon and IBM in the process. 

Dubbed Oracle Autonomous Linux, the new OS is available immediately and follows the same self-managing and self-securing path as the database the tech giant launched back in 2017 to great fanfare.

“Our version of Linux, which we have been working on for almost 20 years, is now autonomous. It is the first and the only autonomous OS in the world. And it’s live,” Ellison told delegates during his opening keynote at Oracle OpenWorld in San Francisco today.

“It’s a highly available system designed for the cloud. It patches itself while it’s running. You discover a vulnerability, we fix it. There’s no downtime, no delay. We fix it while it’s running.

“It drives itself. There’s nothing to do because it drives itself. It does all of this stuff automatically. You can concentrate on building systems that are related to your biz rather than worrying about the underlying plumbing. When you use Oracle autonomous OS in the cloud, the price is just right. It’s free. So, if you’re paying IBM, you can stop!”

The idea behind both the autonomous database and OS is to eliminate the room for human error and, in doing so, make organisations’ large-scale cloud environments far more secure. 

“Autonomous systems eliminate human labour. And, when you eliminate human error, you eliminate pilot error. If you eliminate pilot error on a database, you eliminate data theft. And that – as far as I know – is the only way you can ever eliminate data theft,” Ellison added.

Citing the Capital One data breach, Ellison suggested such an incident wouldn’t have been possible using Oracle’s technology, saying: “With Oracle Autonomous Database, it’s not possible for customers to misconfigure it – there are no pilots to make errors.”

He added: “In AWS’ cloud, if you make an error and it leads to catastrophic data loss, that’s on you… With Oracle Autonomous Database, the system is responsible for preventing data loss, not you.

“Put your data in an autonomous system. There’s no human labour, no human error, no data loss. That’s a big difference between us and AWS.”

Oracle’s latest move will resonate with organisations looking to solve their OpEx challenges, according to Al Gillen, group vice president of software development and open source at analyst firm IDC.

“This capability effectively turns Oracle Linux into a service, freeing customers to focus their IT resources on application and user experience, where they can deliver true competitive differentiation,” he said. 

Pure Storage beefs up cloud support


Adam Shepherd

17 Sep, 2019

Pure Storage has today announced the general availability of new data management tools for Azure and AWS as part of its annual Accelerate conference in Austin, Texas, improving its public cloud support and further strengthening its position in the multi-cloud space.

Starting with AWS, the company has announced that its Cloud Block Store for AWS product, first revealed last year, is now generally available for all customers. The product is a wholly software-based offering, allowing customers to use the company’s Purity management software to manage their AWS storage.

The initial beta version of Cloud Block Store used EC2 compute instances with EBS as a storage layer, but the configuration has since changed. As Pure Storage vice president of strategy Matt Kixmoeller explained, the conclusion was that EBS was not reliable enough for the product’s requirements.

“As we worked closely with Amazon, what we found was that EBS didn’t have the reliability characteristics that a Tier 1 storage array needs,” he said. “In particular, there are challenges around coordinated failures, where multiple volumes can fail at once. And so we completely re-architected the backend layer to run natively on S3. S3 is Amazon’s most durable, most reliable storage tier by far – 11 nines of durability.”

“And so we use EBS as a cache to deliver high performance, but persist data on S3. And if you look at most customers, they really treat S3 as their cloud storage. So this solution becomes a way for us to bring a Tier 1 block experience to use in the Amazon cloud storage S3, that customers are most familiar with, and most trust.”
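To put ‘11 nines’ in perspective, here is a quick back-of-the-envelope calculation; the 10-million-object framing mirrors how AWS itself describes S3 durability, and the figures are purely illustrative:

```python
# Back-of-the-envelope: what 99.999999999% ("11 nines") annual durability implies.
durability = 0.99999999999          # 11 nines
objects_stored = 10_000_000         # an illustrative fleet of 10 million objects

expected_losses_per_year = objects_stored * (1 - durability)
print(f"~{expected_losses_per_year:.4f} objects expected lost per year")     # ~0.0001
print(f"~{1 / expected_losses_per_year:,.0f} years per lost object, on average")  # ~10,000
```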

Part of the goal with the new service is to enable workloads to move seamlessly in both directions; from the cloud to the data centre, as well as from the data centre to the cloud. It uses the same management tools and APIs as Pure’s on-prem management software, as well as featuring the ability to run across two availability zones in active/active configuration.

Cloud Block Store for AWS will be available via the AWS Marketplace on either a month-to-month or a one-year contract. Customers who want something more long term can get contracts ranging from one to three years by purchasing through ‘Pure-as-a-Service’, which is a rebranded version of the company’s Evergreen Storage Service, now effectively acting as a subscription-based consumption program.

The other major cloud announcement was the availability of CloudSnap for Azure, a built-in backup mechanism for FlashArray products which lets the Purity management software seamlessly and transparently move snapshots to the public cloud. CloudSnap was initially launched last year with AWS support, but has now been expanded to Azure as well. This, Kixmoeller said, was an excellent example of Pure’s intentions to extend its tools to a multitude of different cloud providers.

“Our strategy at Pure is to absolutely deliver these services as multi-cloud,” he said. “So Cloud Block Store, we started with Amazon – that’s the natural place to start. But as we see more and more adoption, and that gets more mature, and we will of course proliferate to other clouds.”

“It’s not an easy thing for us to snap our fingers and have it available on all three clouds, because we’re doing the hard work of integrating it deeply. And so this is our first example of bringing something to a second cloud.”

As part of the show, the company also announced a capacity-driven flash-based secondary storage appliance with quad-level cell (QLC) memory, as well as a new plug-in DirectMemory module for FlashArray//X appliances offering an instant performance boost.

Four ways to migrate to the cloud without missing a beat: A guide

It wasn’t too long ago that the enterprise IT industry considered the cloud an outlier. Companies knew the cloud was a readily available option, but they chose to use it sparingly. Now, however, cloud technology is the first and only option many companies consider.

Time has shown that the cloud's advantages aren’t overstated. It offers the flexibility and scalability necessary to keep customer-facing applications updated. It also extends access to the most innovative enterprise technologies on the market — options that aren’t available outside the cloud.

Still, the cloud's biggest draw continues to be its cost savings. Users only pay for the services they use rather than paying for expensive upfront licences and the hardware needed to run on-premise software.

These benefits (and others) make cloud migration more of a when than an if for most companies. That said, interested parties need to understand that cloud adoption is not quick or simple, and that an unsuccessful integration could compromise their entire IT infrastructure. No one should hesitate to move to the cloud — but everyone should approach it rationally and deliberately.

Overlooked obstacles to cloud migration

Cloud providers position their solutions as an alternative to on-premise options, which is true but slightly misleading. Cloud technology can replace on-site servers, but it also offers a fundamentally different enterprise IT approach that allows the technology to grow with the company.

Understanding this distinction is important to any migration because a cloud transition requires more than shifting servers and data to a different location. Instead, migrations should be seen as a process of holistic improvement. A cloud transition's true value comes when it is treated as an opportunity to reassess and improve existing processes; the right measures can liberate resources, cut costs, and boost business agility. In that regard, cloud migrations are about doing things better rather than differently.

A move to the cloud is commonly referred to as a "digital transformation," a term that has more to it than what's on the surface. The cloud is more than just a house for information and processes; it's a new operational mindset that affects every facet of the workplace, whether we realise it or not.

That's why holistic migration should be a priority for everyone — especially the executives who oversee projects, control funds, and establish timelines. If they are not on board, or if they push for a cheaper, faster, or simpler migration than the project warrants, the whole effort is compromised.

The best way to get executive buy-in is by speaking in terms of real-world money. Moving from CAPEX to OPEX can lead to sticker shock when the numbers aren't contextualised. The total cost of the cloud is significant, but it’s still less than the total cost of on-premise IT when licences, utilities, maintenance, and disaster recovery are factored in.
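One way to contextualise those numbers is a simple side-by-side total-cost model. The sketch below uses entirely hypothetical figures just to show the shape of the comparison; real inputs vary widely by workload and vendor quote:

```python
# Hypothetical three-year TCO comparison: on-premise CAPEX-heavy vs cloud OPEX-only.
# All figures are made up for illustration; plug in real quotes before deciding anything.
YEARS = 3

on_prem = {
    "hardware_and_licences_upfront": 300_000,   # CAPEX in year one
    "maintenance_per_year": 45_000,
    "utilities_and_space_per_year": 25_000,
    "disaster_recovery_per_year": 30_000,
}
cloud = {
    "subscription_per_year": 140_000,           # OPEX only
    "migration_one_off": 40_000,
}

on_prem_tco = (on_prem["hardware_and_licences_upfront"]
               + YEARS * (on_prem["maintenance_per_year"]
                          + on_prem["utilities_and_space_per_year"]
                          + on_prem["disaster_recovery_per_year"]))
cloud_tco = cloud["migration_one_off"] + YEARS * cloud["subscription_per_year"]

print(f"On-premise 3-year TCO: ${on_prem_tco:,}")   # $600,000 with these sample numbers
print(f"Cloud 3-year TCO:      ${cloud_tco:,}")     # $460,000 with these sample numbers
```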

Companies should be aware of these obstacles but not deterred by them. Approaching a migration the right way ensures it proceeds smoothly and maximises ROI.

Cloud migration done the right way

Enough companies have migrated to the cloud at this point to reveal what works and what doesn’t. Use these strategies to make the transition correctly and comprehensively:

Guide #1: Take inventory of all assets: Cloud migrations can involve tens or hundreds of different assets. If any of them is overlooked or excluded from the migration, the transition and the IT infrastructure are compromised.

That is why it’s important to create a comprehensive IT inventory. Identify everything needed to make the move, building a migration plan that accommodates those diverse assets.

Don't start with a grip-and-rip approach. Begin with a few workloads, ease into things, and then rinse and repeat.
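Writing the inventory down in a machine-readable form is one way to keep it comprehensive and to sanity-check those early waves. The sketch below is a minimal, hypothetical asset register in Python; the fields, sample entries and wave rule are illustrative rather than a prescribed format:

```python
# A minimal sketch of a machine-readable migration inventory. The fields and sample
# entries are hypothetical; the point is that every asset gets an owner, its
# dependencies, a migration wave, and a target before anything is moved.
from dataclasses import dataclass, field

@dataclass
class Asset:
    name: str
    kind: str                      # e.g. "vm", "database", "file-share", "licence"
    owner: str
    depends_on: list = field(default_factory=list)
    migration_wave: int = 1        # start small: wave 1, then rinse and repeat
    target: str = "undecided"      # e.g. "IaaS VM", "managed database", "SaaS replacement"

inventory = [
    Asset("erp-db-01", "database", "finance", target="managed database"),
    Asset("erp-app-01", "vm", "finance", depends_on=["erp-db-01"], target="IaaS VM"),
    Asset("file-share-hr", "file-share", "HR", migration_wave=2, target="object storage"),
]

# Sanity check: nothing in an early wave should depend on something scheduled later.
wave_of = {a.name: a.migration_wave for a in inventory}
for asset in inventory:
    for dep in asset.depends_on:
        assert wave_of[dep] <= asset.migration_wave, f"{asset.name} depends on later-wave {dep}"

for asset in sorted(inventory, key=lambda a: a.migration_wave):
    print(asset.migration_wave, asset.kind, asset.name, "->", asset.target)
```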

Guide #2: Take a holistic approach: Every part of the migration should be an opportunity for improvement. For example, it may be possible to reduce the number of server roles by leveraging cloud capabilities, particularly for the purpose of business continuity and disaster recovery. Don't just look to transition specific server functions or roles to cloud-equivalent servers — analyse everything.

Guide #3: Take all options into account: There are several migration and cloud options to choose from, so resist the urge to see the shift as a one-size-fits-all effort. Look for every opportunity to embrace new features, adopt new processes, and follow best practices throughout your migration effort.

Do some research to explore just how many different options and approaches are possible, but don't just focus on cutting costs. The cloud also offers intangible benefits such as improved business agility and a faster R&D cadence. Acknowledging these benefits allows companies to prioritise them during the migration.

Guide #4: Take it as a project management opportunity: Cloud migration creates a lot of unknowns, so it’s important to plan, test, and revise every decision beforehand. One way to execute this is to deploy project management principles.

Appoint a manager to spearhead the project and make it his or her responsibility to optimise each detail of the migration. Every resource put into planning pays dividends during and after migrating.

As the future of business continues to digitise, every company will become a technology company. The cloud is becoming ubiquitous because it’s the only option that provides the speed, security, scale, and savings that tomorrow’s companies need from IT.

Cloud migrations can be exciting opportunities, but they shouldn’t lead to hasty decisions or rushed timelines. Companies that execute their cloud migrations carefully and conscientiously will reap the biggest long-term rewards.


Public cloud revenue will reach $500 billion in 2023: The key factors driving it

The pace of cloud computing adoption will accelerate as more organizations explore hybrid IT strategies. CIOs and CTOs will fine-tune the mix of on-premises and managed cloud services for their users' varied applications and workloads.

Worldwide spending on public cloud services and infrastructure will more than double over the 2019-2023 forecast period, according to the latest market study by International Data Corporation (IDC).

With a five-year compound annual growth rate (CAGR) of 22.3 percent, public cloud spending is forecast to grow from $229 billion in 2019 to reach nearly $500 billion in 2023.
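As a rough sanity check, the standard CAGR formula ties those endpoints together; IDC's exact base year isn't quoted here, so the calculation below only shows that a low-20s growth rate is consistent with spending more than doubling from $229 billion to nearly $500 billion:

```python
# Back-of-the-envelope CAGR check on the IDC figures quoted above.
start, end = 229.0, 500.0      # $ billions, 2019 and 2023 endpoints

def cagr(start_value, end_value, periods):
    return (end_value / start_value) ** (1 / periods) - 1

# Four growth periods separate the 2019 and 2023 endpoints.
print(f"{cagr(start, end, 4):.1%}")   # ~21.6%, broadly in line with the 22.3% five-year
                                      # CAGR IDC reports (whose base year may differ)
```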

Public cloud service market development

"Adoption of public (shared) cloud services continues to grow rapidly as enterprises, especially in professional services, telecommunications, and retail, continue to shift from traditional application software to software as a service (SaaS) and from traditional infrastructure to infrastructure as a service (IaaS) to empower customer experience and operational-led digital transformation initiatives," said Eileen Smith, program director at IDC.

SaaS will remain the largest category of cloud computing, capturing more than half of all public cloud spending throughout the forecast period. SaaS spending, which is composed of applications and system infrastructure software (SIS), will be dominated by applications purchases.

The leading SaaS applications will be customer relationship management (CRM) and enterprise resource management (ERM). SIS spending will be led by purchases of security software and system and service management software.

Infrastructure as a service (IaaS) will be the second largest category of public cloud spending. IaaS spending, comprised of servers and storage devices, will also be the fastest growing category of cloud spending with a five-year CAGR of 32 percent.

Platform as a service (PaaS) spending will grow nearly as fast (29.9 percent CAGR) led by purchases of data management software, application platforms, and integration and orchestration middleware.

Three industries – professional services, discrete manufacturing, and banking – will account for more than one-third of all public cloud services spending throughout the forecast period. While SaaS will be the leading category of investment for all industries, IaaS will see its share of spending increase significantly for industries that are building data and compute-intensive services.

For example, IaaS spending will represent more than 40 percent of public cloud services spending by the professional services industry in 2023 compared to less than 30 percent for most other industries. Professional services will also see the fastest growth in public cloud spending with a five-year CAGR of 25.6 percent.

On a geographic basis, the United States will remain the largest public cloud services market, accounting for more than half the worldwide total through 2023. Western Europe will be the second largest market with nearly 20 percent of the worldwide total.

China will experience the fastest growth in public cloud services spending over the five-year forecast period with a 49.1 percent CAGR. Latin America will also deliver strong public cloud spending growth with a 38.3 percent CAGR.

Outlook for cloud service applications growth

Very large businesses will account for more than half of all public cloud spending throughout the forecast period, while medium-sized businesses will deliver around 16 percent of the worldwide total.

Small businesses will trail large businesses by a few percentage points while the spending share from small offices will be in the low single digits.

Moreover, all the company size categories – except for very large businesses – will experience spending growth greater than the overall market.
