Gartner: Public cloud market to reach $246bn in 2017 with IaaS and SaaS at forefront

First IDC made its forecast on the public cloud, and now Gartner has done likewise, predicting that the public cloud services market will grow 18% in 2017 to total $246.8 billion (£198.1bn).

IDC, by comparison, said global spending on public cloud services and infrastructure would reach $122.5bn by the end of this year, with seven of the eight primary geographic regions forecast to record CAGRs of more than 20% over the next five years.

Outside of ‘cloud advertising’ – “cloud-based services that support the selection, transaction and delivery of advertising and ad-related data”, according to the analysts – the largest market in 2017 will be software as a service (SaaS), in line with IDC’s predictions. SaaS will overtake business process as a service (BPaaS) this year, while infrastructure as a service (IaaS) will grow to $34bn.

By 2020, SaaS will be at $75.7bn, IaaS at $71.5bn, and PaaS at $56.1bn, comprising a total market of $383.3bn, Gartner adds. Growth in the infrastructure compute service space will be enhanced by artificial intelligence (AI), analytics and the Internet of Things (IoT), while the growth of PaaS will also drive the growth of IaaS.

Again, as IDC noted, North America is the primary market; more than half of application adoption there will be SaaS or otherwise cloud-related. Gartner also looked at China at a national level, raising its forecast to account for anticipated higher buyer demand; while the Chinese market is nascent and several years behind the US and European markets, the firm expects it to maintain high levels of growth as digital transformation becomes more mainstream over the next five years.

“Organisations are pursuing strategies because of the multidimensional value of cloud services, including values such as agility, scalability, cost benefits, innovation and business growth,” said Sid Nag, Gartner research director. “While all external-sourcing decisions will not result in a virtually automatic move to the cloud, buyers are looking to the ‘cloud first’ in their decisions, in support of time to value impact via speed of implementation.”



The spotlight falls on data centre resiliency as data keeps growing

(c)iStock.com/Koldunov

It’s hard to imagine the sheer scale of data created on a daily basis. In just one second, 747 Instagram photos are uploaded, 7,380 tweets are sent and 38,864GB of traffic is processed across the internet; all told, around 2.5 quintillion bytes of data are created every single day. With IoT connections booming, these numbers are expected to keep growing exponentially. Almost all of this data travels through a data centre at some point.
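
As a rough back-of-envelope check on how those per-second figures scale, the snippet below extrapolates them to a full day, assuming decimal gigabytes (10^9 bytes) and that the rates hold around the clock; the internet traffic figure alone works out to a few quintillion bytes per day.

```python
# Back-of-envelope scaling of the per-second figures quoted above to a full day.
# Assumes decimal gigabytes (1 GB = 10**9 bytes) and constant rates; the per-second
# figures are the article's, the daily totals are simple extrapolations.
SECONDS_PER_DAY = 24 * 60 * 60                      # 86,400

instagram_photos_per_sec = 747
tweets_per_sec = 7_380
internet_traffic_gb_per_sec = 38_864

photos_per_day = instagram_photos_per_sec * SECONDS_PER_DAY
tweets_per_day = tweets_per_sec * SECONDS_PER_DAY
traffic_bytes_per_day = internet_traffic_gb_per_sec * 10**9 * SECONDS_PER_DAY

print(f"Instagram photos/day: {photos_per_day:,}")                  # ~64.5 million
print(f"Tweets/day:           {tweets_per_day:,}")                  # ~638 million
print(f"Internet traffic/day: {traffic_bytes_per_day:.2e} bytes")   # ~3.4e18 bytes
```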

Businesses are handing the hosting and management of their infrastructure and systems software to third-party data centre operators, a move that has enabled companies of all sizes to become more agile and cost-conscious.

This phenomenon is projected to grow, with 86% of workloads expected to be processed by cloud data centres by 2019, and only 14% by traditional data centres. Perhaps even more striking is that the same forecast indicates 83% of data centre traffic will be cloud traffic in the next three years.

This explosion in data, cloud applications, services and infrastructure has brought about a change in data centre usage, which in turn has demanded a change in physical facilities.

It is essential that four features are woven into the design and functionality of every data centre: scalability, availability, resiliency and security. Outsourced data centre operators must be able to handle a surge in demand; without adequate capacity and environmental monitoring, servers can quickly become overworked and cause outages.

In addition, data centres have to demonstrate resiliency in order to reassure their customers. Corporate enterprises, particularly those who have migrated to hybrid environments, live in fear of an outage and the resulting impact on costs and reputation. And with good reason.

Downtime damage

In autumn 2015, a data centre owned by Fujitsu suffered a power outage that took down a number of cloud services. This was not a short-lived problem: the effects persisted for some time and affected customers on the Fujitsu public cloud and its private hosted cloud, as well as other infrastructure services.

As the Fujitsu incident shows, data centre and service availability can be disrupted in many ways. Power supply failure is one of the biggest causes, as are cyber-attacks, but data centres can also be affected by overheating if efficient cooling is not in place, or even by extreme weather incidents. Examples range from the mundane to the unbelievably absurd.

Despite the risks, few of these scenarios actually have to result in downtime if there is a good understanding of the data centre environment, a suitable level of real-time operational intelligence, and procedures in place to identify issues before they lead to disaster or failure.

Sophisticated solutions are available to provide real-time insight, control and predictability that help data centre managers to deal with environmental and operational challenges. Environmental conditions can be monitored constantly for any potential issues, and assets tracked and managed to maintain their performance and guard against technical breakdown.
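
As a minimal sketch of the kind of environmental monitoring described above, the snippet below checks simulated sensor readings against configured thresholds and raises alerts before conditions drift towards an outage; the metric names, bands and sensor feed are illustrative assumptions rather than any particular vendor's product.

```python
# Minimal sketch of threshold-based environmental monitoring for a data centre hall.
# The metric names, bands and simulated readings are illustrative assumptions, not
# taken from any specific DCIM (data centre infrastructure management) product.
import random
import time

THRESHOLDS = {
    "inlet_temp_c": (18.0, 27.0),   # typical recommended server inlet temperature band
    "humidity_pct": (20.0, 80.0),   # relative humidity band
    "ups_load_pct": (0.0, 80.0),    # keep headroom on the UPS before it saturates
}

def read_sensors():
    """Stand-in for a real sensor feed; returns one reading per monitored metric."""
    return {
        "inlet_temp_c": random.uniform(17.0, 30.0),
        "humidity_pct": random.uniform(15.0, 85.0),
        "ups_load_pct": random.uniform(40.0, 95.0),
    }

def check(readings):
    """Return alerts for any metric outside its configured band."""
    alerts = []
    for metric, value in readings.items():
        low, high = THRESHOLDS[metric]
        if not low <= value <= high:
            alerts.append(f"{metric}={value:.1f} outside {low}-{high}")
    return alerts

if __name__ == "__main__":
    for _ in range(3):                       # a real monitor would loop continuously
        for alert in check(read_sensors()):
            print("ALERT:", alert)           # in practice: page on-call, open a ticket
        time.sleep(1)
```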

As data continues to grow and cloud traffic increases, utilising intuitive insight and fit-for-purpose tools such as those described above will help data centres and their operators to maintain resilience, ensure uptime and support their customers as they move away from internally managed IT estates.


Budget Focus on Digital Will Transform the IT Industry | @CloudExpo #Cloud #DigitalTransformation

Known on the global stage for its indisputable domination in IT, India is at the cusp of a digital revolution, finally on home ground. With its strong push for a digital economy, the Union Budget for 2017, presented by the Finance Minister recently, has only validated this claim. As a member of the IT community, it is a truly exciting moment to witness the sweeping transformative power of technology, as it transcends barriers of literacy, class and industry to create an environment of inclusivity – social, economic and financial.

read more

How #Serverless Is Taking Over #DevOps | @DevOpsSummit #AWS #Lambda

With big corporations moving to the cloud for their computation needs, a new resource-usage model has recently emerged: serverless computing. Contrary to what the term “serverless” suggests, it does not mean that servers are not involved. Rather, it means that the developer is no longer required to worry about provisioning and maintaining them.
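
To make the “no servers to worry about” point concrete, below is a minimal sketch of a serverless function in the AWS Lambda style, written in Python; the event fields and response shape are illustrative assumptions. The developer ships only this handler, while the platform provisions, scales and patches the machines that run it.

```python
# Minimal sketch of a serverless function in the AWS Lambda style (Python runtime).
# The event fields and response shape are illustrative assumptions; the point is that
# there is no server-management code here at all - the platform provisions, scales
# and patches the machines that execute the handler on demand.
import json

def lambda_handler(event, context):
    """Entry point the platform invokes for each incoming event or request."""
    name = (event or {}).get("name", "world")   # hypothetical input field
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}"}),
    }

# Local smoke test; in production the platform supplies event and context.
if __name__ == "__main__":
    print(lambda_handler({"name": "serverless"}, None))
```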

read more

[video] @Ericsson #IoT with @EsmeSwartz | @ThingsExpo @EricssonIT #M2M

“I think that everyone recognizes that for IoT to really realize its full potential and value that it is about creating ecosystems and marketplaces and that no single vendor is able to support what is required,” explained Esmeralda Swartz, VP, Marketing Enterprise and Cloud at Ericsson, in this SYS-CON.tv interview at @ThingsExpo, held June 7-9, 2016, at the Javits Center in New York City, NY.

read more

[video] @Ericsson’s #IoT Ecosystems | @ThingsExpo @EricssonIT @EsmeSwartz

The buzz continues for cloud, data analytics and the Internet of Things (IoT) and their collective impact across all industries. But a new conversation is emerging: how do companies use industry disruption and technology enablers to lead in markets undergoing change, uncertainty and ambiguity? Organizations of all sizes need to evolve and transform, often under massive pressure, as industry lines blur and merge and traditional business models are assaulted and turned upside down. In this new data-driven world, marketplaces reign supreme while interoperability, APIs and applications deliver unique customer value for new go-to-market models based on a cloud and IoT way of working. Analytics, IoT service mashups, fail-fast business models and contextual data streams enable data to become the new currency for digital citizens and businesses, and will determine business success or failure.

read more

Side of #Serverless BS with Your Hardware FUD | @CloudExpo #SDN #AI #SDDC

A few years ago, popular industry buzzword themes included serverless and hardware-less. It turns out serverless BS (SLBS) and hardware-less are still trendy, and while some might view the cloud, software-defined data center (SDDC), virtualization or IoT folks as the culprits, it is more widespread, with plenty of bandwagon riders. SLBS can span from IoT to mobile, VDI and workspace clients (zero or similar), workstations, servers, storage and networks. To me, what’s ironic is that many purveyors of SLBS also like to talk about hardware.

read more

[video] @IBMCloud and @NVIDIA #AI Keynote | @CloudExpo #ML #DL #IoT

Bert Loomis was a visionary. This general session will highlight how Bert Loomis and people like him inspire us to build great things with small inventions. In their general session at 19th Cloud Expo, Harold Hannon, Architect at IBM Bluemix, and Michael O’Neill, Strategic Business Development at Nvidia, discussed the accelerating pace of AI development and how IBM Cloud and NVIDIA are partnering to bring AI capabilities to “every day,” on-demand. They also reviewed two “free infrastructure” programs available to startups and innovators.

read more