The storm brewing: What is fuelling public cloud growth?


Public cloud adoption is set to continue to grow in 2016. However, recently published research from IT monitoring provider ScienceLogic confirmed there is still some confusion, which could impact enterprise adoption.

Fundamentally, there is a lack of understanding around the ability to simplify workload visibility and management for IT teams as cloud usage becomes more mainstream. Almost half of the IT decision makers surveyed (46%) were unsure of how to proactively monitor the workloads in their public cloud environments, highlighting the need for a solution offering improved visibility, monitoring and infrastructure control.
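To illustrate what proactive monitoring of a public cloud workload can look like in practice, the minimal sketch below polls a CPU metric and flags sustained spikes. It assumes an AWS environment monitored through CloudWatch via the boto3 library; the instance ID and alert threshold are illustrative placeholders, not figures from the ScienceLogic research.

```python
# Minimal sketch: proactively polling a public cloud workload metric.
# Assumes AWS CloudWatch via boto3; the instance ID and threshold are
# illustrative placeholders, not taken from the research cited above.
from datetime import datetime, timedelta, timezone

import boto3

INSTANCE_ID = "i-0123456789abcdef0"  # hypothetical instance
CPU_ALERT_THRESHOLD = 80.0           # percent, illustrative

cloudwatch = boto3.client("cloudwatch")

def recent_cpu_average(minutes: int = 15) -> float:
    """Average CPU utilisation for the instance over the last N minutes."""
    now = datetime.now(timezone.utc)
    resp = cloudwatch.get_metric_statistics(
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "InstanceId", "Value": INSTANCE_ID}],
        StartTime=now - timedelta(minutes=minutes),
        EndTime=now,
        Period=300,                  # 5-minute buckets
        Statistics=["Average"],
    )
    points = resp["Datapoints"]
    if not points:
        return 0.0
    return sum(p["Average"] for p in points) / len(points)

if __name__ == "__main__":
    cpu = recent_cpu_average()
    if cpu > CPU_ALERT_THRESHOLD:
        print(f"ALERT: CPU averaging {cpu:.1f}%, investigate workload")
    else:
        print(f"OK: CPU averaging {cpu:.1f}%")
```

A check like this, run on a schedule against each workload, is the difference between discovering a problem proactively and hearing about it from customers.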

Some of this confusion stems from the fact that cloud storage, on which all cloud services are built, has moved on from simply being used for archiving, backup and recovery, and is now being used for business-critical applications and infrastructure. Shifting to a system requiring less management is therefore a daunting decision for many CIOs, even when putting faith in the cloud as a longer-term storage strategy.

At the same time, service providers offering cloud solutions need to be confident they can guarantee the performance of business-critical applications, as even the smallest amount of downtime or reduction in service level could have a massive impact on their clients. For instance, a provider who manages transactional software for a commercial bank or large utilities firm could see damages running into the millions if an outage lasted even a few hours.
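To put a rough number on that claim, the back-of-the-envelope calculation below shows how quickly outage costs compound for a transactional system. The transaction rate, revenue per transaction and SLA penalty are purely illustrative assumptions, not figures from the article.

```python
# Back-of-the-envelope outage cost for a transactional system.
# All figures are illustrative assumptions, not data from the article.
TRANSACTIONS_PER_HOUR = 50_000   # assumed volume for a mid-size bank
REVENUE_PER_TRANSACTION = 12.0   # assumed average value, in GBP
PENALTY_PER_HOUR = 100_000.0     # assumed contractual SLA penalty

def outage_cost(hours: float) -> float:
    """Lost transaction revenue plus SLA penalties for an outage."""
    lost_revenue = hours * TRANSACTIONS_PER_HOUR * REVENUE_PER_TRANSACTION
    penalties = hours * PENALTY_PER_HOUR
    return lost_revenue + penalties

for h in (1, 3, 6):
    print(f"{h} hour(s) down: £{outage_cost(h):,.0f}")
# Even at these modest assumptions, 3 hours of downtime exceeds £2m.
```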

IT decision makers should therefore shop around for a suitable cloud provider. In doing so, there are a number of key capabilities they should look for, including:

  • Built on an all-flash storage architecture – to enable the delivery of consistent application performance
  • True flexibility and scalability – to allow for linear, predictable performance gains as requirements increase
  • RAID-less data protection – to ensure predictable performance in any failure condition
  • Balanced load distribution – to eliminate unpredictable performance hot spots
  • Quality of Service control – to eliminate “noisy neighbour” issues and guarantee performance (see the sketch after this list)
  • Performance visualisation – to control performance independently of capacity, on demand
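The Quality of Service point is the most mechanical of these, so here is a minimal sketch of how per-volume QoS limits contain a noisy neighbour: each tenant's volume gets a guaranteed minimum and a hard maximum IOPS, and demand beyond the maximum is throttled rather than allowed to starve other tenants. The VolumeQoS class and all figures are hypothetical illustrations, not any storage vendor's actual API.

```python
# Minimal sketch of per-volume Quality of Service (QoS) control.
# The VolumeQoS class and all IOPS figures are hypothetical
# illustrations, not any storage vendor's actual API.
from dataclasses import dataclass

ARRAY_CAPACITY_IOPS = 100_000  # assumed total the array can deliver

@dataclass
class VolumeQoS:
    name: str
    min_iops: int  # guaranteed floor the array reserves under contention
    max_iops: int  # hard ceiling a noisy neighbour cannot exceed

    def grant(self, requested_iops: int) -> int:
        """Cap a tenant's demand at its ceiling; excess is throttled
        rather than allowed to starve other volumes."""
        return min(requested_iops, self.max_iops)

volumes = [
    VolumeQoS("tenant-a-db", min_iops=1_000, max_iops=15_000),
    VolumeQoS("tenant-b-web", min_iops=500, max_iops=5_000),
]

# Guaranteed floors must fit within what the array can actually
# deliver, otherwise the "guarantee" is fiction.
assert sum(v.min_iops for v in volumes) <= ARRAY_CAPACITY_IOPS

print(volumes[0].grant(80_000))  # -> 15000: the noisy tenant is clamped
print(volumes[1].grant(200))     # -> 200: normal demand passes through
```

The design point worth noting is that the ceiling and the floor do different jobs: the ceiling protects everyone else from one tenant, while the floor protects each tenant from everyone else.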

1&1 is an example of a cloud service provider that made the decision to switch to a next generation data centre model. The company moved to a new storage architecture to reduce operational management complexity and support the delivery of best-in-class services to its enterprise customers. It chose an all-flash array designed to guarantee performance to clients, a capability its previous system lacked.

At the heart of this issue, regardless of the current system being deployed to deliver cloud services, is the need for businesses to have an infrastructure that provides the right level of support for the applications they are running, be they business-critical or not. Predictability, visibility and the ability to control both capacity and performance easily and with agility will be the key differentiators in the marketplace.

This will make the cloud storage sector a particularly interesting one, not just in 2016, but also for the foreseeable future.