Looking to the ‘HyPE’ of cloud storage: How HPE is looking to help with hybrid cloud

Analysis Cloud storage is old hat, right? It's the simpler part of cloud and is, after all, just storage. So how can cloud storage be more interesting to explore and deliver greater value to the customer?

Having recently received an exclusive expert briefing from inside HPE, in my position as an industry cloud influencer, I found there is a powerful and relevant story to tell: one that forms the base of HPE's cloud strategy, and an opportunity that those working in cloud should be cognisant of.

We live in a time of cloud, and hybrid cloud is rapidly becoming the norm, if it is not already. As we approach 2020, about a third of companies' IT budgets is reportedly going towards cloud services, with, according to Forbes, around 83% of enterprise workloads expected to be in the cloud by the end of 2020. Of course, these will be spread across varying cloud form factors and offerings encompassing SaaS, PaaS and IaaS, and private, public and, of course, hybrid clouds.

Hybrid cloud is where the greatest challenge appears to lie. Whether by accident or by strategy, most firms are unable to align to a single cloud provider on one platform to meet all the needs of their business, users and customers. Much like the days when businesses mixed Unix, NetWare, LAN Manager and other operating systems to support the applications the business required, today's equivalent is the hybrid cloud environment.

Industry trends align to validate this, with Gartner reporting that 81% of public cloud users choose two or more providers, and Canalys taking this deeper, citing that Amazon, Microsoft and Google combined accounted for 57% of the global cloud computing market in 2018. The average business is running 38% of workloads in public clouds and 41% in private clouds, with hybrid cloud at a 58% adoption rate according to RightScale industry figures.

This growth of hybrid is driving an increasing challenge for the CTO/CxO: that of data portability. "How do I maintain resiliency and security across hybrid and multi-cloud environments and get the benefits of cloud with the on-premises values I enjoyed?" "How do I have storage in the cloud behave in a way I am used to from on premises?" The desire for consistency of data services, and the ability to move data back out of the cloud if and when wanted, are also key drivers.

We have seen cloud make it easy to spin up and speed forwards, with Agile and DevOps as attractive rewards. However, as customers' cloud usage has rapidly matured, the pressures of mobility and portability have driven greater demands for storage flexibility.

The customer focus when moving applications to the cloud has revolved mostly around selection of the compute platform, the lift and shift, leaving storage issues to rear their head later, with many experiencing the subsequent shock of cost and lock-in. We have also seen customers' maturing use of, and demands on, cloud platforms drive innovation in peripheral cloud services, as evidenced here in the area of storage.

So, what do you do when public cloud storage does not meet all your storage needs? Let's start with the offer of a true commercial utility-based model aligned with a focus on performance and storage needs. HPE is allowing you to abstract your data store from the public cloud in a storage-as-a-service offering that frees you from ties to any single public cloud. Put your data in, then decide which public cloud(s) you want to access the data set from, knowing you can move data in and out as you wish. The key is that storage becomes decoupled from compute, a positive step towards true portability across the major public cloud compute offerings.

Imagine combining public cloud compute, with its high compute SLA, with a data storage set carrying a 99.9999% SLA, and having the ability to easily switch compute providers if and when you choose, leaving the data set intact. Moving compute more easily between AWS, Azure and Google Compute is the panacea for many. In fact, in Turbonomic's 2019 State of Multicloud report, 83% of respondents expected workloads to eventually move freely between clouds. We are seeing the first steps here towards that expectation becoming a reality.

The clever part of the offering that will prove attractive here is the commercial model: one flat and clear billing model across all clouds, with no egress charges. Both technically and commercially, HPE Cloud Volumes is setting out to make the complex and critical needs of storage simple and increasingly affordable, flexible and, importantly, portable. Through this, HPE is setting out its stall to be a key cloud transformation partner for business.

HPE is stepping up its game through acquired technologies to service, support and supplement the needs of high-growth public cloud consumption. The offering will not be right for every customer in every public cloud, but for its specific use case it offers a valuable option. As would be expected, the offering is for block and not object storage, but it still addresses a large segment of the cloud workload storage requirements of most corporate entities.

The promise is portable cloud storage across compute platforms with on-the-fly commercial transparency. This removes the lock-in to any single public cloud offering such as AWS, Azure or Google Compute. You do, of course, tie your storage into HPE Cloud Volumes (although without the egress charges), but by making your storage cloud-agnostic you gain greater flexibility to mix, match and change between the major cloud platforms, something lacking for many today.

Are we going to see the question change from "where is my data?" to "where do you want it to be?" The HPE offering is one of portability and operability, bringing on-premises flexibility and security to cloud storage.

Separating storage from compute workloads is an enabling factor for the flexibility of moving between cloud compute offerings, whether for DR, for testing, or simply when a switch is wanted. To deliver a solution without introducing latency, HPE has had to co-locate its infrastructure with the mainstream public cloud providers. As would be expected, both Docker and Kubernetes are supported natively, key to making the offering fit the increasingly open DevOps world of public cloud.

This abstraction of storage is a smart presentation of value from HPE to the exploding cloud market and to customers' need for greater flexibility and portability. We should not forget that one of the drivers for cloud adoption is the capability to access data easily from anywhere at any time; indeed, according to a SysGroup study, "providing access to data from anywhere is the main reason for cloud adoption."

We also heard about InfoSight, the hidden gem in the HPE kingdom. In simple terms, this is an offering that uses AI on telemetry data to advise customers of a forthcoming issue, and what to do about it, before it has an impact. Apply this to Cloud Volumes and you have compounding value: maximising your storage when and where you need it, with maximum reliability and predictability.

Customers are seeking increased data mobility and portability, the panacea promise of cloud solutions, along with the ability to move to and from compute offerings from varying vendors quickly and easily. Excitingly, HPE has strategised that by 2020 everything it sells will be available "as a service". Do we see a new "HPEaaS" ahead? This will form a strong foundation for HPE to make a big noise alongside the explosive growth of the public cloud space, and it positions a much-needed new offering at the centre of the public cloud battle as it continues.
