Category archive: software defined storage

Software defined storage and security drive cloud growth, say studies

Data centre builders and cloud service developers are at loggerheads over their priorities, according to two new reports.

The explosive growth of modern data centres is being catalysed by new hyperconverged infrastructures and software defined storage, says one study. Meanwhile another claims that enthusiasm for cloud projects to run over this infrastructure is being suffocated by security fears.

A global study by ActualTech Media for Atlantis Computing suggests that a large majority of data centres are now using hyperconverged infrastructure (HCIS) and software defined storage (SDS) techniques in the race to build computing capacity. Of the 1,267 leaders quizzed in 53 countries, 71 per cent said they are using or considering HCIS and SDS to beef up their infrastructure. However, another study, conducted on behalf of hosting company Rackspace, found that security was the overriding concern among the parties who will use these facilities.

The Hyperconverged Infrastructure and Software-Defined Storage 2016 Survey shows there is much confusion and hype in these markets, according to Scott D. Lowe, a partner at ActualTech Media, who said there is not enough data available about real-world usage.

While 75 per cent of data centres surveyed use disk-based storage, only 44 per cent include it in their long-term infrastructure plans, and 19 per cent will ditch it for HCIS or SDS. These decisions are motivated by the need for speed, convenience and cost savings, according to the survey, with performance (72 per cent), high availability (68 per cent) and cost (68 per cent) as the top requirements.

However, the developers of software seem to have a different set of priorities, according to the Anatomy of a Cloud Migration study conducted for Rackspace by market researcher Vanson Bourne. The verdict from this survey group – 500 business decision makers rather than technology builders – was that security is the most important catalyst and can either speed up or slow down cloud adoption.

Security-related concerns featured in the top three motives named by the survey group. The biggest threat the group wanted to eliminate was escalating IT costs, named by 61 per cent. The next biggest threat they want to avert is downtime, with 50 per cent identifying a need for better resilience and disaster recovery from the cloud. More than a third (38 per cent) identified IT itself as a source of threats (such as viruses and denial of service) that they would want a cloud project to address.

“Cloud has long been associated with a loss of control over information,” said Rackspace’s Chief Security Officer Brian Kelly, “but businesses are now realising this is a misconception.”

Software-defined storage vendor Scality nabs $45m to prep for IPO

Scality has secured $45m in its latest funding round and plans to go public in 2017

Software-defined storage expert Scality has secured $45m in a funding round led by Menlo Ventures, which the company said will be used to fuel its North American and international expansion.

Scality’s offering uses object storage to abstract the underlying hardware into a single pool of storage that can be accessed through a wide range of protocols and technologies (SMB, Linux FS, OpenStack Swift, etc.).
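To make that abstraction concrete, here is a minimal sketch of what working with such a pool can look like from an application's point of view, assuming the storage is exposed through an S3-compatible gateway; the endpoint, bucket and credentials below are illustrative placeholders, not Scality-specific values.

```python
# Minimal sketch: talking to an object-storage pool through an S3-compatible
# endpoint. The endpoint URL, bucket and credentials are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://objectstore.example.com",  # hypothetical gateway address
    aws_access_key_id="ACCESS_KEY",
    aws_secret_access_key="SECRET_KEY",
)

# Store and retrieve an object; the underlying disks, nodes and replication
# are hidden behind the single logical pool.
s3.put_object(Bucket="backups", Key="db/dump-2016-01.sql.gz", Body=b"...")
obj = s3.get_object(Bucket="backups", Key="db/dump-2016-01.sql.gz")
print(obj["ContentLength"])
```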

The company, which offers storage software and has large reseller agreements in place with big box vendors like HP and Dell, has secured over $80m since its founding in 2009. It claims over 50 per cent of the server market is now reselling its SDS software.

“There’s no doubt in my mind that today, Scality is the biggest disruptor of the traditional storage industry, and I am extremely excited to witness their progression,” said Douglas C. Carlisle, managing director at Menlo Ventures.

“Their innovative storage model is meeting demand for scale like no other product on the market, and is poised to keep up with the steep incline in data volumes. With Jerome’s forward-thinking mindset, we expect to see Scality continue to be a trailblazer and to take its RING technology to the next level.”

The company has spent the better part of the past two years scaling up its operations in Asia and Europe, but it said the new funding will go towards bolstering its North American presence, with a view towards an IPO in 2017.

“Over the course of the last year-and-a-half, we’ve seen an unprecedented amount of funding given to software storage startups. At the same time, we’ve seen the traditional storage vendors lose market share, change leadership and shift their business model to mimic the software-defined strategy. This latest funding round comes at a time when Scality and the software-defined storage industry are poised to attract billions of dollars from customers that are rethinking their storage strategies,” said Jerome Lecat, chief executive at Scality.

“Our employees and partners believe in us, and the fact that this last funding round was done at 2x valuation speaks volumes about the overall confidence in the future of Scality. This new capital investment will allow us to massively boost our go-to-market, attract strategic new hires, continue to expand globally, and be primed for a successful IPO by 2017,” Lecat said.

Hedvig bags $18m for software-defined storage

Hedvig secured $18m this week, which will help fuel expansion of its software-defined storage offering

Distributed storage platform provider Hedvig has secured $18m in a round of funding the company said will be used to double down on development and expansion.

In the cloud space, storage heterogeneity can cause significant performance bottlenecks – particularly in the tightly integrated systems that many applications and services are becoming – and legacy datacentres are struggling to keep pace.

Hedvig, which came out of stealth earlier this year and was founded by former Facebook and Amazon NoSQL and storage specialist Avinash Lakshman (also the brains behind Cassandra), offers a highly scalable storage platform (block, file and object) that the company says provides fully programmable, highly granular storage provisioning – software-defined storage in other words.

The platform supports pretty much every hypervisor or Linux container service above it, and uses REST-based APIs so cloud users can tap into the platform in a fairly straightforward way.
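As an illustration only, a REST-driven provisioning call might look something like the sketch below; the endpoint path, payload fields and token are hypothetical placeholders rather than Hedvig's documented API.

```python
# Hypothetical sketch of provisioning a virtual disk over a REST API.
# The URL, JSON fields and token are illustrative placeholders only.
import requests

API = "https://storage-proxy.example.com/rest/v1"   # hypothetical endpoint
TOKEN = "example-session-token"

resp = requests.post(
    f"{API}/virtualdisks",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "name": "vm-datastore-01",
        "sizeGb": 500,
        "replicationFactor": 3,   # per-volume policy, set in software
        "deduplication": True,
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json())
```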

The investment round, which brings the total amount raised by the firm to just over $30m, was led by Vertex Ventures with participation from existing investors True Ventures and Atlantic Bridge. As part of the deal Vertex Ventures General Partner In Sik Rhee will be joining Hedvig’s board of directors.

“We’ve identified the potential in a broken and fragmented storage market, and are not only looking to bring software-defined storage mainstream, but fundamentally change how companies store and manage data,” Lakshman said.

“Riding the wave of momentum from our recent company launch, this new investment round further validates our technology and approach, and will fuel our unwavering commitment to be the leading force of innovation in software-defined storage.”

Hedvig’s success comes at a time of rising popularity of the concept of the software-defined datacentre, which sees the orchestration of almost everything – storage, compute, networking – through software.

2015 Predictions: Cloud and Software-Defined Technologies

As we kick off the new year, it’s time for us to get our 2015 predictions in. Today, I’ll post predictions from John Dixon around the future of cloud computing, as well as from our CTO Chris Ward about software-defined technologies. Later this week, we’ll get some more predictions around security, wireless, end-user computing and more from some of our other experts.

John Dixon, Director, Cloud Services
On the Internet of Things (IoT) and Experimentation…
In 2015, I expect to see more connected devices and discussion on IoT strategy. I think this is where cloud computing gets really interesting. The accessibility of compute and storage resources on the pay-as-you-go model supports experimentation with a variety of applications and devices. Will consumers want a connected toaster? In years past, companies might form focus groups, do some market research, etc. to pitch the idea to management, get funding, build a team, acquire equipment, then figure out the details of how to do this. Now, it’s entirely possible to assign one individual to experiment and prototype the connected toaster and associated cloud applications. Here’s the thing; the connected toaster probably has about zero interest in the market for consumer appliances. However, the experiment might have produced a pattern of a cloud-based application that authenticates and consumes data from a device with little or no compute power. And this pattern is perhaps useful for other products that DO have real applications. In fact, I put together a similar experiment last week with a $50 Raspberry Pi and about $10 of compute from AWS — the application reports on the temperature of my home-brew fermentation containers, and activates a heating source when needed. And, I did indeed discover that the pattern is really, really scalable and useful in general. Give me a call if you want to hear about the details!
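A rough sketch of the kind of loop such an experiment could use is below; the 1-Wire sensor path, CloudWatch namespace and GPIO pin are assumptions for illustration, not the exact setup described above.

```python
# Rough sketch of a Raspberry Pi fermentation monitor: read a 1-Wire
# temperature sensor, push the reading to AWS CloudWatch, and switch a
# heater relay when the wort gets too cold. Sensor path, GPIO pin and
# CloudWatch namespace are illustrative assumptions.
import glob
import time

import boto3
import RPi.GPIO as GPIO

HEATER_PIN = 17          # assumed relay pin
TARGET_C = 20.0          # fermentation target temperature
cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

GPIO.setmode(GPIO.BCM)
GPIO.setup(HEATER_PIN, GPIO.OUT)

def read_temp_c() -> float:
    """Read the first DS18B20 sensor on the 1-Wire bus."""
    device = glob.glob("/sys/bus/w1/devices/28-*/w1_slave")[0]
    with open(device) as f:
        raw = f.read()
    # The last line of the sensor output ends with 't=<millidegrees>'
    return int(raw.rsplit("t=", 1)[1]) / 1000.0

while True:
    temp = read_temp_c()
    cloudwatch.put_metric_data(
        Namespace="HomeBrew",                      # assumed namespace
        MetricData=[{"MetricName": "FermenterTempC", "Value": temp}],
    )
    GPIO.output(HEATER_PIN, GPIO.HIGH if temp < TARGET_C else GPIO.LOW)
    time.sleep(60)
```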

On the declining interest in “raw” IaaS and the “cloud as a destination” perspective…

I’ve changed my opinion on this over the past year or so. I had thought that the declining price of commodity compute, network, and storage in the cloud meant that organizations would eventually prefer to “forklift move” their infrastructure to a cloud provider. To prepare for this, organizations should design their infrastructure with portability in mind and NOT make use of proprietary features of certain cloud providers (like AWS). As of the end of 2014, I’m thinking differently on this — DO consider the tradeoff between portability and optimization, but… go with optimization. Optimization is more important than infrastructure portability. By optimization in AWS terms, I mean taking advantage of things like Auto Scaling, CloudWatch, S3, SQS, SNS, CloudFront, etc. Pivotal and Cloud Foundry offer similar optimizations. Siding with optimization enables reliability, performance, fault tolerance and scalability that are not possible in a customer-owned datacenter. I think we’ll see more of this “how do I optimize for the cloud?” discussion in 2015.
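As a small illustration of what “optimizing for the cloud” can mean in practice, the sketch below hands work off to managed AWS services (S3 for durable storage, SQS for decoupled queuing) rather than self-hosted, portable equivalents; the bucket and queue names are assumptions.

```python
# Minimal sketch of leaning on managed AWS services instead of portable,
# self-hosted equivalents: durable storage via S3 and decoupled work
# distribution via SQS. Bucket and queue names are illustrative assumptions.
import json

import boto3

s3 = boto3.client("s3")
sqs = boto3.client("sqs")

QUEUE_URL = sqs.get_queue_url(QueueName="image-resize-jobs")["QueueUrl"]

def submit_job(local_path: str, key: str) -> None:
    """Upload the source file to S3 and enqueue a job for the worker fleet."""
    s3.upload_file(local_path, "uploads-bucket", key)
    sqs.send_message(
        QueueUrl=QUEUE_URL,
        MessageBody=json.dumps({"bucket": "uploads-bucket", "key": key}),
    )

# A worker fleet behind an Auto Scaling group can poll the same queue,
# scaling on the queue's ApproximateNumberOfMessagesVisible CloudWatch metric.
```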

2015 predictions

Chris & John presenting a breakout session at our 2014 Summit Event

Chris Ward, CTO

On SDN…

We’ll see much greater adoption of SDN solutions in 2015. We are already seeing good adoption of VMware’s NSX solution in the second half of 2014 around the micro-segmentation use case. I see that expanding in 2015, along with broader use cases for both NSX and Cisco’s ACI. The expansion of SDN will drag with it an expansion of automation/orchestration adoption, as these technologies are required to fully realize the benefits of broader SDN use cases.

On SDS…

Software defined storage solutions will become more mainstream by the end of 2015. We’re already seeing a ton of new and interesting SDS solutions in the market and I see 2015 being a year of maturity. We’ll see several of these solutions drop off the radar while others gain traction, and I have no doubt it will be a very active M&A year in the storage space in general.


What do you think about Chris and John’s predictions?

If you would like to hear more from these guys, you can download Chris’ whitepaper on data center migrations and John’s eBook on the evolution of cloud.


By Ben Stephenson, Emerging Media Specialist