Santa Clara-based start-up Cohesity claims it can drastically reduce the escalating costs of secondary storage.
The new Cohesity Data Platform achieves this, the company reckons, by consolidating diverse backup, archive, test and development, and replication systems onto a single, scalable platform.
In response to feedback from early adopters, it has now added site-to-site replication, cloud archive, and hardware-accelerated, 256-bit encryption to version 2.0 of the Data Platform (DP).
The system tackles one of the by-products of the proliferation of cloud systems: the creation of fragmented data silos. These silos are the after-effects of the rapid, unstructured growth of IT, which led to the adoption of endless varieties of individual systems for backup, file services, analytics and other secondary-storage use cases. By unifying them, Cohesity claims it can cut the storage footprint of a data centre by 80%. It promises an immediate, tangible return on investment by removing the need for separate backup products.
Among the time-saving features added to the system are automated virtual-machine cloning for testing and development and a new public cloud archival tier. The latter gives enterprise users the option of spilling over their least-used data to Google Cloud Storage Nearline, Microsoft Azure, and Amazon S3 and Glacier in order to cut costs. The Cohesity Data Platform 2.0 also provides ‘adaptive throttling of backup streams’, which minimises the load that backup places on the production infrastructure.
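To illustrate the idea behind such a cold-data tier, the short Python sketch below shows one way a least-used policy could pick archive candidates; the names, thresholds and logic are hypothetical illustrations, not Cohesity's actual interface.

    # Hypothetical cold-data policy: files untouched for longer than the
    # policy window become candidates for a cheaper cloud archive tier.
    from dataclasses import dataclass
    from datetime import datetime, timedelta

    @dataclass
    class FileRecord:
        path: str
        size_bytes: int
        last_accessed: datetime

    def select_for_archive(files, cold_after_days=90):
        # Return the files whose last access falls outside the policy window.
        cutoff = datetime.utcnow() - timedelta(days=cold_after_days)
        return [f for f in files if f.last_accessed < cutoff]

    if __name__ == "__main__":
        now = datetime.utcnow()
        catalogue = [
            FileRecord("/vm/dev-clone-01.vmdk", 40 * 2**30, now - timedelta(days=3)),
            FileRecord("/archive/q1-logs.tar", 12 * 2**30, now - timedelta(days=200)),
        ]
        for record in select_for_archive(catalogue):
            # A real platform would move the object to a configured archive
            # target such as S3, Glacier, Azure or Nearline at this point.
            print("archive candidate:", record.path)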
“We manage data sprawl with a hyperconverged solution that uses flash, compute and policy-based quality of service,” said Cohesity CEO Mohit Aron.