Is the cloud the next thing for long-term data retention? Looking at the key vendors in the space

Organisations in this era have realised how critical data is to their business needs and operations.

An enormous amount of data has already been produced since cloud computing disrupted organisations of every type, be it education, finance, healthcare or manufacturing. Today, organisations are increasingly concerned about the data that has accumulated over the last 15 to 20 years with the surge in IT infrastructure.

This data and these applications are probably no longer in active use, but they remain important because they contain critical information with compliance requirements attached. Securing old data (unstructured content, applications, virtual machines) is becoming crucial, so organisations need a cost-effective, reliable archiving solution that stores and secures data while still allowing rapid access when needed.

In the past, IT teams saved data to tape drives or on-premises data centres without any filtering, but data demands have changed drastically.

Even more data will be produced over the next five to seven years as more digitally connected devices become part of business operations. Data will be the fuel of any business: companies will extract analytical insight from it to get ahead of the competition or to stay aligned with consumer demand. This digital transformation is not just about acquiring new technology; it is also about saving CAPEX and OPEX each time the data centre moves forward with new innovations.

As data grows, edge computing architectures will bring data centre systems closer to digital devices for processing (machine learning, analysis), so that only a small subset of information is pushed to the cloud or a private data centre.
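
To make that pattern concrete, here is a minimal, vendor-neutral sketch in Python of an edge node that processes raw readings locally and forwards only a compact summary; the readings, threshold and summary fields are assumptions for illustration.

```python
import statistics

# Hypothetical raw readings collected at the edge (e.g. from connected sensors).
readings = [21.4, 21.9, 22.1, 35.7, 21.8, 22.0]

def summarise(values, anomaly_threshold=30.0):
    """Reduce a raw stream to the small summary that would be pushed upstream."""
    return {
        "count": len(values),
        "mean": round(statistics.mean(values), 2),
        "max": max(values),
        # Flag anomalies locally instead of shipping every data point to the cloud.
        "anomalies": [v for v in values if v > anomaly_threshold],
    }

# Only this summary (not the full stream) would be sent to the cloud or private data centre.
print(summarise(readings))
```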

How will organisations deal with past data when real-time data will also need to be archived for reference? How will they handle data in a hybrid-cloud or multi-cloud model, where private and public clouds are used for different processing purposes? Will automation be available to constantly sync data according to the archival methods built into an archival strategy? And what about security against external breaches or physical damage to archival systems?

Various vendors have developed solutions to address these needs, so organisations can choose one that fits their requirements and can be customised to their budget. In this post, I take a look at data archival solutions from leading vendors Rubrik, Cohesity and Zerto. Let’s evaluate their solutions.

Cohesity: Enterprise-grade long-term retention and archival

Cohesity’s solution lets you leverage both cloud and tape to archive data according to the organisation's requirements. In what Cohesity calls its cloud-native approach, data can be archived not only to tape but also to public clouds, private clouds, Amazon S3-compatible devices and QStar-managed tape libraries. The solution enables IT teams to define workflow policies for automated backup and archival, and it consists of two Cohesity products: Cloud Archive and Data Protect.

Cloud Archive lets organisations leverage the public cloud for long-term data retention, while Data Protect helps reduce long-term retention and archival costs with its pay-as-you-go pricing model.
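
As a rough illustration of what such policy-driven automation captures (this is not Cohesity's API, just a hypothetical sketch with assumed field names and an assumed archive target), a policy typically ties together backup frequency, local retention and the long-term archive tier:

```python
from dataclasses import dataclass

# Hypothetical illustration only: the kind of information a policy-driven
# backup/archive workflow typically captures. Field names and the target
# bucket below are assumptions, not a real vendor schema.

@dataclass
class ArchivalPolicy:
    name: str
    backup_every_hours: int   # how often local backups are taken
    keep_local_days: int      # short-term retention on the primary system
    archive_after_days: int   # when data is tiered out to the archive target
    archive_target: str       # e.g. an S3-compatible bucket or a tape library
    keep_archive_years: int   # long-term retention for compliance

def due_for_archive(age_days: int, policy: ArchivalPolicy) -> bool:
    """Return True when a backup is old enough to be moved to the archive target."""
    return age_days >= policy.archive_after_days

finance_policy = ArchivalPolicy(
    name="finance-records",
    backup_every_hours=24,
    keep_local_days=30,
    archive_after_days=30,
    archive_target="s3://corp-archive/finance",  # assumed bucket name
    keep_archive_years=7,
)

print(due_for_archive(age_days=45, policy=finance_policy))  # True
```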

Rubrik: Data archival

Rubrik’s solution supports organisations in managing data across hybrid cloud environments. Organisations can choose their storage and architecture from options including the following (a minimal archive-to-cloud sketch appears after the list):

  • Archive to Google Cloud Storage
  • VMware vSphere, Nutanix AHV, and Microsoft Hyper-V
  • Microsoft SQL Server
  • Oracle, Linux, Windows, UNIX, and NAS
  • Remote and branch offices

The client uses real-time predictive global search to access archived data: files from the archive appear as you type in the search box, which drastically reduces the time needed to reach them. It is also possible to instantiate VMs directly in the cloud with Rubrik's solution.

Data deduplication further reduces transfer and storage costs, and all data is encrypted before being sent from physical devices to the target storage infrastructure. Users are presented with a simple, responsive HTML5 interface for setting up policy-driven automation and choosing an archival target.
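
To show the underlying idea of deduplication (store each unique chunk once and reference it by hash), here is a simplified sketch; it is not Rubrik's implementation, and the fixed 4 MiB chunk size is an assumption. The point is simply that duplicate content does not have to be transferred or stored twice.

```python
import hashlib

CHUNK_SIZE = 4 * 1024 * 1024  # 4 MiB fixed-size chunks; an assumption for this sketch

def dedupe_store(paths, store=None):
    """Store each unique chunk once, keyed by its SHA-256 digest, and keep a
    per-file list of chunk references: the core idea behind deduplication."""
    store = {} if store is None else store
    manifests = {}
    for path in paths:
        refs = []
        with open(path, "rb") as f:
            while chunk := f.read(CHUNK_SIZE):
                digest = hashlib.sha256(chunk).hexdigest()
                store.setdefault(digest, chunk)  # duplicate chunks are stored only once
                refs.append(digest)
        manifests[path] = refs
    return store, manifests

# Example (hypothetical file names):
# store, manifests = dedupe_store(["backup-monday.img", "backup-tuesday.img"])
```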

Zerto: Zerto Virtual Replication

Zerto approaches data archival differently from Rubrik and Cohesity: archival is handled through a feature of its main software, Zerto Virtual Replication. With this feature, it is possible to take daily, weekly and monthly backups of the data to be archived. The archival target can be tape, a network share in a third location, a dedicated disk-based backup device, or even inexpensive S3 storage in AWS or Blob Storage in Azure.

The latest release supports continuous data protection (CDP), replication, automated orchestration and long-term retention with offsite backup. A Journal File Level Recovery mechanism is used to restore backed-up data quickly.
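
As a generic illustration of the inexpensive-S3-target idea (not Zerto's own implementation), the sketch below copies a backup file to Amazon S3 with the AWS SDK for Python (boto3); the bucket name, key and the choice of the DEEP_ARCHIVE storage class are assumptions for the example.

```python
# Generic illustration only: copying a periodic backup to low-cost S3 storage
# with boto3. Bucket name, key and storage class are assumptions.
import boto3

def archive_backup_to_s3(local_path: str, bucket: str, key: str) -> None:
    s3 = boto3.client("s3")
    s3.upload_file(
        local_path,
        bucket,
        key,
        ExtraArgs={"StorageClass": "DEEP_ARCHIVE"},  # lowest-cost tier for long-term retention
    )

# Example (hypothetical names):
# archive_backup_to_s3("weekly-backup-2019-03-10.tar.gz", "corp-archive", "weekly/2019-03-10.tar.gz")
```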

Conclusion

Beyond Rubrik, Cohesity and Zerto, other vendors offer different types of solutions for different workloads and diverse requirements. But these three can handle most new-age workloads, such as data generated by IoT devices, machine learning analysis data and unstructured big data lakes.

As organisations evaluate new technologies for dealing with data, a proper archival or long-term retention solution will help them get the most out of past data and allow them to focus on newly generated data. From this evaluation, it is clear that most vendors are focused on utilising public cloud or hybrid cloud environments for long-term archival. A hybrid cloud approach means the private cloud can store data bound by the compliance and security norms critical to the organisation. Ultimately, it is up to each organisation to decide which solution to go with, as there are good options available.


