Google Cloud Platform (GCP) has developed a software service to help organisations handle massive data transfers between on-premises locations and the cloud faster and more efficiently than existing tools.
The tool is designed for organisations that need to carry out large-scale data transfers, in the region of billions of files or petabytes of data, from physical sites to Google Cloud Storage in one fell swoop.
GCP’s Transfer Service for on-premises data, released in beta, also allows businesses to move files without writing their own transfer software or paying for a third-party transfer platform.
Google claims custom software options can be unreliable, slow and insecure as well as being difficult to maintain.
Businesses use the service by installing an agent, which runs as a Docker container on Linux, on their data centre machines; the service then co-ordinates the agents to transfer data securely to Google Cloud Storage.
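In practice, enrolling a machine amounts to starting the agent container on each server that can reach the source data. The sketch below is illustrative only: the image name, mount paths and flag are placeholders standing in for the values given in Google's documentation, not the service's actual parameters.

```bash
# Illustrative sketch only: AGENT_IMAGE, the mount paths and the flag below
# are placeholders, not the documented invocation for the transfer agent.

# Mount the directory holding the source data so the agent can read it,
# and pass a service-account key for authentication (hypothetical paths).
sudo docker run -d \
  -v /mnt/source-data:/mnt/source-data \
  -v /opt/agent/creds.json:/opt/agent/creds.json \
  AGENT_IMAGE --project-id=my-gcp-project
```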
The system makes the transfer process more efficient by validating the integrity of the data in real time as it moves to the cloud, with each agent using as much of the available bandwidth as possible to reduce transfer times.
The data transfer service is a larger-scale counterpart to tools such as gsutil, Google's command-line utility for Cloud Storage, which is not built to cope with the scale of data that Transfer Service has been designed to handle.
The firm has recommended that only businesses with a network speed faster than 300Mbps use its Transfer Service, with gsutil sufficing for those with slower speeds.
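For the smaller transfers where gsutil remains the better fit, a parallel recursive copy is typically all that is needed; the bucket and directory names below are hypothetical.

```bash
# Copy a local directory tree into a Cloud Storage bucket in parallel (-m).
# Bucket and path names are hypothetical.
gsutil -m cp -r /data/archive gs://example-bucket/archive

# On repeat runs, rsync only uploads what has changed at the destination.
gsutil -m rsync -r /data/archive gs://example-bucket/archive
```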
Customers also need a Docker-supported 64-bit Linux server or virtual machine that can access the data to be transferred, as well as a POSIX (Portable Operating System Interface)-compliant source file system.
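Before enrolling a machine it is worth confirming it meets those requirements; the checks below are ordinary Linux and Docker commands rather than anything specific to the service.

```bash
# Confirm the host is 64-bit Linux and that Docker is installed and reachable.
uname -sm                                      # expect "Linux x86_64"
docker --version                               # any recent Docker release
docker info > /dev/null && echo "Docker daemon reachable"
```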
The product is aimed squarely at enterprise users, and comes several weeks after the company announced a set of migration partnerships targeting customers running workloads from the likes of SAP, VMware and Microsoft.