Accelerating cloud storage and reducing the effects of latency


Over a number of years there has been a long and hard-fought battle to secure the ability to ‘accelerate anywhere’ – to move any data type to, from and across a cloud area network (ClAN), whether to allow fast access to applications or to secure data as part of a back-up and archiving strategy. According to Claire Buchanan, chief commercial officer at self-configuring intelligent optimisation network (SCION) vendor Bridgeworks, this battle is still ongoing: with traditional WAN optimisation techniques, the long drawn-out battle has still to be won.

“That may not be the case for much longer, with the advent of machine intelligence and technologies such as SCION. The problem has been one of small pipes and the inability to accelerate data, so deduplication and compression tools have been the only way to gain a perceived performance improvement,” she explains.

With this in mind Tony Lock, programme director at analyst firm Freeform Dynamics, advises people to scrutinise the available WAN acceleration solutions closely against their business-level requirements for WAN performance. “They need to match them to what is currently being delivered, combined with assessing what improvements can be achieved,” he adds.

Yet larger network connections or ‘pipes’ of 100Mb/s, 1Gb/s and greater are becoming the norm. Buchanan therefore thinks that the main challenge has changed to one of how to fill those pipes in order to maximise the utilisation of the network, rather than minimising the amount of data sent. “With SCION this can be achieved and the network performance battle can be won,” she claims. With SCION, she argues, the traditional problems relating to WANs are flipped on their head, because the technology works “sympathetically with TCP/IP in tandem with its strengths whilst overcoming its greatest inhibitor – latency.”

Mitigating latency

Mitigating latency is a crucial challenge because latency slows the transmission of data to and from public, private and hybrid clouds, and it can make back-up and disaster recovery more challenging than they need to be. Buchanan argues that this can be resolved by radically reducing the effects of latency: performance and utilisation can climb beyond 90%, allowing data to move close to the maximum capability of an organisation’s bandwidth. This in turn makes it easier for customers to move data to the cloud, and gives them the ability to spin servers up and down at will, where it is deemed appropriate.
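The arithmetic behind this helps explain why latency, rather than bandwidth, is so often the bottleneck. A minimal sketch of the classic single-stream TCP ceiling – throughput is bounded by window size divided by round-trip time – using illustrative figures, not Bridgeworks measurements:

```python
# Illustrative single-stream TCP throughput ceiling:
# throughput <= window_size / round_trip_time.
# All figures here are examples, not vendor measurements.

def max_tcp_throughput_bps(window_bytes: float, rtt_seconds: float) -> float:
    """Upper bound on one TCP stream's throughput, in bits per second."""
    return window_bytes * 8 / rtt_seconds

# A default 64 KiB window over an 80 ms long-haul round trip:
window = 64 * 1024          # bytes
rtt = 0.080                 # seconds
bps = max_tcp_throughput_bps(window, rtt)

print(f"Ceiling: {bps / 1e6:.1f} Mb/s")           # ~6.6 Mb/s
print(f"1 Gb/s pipe utilisation: {bps / 1e9:.1%}")  # under 1%
```

On these assumed numbers a single stream cannot exceed roughly 6.6 Mb/s, no matter how large the pipe – which is the utilisation gap that acceleration techniques target.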

Lock adds: “Everything depends on the nature of the applications and business services being run over the communications links, and so the tip is to ensure that you really understand what is needed to warrant that the business gets the service quality it needs from the use of the application.” He says it is also essential to make sure that IT understands how to manage and administer it over its working life, which could be over the course of many years. “The key is to put in place good management tools and processes – especially if these are new operations to IT,” he suggests.

Data deduplication

In many cases machine learning technologies such as SCION will limit the need for human intervention and enable more efficient management of network performance. Yet Buchanan says deduplication has traditionally been an “acceptable way to move data where pipe size is a limitation and chatty protocols are the order of the day, but it is heavy on computational power, memory and storage.” She therefore advises organisations to ask some questions:

  • What is the hidden cost of WAN optimisation? What is the cost of the kit to support it? As a technology starts to peak at, for example, 1Gb/s, you have to look at the return on investment. With deduplication you have to look at the point where the technology tops out: performance flattens off and the cost-benefit ratio weakens. Sometimes it’s better to take a larger pipe with different technology to get better performance and ROI.
  • Are the traditional WAN optimisation vendors really offering your organisation what it needs? Vendors other than the traditional WAN optimisation vendors are increasingly using deduplication and compression as part of their offerings. As it is not possible to dedupe data that has already been deduped, the traditional WAN optimisation tools simply pass that data through untouched, delivering no performance improvement.
  • What will traditional WAN optimisation tools become in the new world of larger pipes?  Lock adds that “data deduplication is now reasonably mature, but IT has to be comfortable that it trusts the technology and that the organisation is comfortable with the data sets on which it is to be applied.” He also says that there are some industries that may require a sign-off by auditors and regulators on the use of deduplication on certain data sets.
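To illustrate why deduplication is “heavy on computational power, memory and storage”, here is a toy block-level deduplication sketch – a deliberately simplified illustration, not any vendor’s implementation. Every block of the stream must be hashed (CPU cost), and an index of every unique block must be held (memory and storage cost):

```python
import hashlib

def dedupe(stream: bytes, block_size: int = 4096):
    """Toy block-level deduplication. Each block is hashed (CPU cost);
    each unique hash is kept in an index (memory cost)."""
    index = {}    # hash -> position in the unique-block store; grows with unique data
    layout = []   # sequence of block ids needed to reconstruct the stream
    blocks = []   # the unique blocks actually stored or sent
    for i in range(0, len(stream), block_size):
        block = stream[i:i + block_size]
        digest = hashlib.sha256(block).hexdigest()
        if digest not in index:
            index[digest] = len(blocks)
            blocks.append(block)
        layout.append(index[digest])
    return blocks, layout

# Highly repetitive input: four 4 KiB blocks, only two of them distinct.
data = b"A" * 8192 + b"B" * 4096 + b"A" * 4096
blocks, layout = dedupe(data)
print(len(blocks), layout)   # 2 unique blocks, layout [0, 0, 1, 0]
```

The saving depends entirely on how repetitive the data is – and, as the article notes, already-deduped or encrypted data yields nothing while still paying the hashing cost.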

Fast restoring

Organisations that want to fast-restore encrypted data from off-site facilities need to consider the network delays caused by latency. “This is coloured by IT executives’ thinking with regards to the location of their secondary and tertiary datacentres, and so they have sought to minimise time and perceived risk by locating their datacentres within the circle of disruption,” says Buchanan.

She adds that latency, as measured in milliseconds, is normally a reflection of distance, although this apparently isn’t always the case, depending on the network. The laws of physics don’t allow latency to be eliminated, but its effects can be mitigated with SCION technologies. She argues that SCION can move encrypted data just as fast as anything else because it doesn’t touch the data and is therefore data agnostic.
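The physics floor is easy to quantify. Light in optical fibre travels at roughly two-thirds of its speed in a vacuum, so one-way distance puts a hard minimum on round-trip time that no optimisation can remove – which is why datacentre distance matters. The distances below are illustrative:

```python
# Rough lower bound on round-trip latency over fibre.
# Light in glass travels at roughly (2/3) * 300,000 km/s,
# i.e. about 200 km per millisecond. Distances are illustrative.

C_FIBRE_KM_PER_MS = 200

def min_rtt_ms(distance_km: float) -> float:
    """Physics floor on round-trip time for a given one-way distance."""
    return 2 * distance_km / C_FIBRE_KM_PER_MS

for km in (50, 500, 5000):   # inside vs well outside a 'circle of disruption'
    print(f"{km:>5} km -> at least {min_rtt_ms(km):.1f} ms RTT")
```

Real-world latency is higher still once routing, queuing and equipment delays are added; the point is that distance alone already sets the floor.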

Lock advises that there are many factors to consider, such as the location of the back-up data and the resources available (network, processors, storage platforms and so on) to perform the restoration of the encrypted data. “However, the long-term management of the encryption keys will certainly be the most important factor, and it’s one that can’t be overlooked if the organisation needs large-scale data encryption,” he explains.

With regards to SCION he says that traditional WAN networks have been static: “They were put in place to deliver certain capacities with certain latency, but all resilience and performance capabilities were designed up-front, and so the ideas behind SCION – making networks more flexible and capable of resolving performance issues automatically by using whatever resources are available to the system, not just those furnished at the outset – are an interesting divergence.”

Differing approaches

According to Buchanan the traditional premise has been to reduce the amount of data to send. “In contrast SCION comes from the premise of acceleration, maximising the efficiency of the bandwidth to achieve its ultimate speed”, she explains.

In her opinion the idea is that by parallelising data across virtual connections, filling the pipes, and then using machine intelligence to self-configure, self-monitor and self-manage the data from ingress to egress, SCION ensures optimal performance, optimal utilisation and the fastest throughput possible.
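That parallelisation premise can be sketched numerically. A single TCP stream is capped at roughly its window size divided by round-trip time, so n parallel connections multiply that ceiling until the pipe itself saturates. This is an illustrative model with assumed figures, not Bridgeworks’ actual algorithm:

```python
# Illustrative model of parallel-stream acceleration: n streams, each
# capped at window/RTT, aggregate until the link saturates.
# Parameters are assumptions for the sketch, not vendor figures.

def aggregate_throughput_bps(n_streams: int, window_bytes: float,
                             rtt_s: float, link_bps: float) -> float:
    per_stream = window_bytes * 8 / rtt_s      # single-stream ceiling
    return min(n_streams * per_stream, link_bps)

link = 1e9            # a 1 Gb/s pipe
window = 64 * 1024    # 64 KiB window
rtt = 0.080           # 80 ms round trip

for n in (1, 16, 64, 256):
    bps = aggregate_throughput_bps(n, window, rtt, link)
    print(f"{n:>3} streams: {bps / 1e6:7.1f} Mb/s "
          f"({bps / link:.0%} utilisation)")
```

Under these assumptions one stream uses under 1% of the pipe, while a few hundred parallel streams saturate it – the "filling the pipes" effect described above. The machine-intelligence part of the claim is choosing and tuning that parallelism automatically.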

Cloud: Nothing special

Both Lock and Buchanan agree that there is nothing special about the cloud. In Buchanan’s view it’s just one more choice that’s available to CIOs within their global strategy. “From a data movement perspective the fact remains that whatever strategy is chosen with regards to public, private or hybrid cloud, the underlying and fundamental problem remains – that being how to get your data to and from whichever location you have chosen without impediment”, she explains.

She adds that IT is under pressure to deliver a myriad of initiatives, whether that is cloud or big data, IoT or digital transformation: “Couple that with the data deluge that we are experiencing as shown by IDC’s prediction that there will be 40 ZB of data by 2020, and so there is a high mountain to climb.” For this reason she argues that organisations need to find smart ways to do things. This is crucial if organisations are going to be able to deliver better and more efficient services over the years to come. It’s time for new approaches to old problems.

Become smarter

Buchanan believes most of the innovation is coming from SMEs rather than large corporate enterprises. “Small companies are doing really clever things that flip old and established problems on their heads, and this kind of innovation only really comes from SMEs that are focused on specific issues – and, as we all saw in 2008 with Lehman Brothers, long gone are the days when being big meant you were safe,” she argues.

She therefore concludes that CFOs and CIOs should look at SCION solutions such as WANrockIT from several angles, starting with cost optimisation by doing more with their existing pipes: connectivity expansion should only occur if it’s absolutely necessary. With machine intelligence it’s possible to reduce staffing costs too, because SCIONs require no manual intervention. SCION technology can also enable organisations to locate their datacentres for co-location or cloud anywhere, without being held back by the negative effects of network latency.

In fact a recent test by Bridgeworks involving 4 x 10Gb connections showed data moving at 4.4GB per second, equating to 264GB per minute or 15,840GB per hour. SCION therefore opens up a number of opportunities for CFOs and CIOs: in essence they will gain better service at a lower cost. However, Lock concludes that CFOs should not investigate this kind of proposition alone. The involvement of IT is essential to ensure that business and service expectations are met from day one of the implementation of these technologies. By working together, CFOs and CIOs will be able to accelerate cloud storage by mitigating latency.
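As a quick sanity check, the quoted test figures are a straightforward unit conversion; the utilisation percentage below is derived here for illustration and was not quoted by Bridgeworks:

```python
# Unit-conversion check on the quoted Bridgeworks test figures.
# Only 4.4GB/s, 264GB/min and 15,840GB/hr come from the article;
# the utilisation percentage is derived here.

gb_per_s = 4.4
print(f"{gb_per_s * 60:.0f} GB per minute")    # 264
print(f"{gb_per_s * 3600:.0f} GB per hour")    # 15840

# 4 x 10 Gb/s = 40 Gb/s line rate, i.e. 5 GB/s at 8 bits per byte:
line_rate_gB_per_s = 4 * 10 / 8
print(f"{gb_per_s / line_rate_gB_per_s:.0%} of line rate")  # 88%
```

In other words, 4.4GB/s on a 40Gb/s aggregate is close to the "beyond 90%" utilisation claimed earlier in the article.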