All posts by billbalmer

Virtualised encryption: How it could be the killer app for NFV – and help with GDPR too

When it comes to meeting the new requirements of the EU’s General Data Protection Regulation (GDPR), cloud users could well have the advantage. That may sound counter-intuitive, since one of the biggest hurdles facing companies that have migrated to the cloud is exposure of data: the perimeter is no longer a firewall at the company edge.

GDPR is set to take effect on May 25, 2018. The end goal of this legislation is to strengthen and enforce the security of personal data. The new legal framework dictates that any company doing business in the EU, with EU citizens, or that is EU-based will be held liable for any breach of data. GDPR therefore impacts virtually any company across the globe that does business in Europe, and it will likely become the de facto standard for the care and management of customer data moving forward.

Fines for not adhering to the requirements can reach 20m EUR (around $24m) or 4% of annual global revenue, whichever is greater. The mitigating circumstance is that a company that has encrypted its data in motion will not be held liable if that data is then hacked. Let me repeat: securing data that goes into the cloud through encryption will be a critical piece of GDPR compliance.
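To make the exposure concrete, here is a back-of-envelope sketch in Python of the “whichever is greater” rule. The function name and figures are illustrative only, using the 20m EUR cap and a 4% revenue share:

```python
# Illustrative only: GDPR fines are the greater of a fixed cap or a
# share of annual global revenue; figures here follow the article.
FIXED_CAP_EUR = 20_000_000   # EUR 20m cap
REVENUE_SHARE = 0.04         # 4% of annual global revenue

def max_fine_eur(annual_global_revenue_eur: float) -> float:
    """Maximum possible fine: whichever of the two figures is greater."""
    return max(FIXED_CAP_EUR, REVENUE_SHARE * annual_global_revenue_eur)

# For a EUR 1bn company, the 4% share (EUR 40m) exceeds the EUR 20m cap;
# for a EUR 100m company, the fixed cap dominates.
print(max_fine_eur(1_000_000_000))
print(max_fine_eur(100_000_000))
```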

Ultimately, the clear majority of businesses worldwide will now need to make sure their networks are encrypted. In recent weeks, major players such as Amazon and Microsoft have indicated they will endeavor to comply with the new regulation.

So, what can cloud providers do to tackle GDPR compliance? The answer is virtualised network services.

Traditionally, service providers would deploy appliances built for a single function, such as firewalls and routers. For encryption, these appliances often provide only basic capability on low-speed links. For higher speeds or better performance, specialised hardware is necessary, but this approach adds significant cost to the equipment and requires a vendor-specific endpoint to match the proprietary hardware encryption.

Low-cost universal customer premises equipment (uCPE) allows service providers to remotely set up software services such as encryption, firewalls, and routers with zero-touch provisioning. These virtualised network functions (VNFs) can be downloaded and configured remotely. The uCPE itself is an off-the-shelf server supplied by the service provider or the customer. Using off-the-shelf devices instead of purpose-built appliances drastically reduces the cost of premises equipment, as does replacing appliances with software functions, since one server can run many different functions.
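As a rough illustration of the model (the class and field names here are hypothetical, not any vendor's orchestration API), a uCPE can be thought of as a generic server hosting an ordered chain of remotely deployed VNFs:

```python
# Hypothetical sketch of a uCPE running an ordered VNF service chain.
# Names and structure are illustrative, not a real orchestrator API.
from dataclasses import dataclass, field

@dataclass
class VNF:
    name: str     # e.g. "router", "firewall", "encryptor"
    config: dict  # pushed remotely via zero-touch provisioning

@dataclass
class UCPE:
    site: str
    chain: list = field(default_factory=list)  # ordered service chain

    def deploy(self, vnf: VNF) -> None:
        """Remotely download and configure a function on the shared server."""
        self.chain.append(vnf)

box = UCPE(site="branch-42")
box.deploy(VNF("router", {"wan": "dhcp"}))
box.deploy(VNF("firewall", {"default": "deny"}))
box.deploy(VNF("encryptor", {"cipher": "AES-256-GCM"}))
print([v.name for v in box.chain])  # ['router', 'firewall', 'encryptor']
```

The point of the sketch is that one generic box carries many functions, and adding encryption is just appending another entry to the chain rather than shipping a new appliance.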

Relying on apps to handle encryption is not enough; as recently as 2016, a study by Blue Coat Systems, Inc. showed that 98% of the 15,000 apps surveyed would not meet GDPR requirements. Even if vendors or developers claim that their products do, how does an IT department test this premise? Instead of relying on appliance manufacturers to protect data in motion, it is now possible for specialised security companies to create virtualised encryption VNFs to add to the service chain of functions. Service providers can remotely deploy virtualised encryptors to protect data-in-motion from the client's site right to the server in a data centre. Encrypting the stream to the cloud was a recommendation given at the RSA conference in San Francisco earlier this year.

Virtualisation also addresses scaling, whether for a small company with limited IT resources that would otherwise have to outsource the project, or for a large corporation with many branch sites. The step-and-repeat nature of virtualisation keeps costs predictable as the project scales. Time to deploy is a scaling issue too: with zero-touch provisioning, the process can be automated so that going from software download to commissioning takes as little as 30 minutes.

Most cloud deployments are not limited to a single cloud application; a recent RightScale State of the Cloud Report found that 85% of enterprises have a multi-cloud strategy. With virtualised encryption, it’s now possible to encrypt in a hybrid cloud with different flows going to different cloud providers. All of the encryption can be managed by a unified key system, controlled by the service provider, the customer, or both.
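A unified key system for multi-cloud flows might be sketched like this (purely illustrative names; a real deployment would use a hardened key manager, not in-memory Python):

```python
# Hypothetical sketch: one key manager issues a distinct 256-bit key per
# cloud-bound flow, so each provider's traffic is separately encrypted
# while keys remain under a single point of control.
import secrets

class KeyManager:
    def __init__(self) -> None:
        self._keys: dict[str, bytes] = {}

    def key_for(self, flow: str) -> bytes:
        """Issue a new per-flow key, or return the existing one."""
        return self._keys.setdefault(flow, secrets.token_bytes(32))

km = KeyManager()
flows = {"crm": "cloud-a", "backup": "cloud-b", "analytics": "cloud-c"}
keys = {flow: km.key_for(flow) for flow in flows}
# Each flow gets its own key; asking again returns the same key.
print(len(set(keys.values())))  # 3
```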

And the quality of the protection? While IPSec has long been the standard for data transport, it performs poorly in software form. IPSec is a framework of open standards that has traditionally secured tunnels between VPN endpoints at the IP layer. Standardised in 1995, it is noted for its complexity, intensive CPU requirements and latency. Many companies have tried to compensate with specialised proprietary IPSec hardware. While hardware improves performance, it forces vendor dependency. Now, however, we no longer have to rely on this standard and can instead use 21st-century, government-grade software solutions. The best part is that the cost is a fraction of hardware encryption.

Agility, scaling, and flexibility are the three tenets of network functions virtualisation, and virtualised encryption covers all three. Maybe this is the NFV killer app that service providers have been looking for to create new revenue-bearing services. In the ever-changing threat environment for data-in-motion, virtualised encryptors, through the nature of software templates, offer modern techniques for upgrading technology and scaling to meet new and changing customer requirements, thus offering the best means of ensuring GDPR compliance.

Why the data protection world is a vampire – and what to do about it

“The world is a vampire.” The words to the Smashing Pumpkins song kept coming to mind as I listened to the discourse of a leading tier one carrier CISO.

I was in New York City attending one of the technical seminars that constantly come up projecting what the new New Thing is going to be. This rendition was on the insecure world of data flowing across networks. I mused that this time there was some fire to go along with the smoke. Snowden had just come on to the world stage and shone light on all that NSA and other nation states could and were doing to gain access to sensitive data, both personal and commercial.

To those of us in the network transmission business, the idea of securing network connections is nothing new. For many years we’ve used industry standard techniques to secure connections as data travels across insecure networks. The disturbing revelations coming out of the Snowden documents were not that data had been hacked but that the tools we had come to rely on for so many years could no longer be guaranteed safe.

Adding to the alarm was the fact that we were moving into the cloud, where the traditional castle perimeter of the firewall would no longer exist. In fact, according to the speaker, firewalls had become nothing more than yellow police tape around a crime scene.


I have to admit I don’t know any other lyrics from the song. I just remember it from the TV series Whale Wars. But, one thing’s for sure: the world has become a vampire when it comes to data in flight.

Security for transmitted data is a concern as old as the internet. Only the branding changes: it used to be data-in-motion and now it’s data-in-flight, which I suppose sounds more perilous. Even back in the days of ARPANET, the granddaddy of the internet, Bob Metcalfe, the inventor of Ethernet, famously predicted that the growth of connectivity would make security a serious concern. As Metcalfe’s law states, “the value of a telecommunications network is proportional to the square of the number of connected users of the system.” Any CISO who surveys the migration to the cloud will agree.
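Metcalfe's law in miniature (constant factor omitted): value, and with it the attack surface a CISO worries about, grows with the square of the user count, so doubling the users quadruples what is at stake.

```python
# Metcalfe's law sketch: network value is proportional to the square of
# the number of connected users (proportionality constant omitted).
def metcalfe_value(n_users: int) -> int:
    return n_users * n_users

# Doubling the user count quadruples the pairwise-connection value.
print(metcalfe_value(1_000))  # 1000000
print(metcalfe_value(2_000))  # 4000000
```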

In the traditional packet network, two standards have been the stalwart tools for data in flight.

Netscape came up with HTTPS in the early 90s as a means to protect the application layer. Originally developed for webpage protection, the technology has evolved from SSL implementations to the TLS standards for greater security. At last year’s Worldwide Developers Conference, Apple announced a requirement that all apps sold on its App Store use TLS 1.2.
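Enforcing that kind of TLS floor is straightforward in modern stacks; here is a minimal sketch using Python's standard ssl module (the client code around it is omitted):

```python
# Minimal sketch: build a client-side TLS context that refuses anything
# older than TLS 1.2, in line with the App Store requirement mentioned.
import ssl

ctx = ssl.create_default_context()            # secure defaults, cert checks on
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # reject SSL and early TLS
print(ctx.minimum_version)
```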

Born in the 90s, IPsec has become the go-to solution for many data-in-flight use cases and the cornerstone of VPN security. Designed from the beginning for the packet-switched TCP/IP environment, IPsec became the de facto standard for network encryption.

About 10 years ago, a Layer 2 equivalent of IPsec called MACsec was developed, offering stronger security and far better bandwidth utilization, with overhead of only about 10%. Its requirement to decrypt and re-encrypt at every network switch severely limited its use. The benefits, though, were significant enough that specialized hardware-based Layer 2 encryptors became popular with Fortune 500 companies and governments.

Whether the need is for cloud compute or storage, the cloud has a wide diversity of utility. Ultimately it is a virtual overlay to a technology or business infrastructure. From a network transport perspective, the cloud can offer significant benefits in agility and cost savings. The trade-off is that now many touch points exist outside the traditional designs of firewalls or private networks.

How do traditional security techniques fit into this new technology? Starting at the top of the stack, HTTPS still has a substantial data protection role to play. Applications operate above the network and data link layers. Apps are often the interface to the cloud services. There are a few serious issues with HTTPS.


At the recent RSA conference in San Francisco, Dave Shackleford of the SANS Institute pointed out that security is now being designed into apps, yet software developers are not always going to be knowledgeable about building high-quality security. This seems to be borne out by the 2016 Shadow Data Report, which states that 95% of cloud applications are not SOC 2 compliant and a full 96% do not meet the General Data Protection Regulation (GDPR) guidelines. A typical company believes it has 30-40 cloud apps operating when in fact the average is 841.

IPsec has a bigger struggle in the cloud environment. The cloud offers a lot in savings when it comes to reducing CAPEX, but the trade-off is the OPEX component. Bandwidth for VPNs is expensive, and IPsec overhead can consume between 40% and 60% of it. Latency is another issue with IPsec, although some vendors have moved it into hardware implementations to speed up the processing.
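Taking the article's overhead figures at face value, the effect on a link is easy to compute (numbers illustrative, not a benchmark):

```python
# Back-of-envelope sketch of the overhead figures quoted in the article:
# usable throughput left on a link after protocol overhead is subtracted.
def effective_throughput(link_mbps: float, overhead_fraction: float) -> float:
    return link_mbps * (1.0 - overhead_fraction)

link = 100.0  # Mbps, illustrative link size
ipsec_worst = effective_throughput(link, 0.60)  # heavy IPsec overhead
ipsec_best  = effective_throughput(link, 0.40)  # lighter IPsec overhead
macsec      = effective_throughput(link, 0.10)  # ~10% MACsec overhead
print(ipsec_worst, ipsec_best, macsec)
```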

MACsec has evolved to fit nicely into the cloud. Improvements that allow end-to-end connectivity now address its limitations. With very low bandwidth requirements along with low latency, MACsec has become the best method for encrypting across a network. One innovative company, Senetas, has taken its hardware-based Layer 2 encryptor and virtualized it. Network function virtualization (NFV) can now make high-quality, low-cost encryption solutions possible.

How to drive a stake into the heart of the vampire? Encrypt as low in the protocol stack as possible. Take advantage of the new overlay network topology of NFV and use modern techniques to protect data. Encrypting at Layer 3 using IPsec is no longer necessary. A properly designed NFV implementation can take advantage of data center technologies like VXLAN to use MACsec for both Layer 2 and Layer 3 protection. And, by moving away from IPsec, the cost savings in VPN connections can make a significant contribution to upgrading to a virtualized network.
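To illustrate the layering argument with a toy model (byte-string stand-ins, not real wire formats or a packet library): VXLAN wraps the Layer 3 packet inside an outer Layer 2 frame, so encrypting that outer frame at Layer 2 protects the inner IP traffic as well.

```python
# Toy model only: header fields are stand-in byte strings, not real
# VXLAN/Ethernet wire formats. It shows why Layer 2 encryption of a
# VXLAN frame also covers the Layer 3 packet carried inside it.
def vxlan_encapsulate(inner_ip_packet: bytes, vni: int) -> bytes:
    """Wrap an inner IP packet in stand-in outer headers plus a VXLAN tag."""
    vxlan_header = b"VXLAN:" + vni.to_bytes(3, "big")
    return b"OUTER_ETH|OUTER_IP|UDP|" + vxlan_header + b"|" + inner_ip_packet

frame = vxlan_encapsulate(b"INNER_IP_PACKET", vni=42)
# Encrypting `frame` as a whole at Layer 2 (MACsec-style) hides the
# inner IP packet too, giving Layer 3 protection without IPsec.
print(frame)
```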