Use of public cloud continues to grow. In fact, 84% of businesses had placed additional workloads into the public cloud in 2018, according to a recent report by Dimension Research. Around a fifth of those (21%) reported that their increase in public cloud workloads was significant.
However, while respondents were almost unanimous (99%) in their belief that cloud visibility is vital to operational control, only 20% of respondents said they were able to access the data they need to monitor public clouds accurately.
“If there’s any part of your business – including your network – which you can’t see, then you can’t determine how it’s performing or if it is exposing your business to risks such as poor user experience or security compromise,” points out Scott Register, vice president of product management at Ixia, which commissioned the report.
This sounds like a major issue and yet surprisingly, it’s nothing new. Tony Lock, distinguished analyst and director of engagement at Freeform Dynamics, has been reporting on visibility issues for over five years, and not just regarding public cloud.
“Believe it or not, despite people having had IT monitoring technology almost since IT began, we still don’t have good visibility in a lot of systems,” he tells us. “Now we’re getting so much more data thrown at us, visibility is even more of a challenge – just trying to work out what’s important through all of the noise.”
He adds that for many years public cloud providers have been slow to improve their services and make it easier for organisations to see what’s happening, largely because the providers handled it all on customers’ behalf.
“To a degree, you can understand why [providers] didn’t focus on monitoring to begin with, as they’ve got their own internal monitoring systems and they were looking after everything. But if a customer is going to use them for years and years then they want to see what’s in there, how it’s being used and if it’s secure.”
The cost of zero visibility
A lack of visibility in the public cloud is a business risk in terms of security, compliance and governance, but it can also affect business costs. For example, companies may be unaware that they’re paying for idle virtual machines unnecessarily.
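As an illustration, a short script against the provider's own metrics API can surface exactly this kind of waste. The sketch below is a minimal example assuming AWS, the boto3 SDK and an arbitrary 5% CPU threshold for what counts as idle; other clouds expose equivalent metrics, so treat it as a pattern rather than a finished audit tool.

```python
# Minimal sketch: flag EC2 instances whose average CPU has been near-idle
# for the past week. Assumes AWS and the boto3 SDK; the 5% threshold and
# 7-day window are illustrative, not recommendations.
import datetime
import boto3

ec2 = boto3.client("ec2")
cloudwatch = boto3.client("cloudwatch")

now = datetime.datetime.utcnow()
week_ago = now - datetime.timedelta(days=7)

reservations = ec2.describe_instances(
    Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
)["Reservations"]

for reservation in reservations:
    for instance in reservation["Instances"]:
        instance_id = instance["InstanceId"]
        stats = cloudwatch.get_metric_statistics(
            Namespace="AWS/EC2",
            MetricName="CPUUtilization",
            Dimensions=[{"Name": "InstanceId", "Value": instance_id}],
            StartTime=week_ago,
            EndTime=now,
            Period=3600,          # hourly datapoints
            Statistics=["Average"],
        )
        datapoints = stats["Datapoints"]
        if datapoints:
            avg_cpu = sum(p["Average"] for p in datapoints) / len(datapoints)
            if avg_cpu < 5.0:     # arbitrary "idle" threshold
                print(f"{instance_id}: avg CPU {avg_cpu:.1f}% over 7 days, candidate for shutdown")
```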
Then there’s performance. Almost half of those who responded to Ixia’s survey stated that a lack of visibility has led to application performance issues. These blind spots hide the clues needed to identify the root cause of a performance problem, and can also lead to misdirected fixes.
Another issue relates to legal requirements and data protection. With a lack of visibility, some businesses may not be aware that they have customer information in the public cloud, which is a problem when “the local regulations and laws state it should not be stored outside of a company’s domain”, highlights Lock.
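A basic residency check against the provider's storage API can catch this sort of problem early. The sketch below is an illustrative example assuming AWS S3 and boto3, with a hypothetical allow-list of permitted regions; other platforms offer comparable location lookups.

```python
# Minimal sketch: list where object-storage buckets actually live, to check
# against data-residency rules. Assumes AWS S3 and boto3; the allowed-region
# set is a hypothetical compliance policy.
import boto3

ALLOWED_REGIONS = {"eu-west-1", "eu-west-2"}  # hypothetical policy

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    location = s3.get_bucket_location(Bucket=name)["LocationConstraint"]
    region = location or "us-east-1"  # AWS returns None for us-east-1
    flag = "" if region in ALLOWED_REGIONS else "  <-- outside permitted regions"
    print(f"{name}: {region}{flag}")
```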
Then there are the complexities around data protection and where the liability sits should a data breach occur.
“Often a daisy chain of different companies is involved in cloud storage, with standard terms and conditions of business, which exclude liability,” explains BCS Internet Specialist Group committee member Peter Lewinton. “This can leave the organisation that collected the data liable for the failure of a key supplier somewhere in the chain – sometimes without understanding that this is the position. This applies to all forms of cloud storage, but there’s less control with the public cloud.”
Understandably, security continues to be a big concern for enterprises. The majority (87%) of those questioned by Ixia said they’re concerned that their lack of visibility obscures security threats, but it’s also worth noting that general security concerns regarding public cloud still apply.
What’s the solution?
Lock believes that things are changing and vendors are beginning to listen to the concerns of customers. Vendors have started to make more APIs available and several third-party vendors are also creating software that can run inside virtualised environments to feed back more information to customers. “This move is partly down to customer pressure and partly down to market maturity,” he notes.
Ixia’s Scott Register recommends a physical or virtual network tap, which mirrors traffic from a network segment or physical interface to a downstream device for monitoring.
“These taps are often interconnected with specialised monitoring gear such as network packet brokers, which can provide advanced processing, such as aggregation, decryption, filtering and granular access controls. Once the relevant packets are available, hundreds of vendors offer specialised tools that use the packet data for application or network performance monitoring as well as security analytics.”
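To make that concrete, the sketch below shows the kind of raw feed such tools work from: it reads packets from a mirrored interface and totals traffic per destination host. It assumes Linux, root privileges and the open-source scapy library, with eth1 standing in for a hypothetical tap or SPAN port; packet brokers and commercial analytics do far more, but the principle is the same.

```python
# Minimal sketch: consume traffic from a mirrored (tap/SPAN) interface and
# count bytes per destination host, the kind of raw packet feed a broker
# would hand to a monitoring tool. Assumes Linux, root privileges and scapy;
# "eth1" is a hypothetical mirror port.
from collections import Counter
from scapy.all import sniff, IP

bytes_per_dest = Counter()

def account(pkt):
    # Accumulate packet sizes per destination IP address.
    if IP in pkt:
        bytes_per_dest[pkt[IP].dst] += len(pkt)

# Capture 1,000 packets from the mirrored interface, then report the top talkers.
sniff(iface="eth1", prn=account, store=False, count=1000)

for dst, total in bytes_per_dest.most_common(10):
    print(f"{dst}: {total} bytes")
```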
Are vendors really to blame?
Although many businesses suffer from poor public cloud visibility, Owain Williams, technical lead at Vouchercloud, believes customers are too quick to blame the provider. He argues that there are many reliable vendors already providing the necessary access tools and that a lack of visibility is often down to the customer.
“This is my experience. As such, it’s often entirely solvable from within the business. The main providers already give you the tools you need. Businesses can log every single byte going in and out if they wish – new folders, permission changes, alerts; all the bells and whistles. If the tools themselves are inefficient, then businesses need to re-evaluate their cloud provider.”
Instead, he believes that many of the visibility problems businesses encounter can be traced back to those managing the infrastructure – employees who may be in need of extra training and support.
“Better education for people – those charged with provisioning the infrastructure – is a strong first port of call,” he argues. “It’s about ensuring businesses and individuals have the right training and experience to make the most of their public cloud service. The tools already exist to ensure visibility is as robust as possible – they’re provided by these large public cloud organisations. Invariably, it’s a case of properly identifying and utilising these tools.”
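As an illustration of the kind of native tooling Williams points to, the sketch below queries a provider's audit log for recent permission changes. It assumes AWS CloudTrail and boto3, and the event-name filter is purely illustrative; Azure and Google Cloud expose comparable activity logs.

```python
# Minimal sketch: query the provider's audit log for recent permission
# changes. Assumes AWS CloudTrail and boto3; the "PutBucketAcl" filter is
# just one illustrative example of an event worth watching.
import datetime
import boto3

cloudtrail = boto3.client("cloudtrail")
now = datetime.datetime.utcnow()

events = cloudtrail.lookup_events(
    LookupAttributes=[
        {"AttributeKey": "EventName", "AttributeValue": "PutBucketAcl"}
    ],
    StartTime=now - datetime.timedelta(days=1),
    EndTime=now,
)["Events"]

for event in events:
    # Each event records when it happened, what it was and who did it.
    print(event["EventTime"], event["EventName"], event.get("Username", "unknown"))
```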