Google demos real-time speech-to-text translation


Nicole Kobie

29 Jan, 2020

Google has demonstrated a real-time, AI-powered translation and transcription tool that will take lectures and other long-form speech in one language and output it as text in another. 

Part of Google Translate, the tool will let a smartphone act as an interpreter, listening to speech via the microphone and transcribing translated text in real time. So far, the system only supports a handful of languages, including English, French, German and Spanish. The demo showed English being translated into Spanish. 

“With this, your Android mobile phone will effectively turn into an almost real-time translator device for long-form speech,” Google said at the demonstration, according to reports, adding that it could “unlock continuous speech translations in this world at scale” in the longer term.

However, the transcription and translation won’t happen on your device. Instead, the audio recording will be uploaded to Google’s servers – so you’ll need a decent Wi-Fi connection or solid data package, as well as a willingness to share the audio with Google, to use this tool. 
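Reports didn’t detail the underlying API, but the architecture described – capture audio locally, send it to Google’s servers, receive translated text back – maps onto a simple streaming-upload loop. The sketch below is purely illustrative: the endpoint URL, parameters and response field are hypothetical and not Google’s actual service.

```python
# Minimal sketch of the client-side flow described above: capture audio
# locally, stream it to a cloud endpoint, and display translated text as it
# arrives. The endpoint URL and response format are hypothetical.
import requests

UPLOAD_URL = "https://example.com/translate-stream"  # hypothetical endpoint

def stream_translation(audio_chunks, source="en", target="es"):
    """Send successive chunks of raw audio and yield translated text segments."""
    for chunk in audio_chunks:
        resp = requests.post(
            UPLOAD_URL,
            params={"source": source, "target": target},
            data=chunk,  # raw audio bytes captured from the microphone
            headers={"Content-Type": "application/octet-stream"},
            timeout=30,
        )
        resp.raise_for_status()
        yield resp.json()["translated_text"]  # assumed response field

# Usage (with some microphone_chunks() generator supplying audio bytes):
# for text in stream_translation(microphone_chunks()):
#     print(text)
```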

Google didn’t say when the feature would arrive. 

The Google Translate tool was unveiled as part of a showcase of Google’s AI projects at its San Francisco office. The showcase also included a neural network that tracks and monitors whales via underwater microphones, a system now being used in Canada.  

“With this information, marine mammal managers can monitor and treat whales that are injured, sick or distressed,” said Julie Cattiau, product manager for Google AI, in a blog post. “In case of an oil spill, the detection system can allow experts to locate the animals and use specialised equipment to alter the direction of travel of the orcas to prevent exposure.” 

That machine-learning system was trained on 1,800 hours of underwater audio recordings that were labelled and supplied by Fisheries and Oceans Canada. 

The AI demonstrations come a week after Google CEO Sundar Pichai wrote a column urging sensible regulation of artificial intelligence to avoid misuse of the technology. 

Google developing all-in-one messaging app for business


Bobby Hellard

29 Jan, 2020

Google is reportedly gearing up to take on Slack and Microsoft Teams with its own business communications app.

The app will combine its raft of G Suite services, including Gmail, Google Drive, Hangouts Meet and Hangouts Chat, into a single mobile app, according to The Information, which cites two people who have used it.

A prototype of the app is currently being tested internally at Google, with the CEO of its cloud division, Thomas Kurian, reportedly sharing details about the project with cloud salespeople and business partners.

The tech giant announced a range of AI and communication-based features at its Cloud Next conference in November, including voice assistant capabilities for parts of G Suite, smart compose functions for Google Docs and a new video chat feature on Gmail.

More recently, it demonstrated a real-time translation feature for Android that will allow users to hear one language and read it in another. 

If the reports are true, the new comms app could place Google in the middle of Slack and Microsoft’s ongoing battle for business communications.

Microsoft Teams is currently the more popular service, with 13 million active users according to figures released in July 2019. However, the numbers are boosted by companies using Office 365, of which Teams is an integral part. It’s worth noting that Google’s rumoured new app is said to be part of its G Suite service and so could fulfil a similar role.

Slack is also hugely popular, particularly with startups, where it has 10 million daily users according to the latest figures. Despite falling behind Microsoft Teams, the company has called out the tech giant on a few occasions.

In November, Slack tweeted a video suggesting a Teams advert had copied one of its own and captioned the tweet “ok boomer”. The company’s CEO Stewart Butterfield has also criticised Microsoft for its “surprisingly unsportsmanlike” behaviour bundling Teams into Office 365.

Cisco WebEx will use voice tools to exploit ‘next frontier’ of data insights


Keumars Afifi-Sabet

29 Jan, 2020

Cisco’s flagship collaboration suite, WebEx Meetings, will introduce a voice assistant alongside transcription and translation tools to help customers mine insights from the voice data collected by the platform.

The added tools, powered by artificial intelligence (AI), will allow teams using the platform to streamline the entire meetings process, automating aspects such as minute-taking and the recording of actionable items.

In addition to an Alexa-like voice assistant, real-time transcription tools will combine with advanced analytics so businesses can derive insights from the voice data that’s collected internally during all meetings.

This technology, which Cisco adopted following its Voicea acquisition in September last year, generates a word cloud following each meeting, and automates clips and videos that can serve as packaged highlights. The system, moreover, can be trained to learn corporate taxonomy specific to each business.
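Cisco hasn’t published how its engine builds those word clouds, but the core idea – surfacing the most frequent meaningful terms in a transcript – can be sketched in a few lines. The snippet below is illustrative only: the stop-word list and tokenisation are deliberately simplistic, and this is not Cisco’s implementation.

```python
# Illustrative term-frequency pass over a meeting transcript, the kind of
# output that could feed a post-meeting word cloud. Not Cisco's implementation.
import re
from collections import Counter

STOP_WORDS = {"the", "a", "an", "and", "or", "to", "of", "in", "we", "is",
              "that", "it", "for", "on", "this"}

def top_terms(transcript: str, n: int = 20) -> list[tuple[str, int]]:
    """Return the n most common non-stop-word terms in a transcript."""
    words = re.findall(r"[a-z']+", transcript.lower())
    counts = Counter(w for w in words if w not in STOP_WORDS and len(w) > 2)
    return counts.most_common(n)

transcript = ("We agreed the Q3 roadmap and assigned action items "
              "for the pricing review and the pricing rollout.")
print(top_terms(transcript, n=5))
```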

“Voicea users have reported saving more than six hours per week per user with more actionable and efficient meetings, and we believe Webex users will experience similar results,” said VP and GM for team collaboration at Cisco, Sri Srinivasan.

“We’re excited to bring this and other cognitive features to the 300 million users we already serve with Cisco Collaboration. This technology will fundamentally change how we are able to deliver massively personalised experiences and transform the way we work.”

The company says the added automation, particularly around action points and highlights that can be distributed to attendees or absentees as reminders, will lead to more productive meetings overall. The engine can also determine information such as what a meeting was about and even its tone.

The addition of features like live closed-captioning, meanwhile, will allow people to tune into meetings remotely while they’re in busy or noisy environments. All users will also be able to read back through an automatically generated transcript to recap certain points, and even search it for particular information.

Cisco’s Webex assistant and voice tools feed into the company’s idea of “cognitive collaboration”, which essentially amounts to giving customers the right information, at the right time and in the right context. This is, as the company sees it, paramount to giving businesses the ability to gain insights from a new pool of data that’s been relatively untapped.

“We think of transcription and the capabilities around it … as the next frontier of information as a currency,” Srinivasan added. “It is the next data quorum – the largest data quorum – that we can create at Cisco Collaboration.

“And when you think about 300 million users across calling, across meetings, starting to bring this information forward, we are able to bring that information – that data – and convert that into information. So there’s a number of capability sets that come; for example, analytics at the end of the meeting.”

The firm also moved to quash any concerns from businesses over whether Cisco might itself gather their voice data, pointing out that users can set their own privacy controls and that all its services are GDPR-compliant. All data is also end-to-end encrypted, with customers able to manage the keys themselves or hand them to Cisco.

Avast expands opt-out after data-sharing investigation


Nicole Kobie

28 Jan, 2020

Avast has been caught up in yet another privacy scandal, with a joint investigation by PC Mag and Motherboard revealing the extent to which the security firm is collecting user browser histories and selling the data on to third parties. 

Last year, Avast browser extensions were spotted collecting browsing data to sell to advertising firms, prompting Chrome, Opera and Firefox to pull the add-ons from their marketplaces, though some have since returned.

Avast said at the time that it removed any identifying information from the browsing history. The PC Mag and Motherboard investigation suggested it’s possible to re-identify that data once it’s in the hands of marketers. 

The investigation revealed that Avast sells the collected data via its Jumpshot division to third parties such as marketing companies. The browsing history being collected includes every click, keyword search and URL entered, harvested not only from browser extensions but also from users of Avast’s free antivirus software. 

The collected data is “de-identified” by stripping out personal details, and tagged with an identifying code. However, research casts doubt on whether any large sample of user data can be truly anonymised. Jumpshot’s data does not directly identify any specific individual, but when it is combined with other data, it’s simple to see who is clicking what, the investigation claims. 

For example, if a data harvesting company or marketer bought data from Avast and also from a website you’re logged into (Amazon, for example), the information provided would make it possible to link the Avast data to your Amazon account, thereby revealing your identity and tying it to your entire browsing history. The data seen by the investigators includes searches, GPS coordinates on maps, visits to social media accounts, and even what video was watched on a porn site. 
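To make that re-identification mechanism concrete, here is a hedged sketch with entirely made-up data: a “de-identified” clickstream still carries a device ID and precise timestamps, and joining it to a retailer’s own order log on those timestamps ties the device ID, and with it the whole browsing history, to a named account.

```python
# Illustrative sketch of the re-identification risk described above, using
# made-up data. Matching the click timestamp against the retailer's own order
# log links the "anonymous" device ID to a named account; every other row for
# that device ID is then attributable to the same person.
import pandas as pd

clickstream = pd.DataFrame({              # sold by the data broker
    "device_id": ["abc123", "abc123"],
    "timestamp": pd.to_datetime(["2020-01-10 14:02:11", "2020-01-10 18:40:03"]),
    "url": ["amazon.com/order/confirm", "example-news.com/article"],
})

retailer_orders = pd.DataFrame({          # retailer's own logged-in data
    "account": ["jane.doe@example.com"],
    "order_time": pd.to_datetime(["2020-01-10 14:02:11"]),
})

linked = pd.merge_asof(
    clickstream.sort_values("timestamp"),
    retailer_orders.sort_values("order_time"),
    left_on="timestamp", right_on="order_time",
    tolerance=pd.Timedelta("5s"), direction="nearest",
)
print(linked[["device_id", "account", "url"]])
```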

The investigation showed Jumpshot was selling that data to companies that aggregate such information, with customers buying access to that “all clicks feed” for millions of dollars. 

Avast stopped sharing such data collected via extensions after the revelations last year, and in July 2019 started asking users for permission before sharing their browsing data with Jumpshot. It will now also ask all existing users of its free antivirus to opt-in to data sharing in February. 

An Avast spokesperson said the company stopped sharing browser extension data with Jumpshot in December, only using collected information for core security tasks.

“We ensure that Jumpshot does not acquire personal identification information, including name, email address or contact details,” the spokesperson added.

Avast also noted that users have always had the ability to opt out of such data sharing: “As of July 2019, we had already begun implementing an explicit opt-in choice for all new downloads of our AV, and we are now also prompting our existing free users to make an opt-in or opt-out choice, a process which will be completed in February 2020.”

The spokesperson added: “We have a long track record of protecting users’ devices and data against malware, and we understand and take seriously the responsibility to balance user privacy with the necessary use of data for our core security products.”

This isn’t the first data privacy scandal to hit Avast: in 2018, the company pulled an update to its CCleaner tool over data collection concerns.

What can companies learn from object storage pioneers?


Lindsay Clark

28 Jan, 2020

The shift to the cloud is encouraging enterprises to rethink their options on storage. According to a June 2019 study from IHS Markit, 56% of organisations said they plan to increase investment in object storage, putting it ahead of unified storage at 51%, storage-area networks at 48% and network-attached storage at 36%. Most object storage is in the cloud, with popular examples including AWS S3, Azure Blob Storage and Google Cloud Platform (GCP) Cloud Storage.

But shifting to a new storage architecture at the same time as the cloud move is not entirely painless.

At the beginning of the decade, Moneysupermarket.com, the consumer online comparison and information site for financial services, was using a combination of SQL databases and a SAS analytics environment. By 2014, it had moved to AWS for website hosting and data analytics, including the use of S3 object storage and the Vertica data warehouse. In May 2019, it moved its data and analytics to GCP, using the BigQuery data warehouse and Cloud Storage object storage. The website itself remains on AWS.

Harvinder Atwal, Chief Data Officer at MoneySuperMarket, tells Cloud Pro: “One of the good things about the cloud is the initial learning curve is very shallow: it’s easy to start. But then you get to the point where it’s very much steeper and you need to understand some of the complexities involved.”

One example of those complexities is the introduction of object lifecycle policies. The idea is to define policies that manage objects for as long as the organisation needs them; that might mean moving them to cheap long-term storage such as AWS Glacier, or expiring them altogether. Getting these rules right from the outset can save costs.

“That’s one of the things that maybe we should put a little more effort into from the very beginning,” Atwal says.
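For illustration, a lifecycle rule of the kind Atwal describes can be set with a single API call. The boto3 sketch below transitions objects to Glacier after 90 days and expires them after a year; the bucket name, prefix and thresholds are hypothetical.

```python
# A minimal sketch of an S3 object lifecycle rule: archive raw data to Glacier
# after 90 days, then expire it after a year. Bucket and prefix are made up.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="example-analytics-bucket",   # hypothetical bucket
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-then-expire",
                "Status": "Enabled",
                "Filter": {"Prefix": "raw/"},
                "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
                "Expiration": {"Days": 365},
            }
        ]
    },
)
```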

Other advice for those moving to object storage in the cloud includes avoiding biting off more than the team can chew.

“I would not do the migration all in one go,” Atwal says. “I think the bigger project and the more money and resources it uses, the more likely it is to fail. I would encourage people to think of their use case and application and build a minimal viable product around that.”

It’s worth getting advice about the transition from independent third parties, which the cloud platform vendors can recommend. Moneysupermarket.com, for example, used a consultancy called DataTonic for its transition to Google Cloud Platform.

Lastly, there can be a cultural change in store for the IT department, Atwal says. “The IT function can be very traditional in its thinking around how you use data. They think you must cleanse it, put it into a relational schema and only then can users access it. But with data today, the value in analytics comes from being able to use data from many sources and join them together, and IT has to learn to ditch its historic mindsets.”

Nasdaq, the tech-focused stock market, began working with AWS in 2012. It stores market, trade and risk data on the platform using S3 and Glacier. It uploads raw data to Amazon S3 throughout the trading day, while a separate system running in the cloud converts the raw data into Parquet files and places them in their final S3 location. This way, the system is able to scale elastically to meet the demands of market fluctuations. It also uses Amazon Redshift Spectrum to query data to support billing and reporting, and Presto and Spark on Elastic MapReduce (EMR) or Athena for analytics and research.
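Nasdaq hasn’t published the code for that conversion step, but the general pattern – batch raw records into Parquet and land them in a partitioned S3 location – looks roughly like the hedged sketch below. The bucket, key layout and schema are made up, not Nasdaq’s actual pipeline.

```python
# Illustrative convert-to-Parquet-and-land-in-S3 step. Bucket, prefix and
# schema are hypothetical; requires pyarrow (or fastparquet) for to_parquet.
import boto3
import pandas as pd

def land_trades(raw_records: list[dict], trade_date: str) -> None:
    """Convert a batch of raw trade records to Parquet and upload to S3."""
    df = pd.DataFrame(raw_records)
    local_path = f"/tmp/trades_{trade_date}.parquet"
    df.to_parquet(local_path, index=False)

    s3 = boto3.client("s3")
    s3.upload_file(
        local_path,
        "example-market-data",                       # hypothetical bucket
        f"trades/date={trade_date}/trades.parquet",  # date-partitioned layout
    )

# land_trades([{"symbol": "ABC", "price": 101.5, "qty": 200}], "2020-01-28")
```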

“Migrating to Amazon S3 as the ‘source of truth’ means we’re able to scale data ingest as needed as well as scale the read side using separate query clusters for transparent billing to internal business units,” says Nate Sammons, assistant vice president and lead cloud architect at Nasdaq.

But getting the scale of analytics solutions right for the problem has been a challenge, he says. “We currently operate one of the largest Redshift clusters anywhere, but it’s soon to be retired in favour of smaller purpose-specific clusters. Some of the custom technologies we developed [in the early days] have since been retired as cloud services have matured. Had technologies like Amazon Redshift Spectrum existed when we started, we would have gone straight to Amazon S3 to start with, but that was not an option.”

The advantage of using S3, though, was that it made the organisation less concerned about individual machine outages or data centre failures, Sammons says. “If one of the Amazon Redshift Spectrum query clusters fails, we can just start another one in its place without losing data. We don’t have to do any cluster re-sizing and we don’t require any CPU activity on the query clusters to do data ingest.”

Rahul Gupta, IT transformation expert at PA Consulting, says those exploiting object storage in the cloud should know that its apparent scalability and elasticity do not remove the need to do some basic housekeeping on data.

“A lot of people feel storage is cheap, so they build systems with vast amounts of data and think the impact on cost is not that great. They push the data into S3, or an equivalent, and then once it’s in there, they feel that they can impose structure on the data, which is not the right thing to do,” he says.

He says that by understanding data structure upfront and creating governance such as role-based access, organisations will not have to revisit the architecture once the data grows.
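As one hedged example of that kind of upfront governance, an S3 bucket policy can restrict a named analytics role to read-only access on a curated prefix. The account ID, role ARN and bucket name below are hypothetical.

```python
# Illustrative role-based access rule on an object store: only the named
# analytics role may read objects under the "curated/" prefix. All identifiers
# here are hypothetical.
import json
import boto3

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AnalystsReadCuratedOnly",
            "Effect": "Allow",
            "Principal": {"AWS": "arn:aws:iam::123456789012:role/analytics-readers"},
            "Action": ["s3:GetObject"],
            "Resource": "arn:aws:s3:::example-data-lake/curated/*",
        }
    ],
}

boto3.client("s3").put_bucket_policy(
    Bucket="example-data-lake",
    Policy=json.dumps(policy),
)
```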

Just because so many organisations are moving storage to the cloud does not mean they will all get the same value from the transition. The considerable investment in cloud infrastructure, storage and analytics applications will offer the greatest returns to those who understand the storage lifecycle upfront, create governance rules around access and understand their data structure from the outset.

More sensitive data moves to the enterprise cloud – but the security risk widens with it

Enterprises continue to feed their clouds with increasingly sensitive information, yet according to McAfee’s latest report the security issues are building alongside this trend.

The study, titled ‘Enterprise Supernova: The Data Dispersion Cloud Adoption and Risk Report’, polled 1,000 enterprises across 11 countries, as well as logging anonymous data from 30 million enterprise cloud users.

More than a quarter (26%) of the files analysed in the cloud now contain sensitive data, a rise of 23% year on year. Security, however, has yet to catch up: 91% of cloud services analysed do not encrypt data at rest, while one in five respondents said they lacked visibility into what data resides in their cloud applications.
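For services an enterprise controls directly, such as its own object storage, encryption at rest is often a single configuration call; SaaS services outside the customer’s control are a different matter. The boto3 sketch below enables default server-side encryption on a hypothetical S3 bucket.

```python
# Illustrative fix for the encryption-at-rest gap on storage the enterprise
# controls: turn on default server-side encryption (KMS-managed keys) for a
# hypothetical S3 bucket.
import boto3

boto3.client("s3").put_bucket_encryption(
    Bucket="example-enterprise-bucket",
    ServerSideEncryptionConfiguration={
        "Rules": [
            {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "aws:kms"}}
        ]
    },
)
```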

Enterprises are adopting measures such as data loss prevention (DLP); indeed, on average the companies polled saw more than 45,000 incidents per month. Yet only just over a third (37%) say they are actually using DLP. Almost four in five (79%) of those polled said they allowed access to enterprise-approved cloud services from personal devices, and a quarter of companies admitted they had sensitive data downloaded from the cloud to an unmanaged personal device.

This is the situation at many organisations and is, not to put too fine a point on it, a mess. “Security and risk management professionals are left with a patchwork of controls at the device, network, and cloud – with significant gaps in visibility to their data,” the report noted. “Living with these gaps and the patchwork of security born out of the network is an open invitation to breach attempts and non-compliance.” 

The vast majority (93%) of CISOs surveyed agree that it is their responsibility to secure data in the cloud. Three in 10 respondents, however, admit they lack staff with the skills to secure their SaaS applications, a figure up 33% on the year before, with the report noting that technology and training continue to be outpaced by the cloud’s aggressive enterprise growth.

“The force of the cloud is unstoppable, and the dispersion of data creates new opportunities for both growth and risk,” said Rajiv Gupta, senior vice president for cloud security at McAfee. “Security that is data-centric, creating a spectrum of controls from the device, through the web, into the cloud, and within the cloud provides the opportunity to break the paradigm of yesterday’s network-centric protection that is not sufficient for today’s cloud-first needs.”

McAfee’s recent reports have been a mix of the gloom-laden and optimistic. In November, the company noted how 40% of large UK businesses expected to be cloud-only by 2021, but pointed to gaps in security and responsibility between the haves and have-nots. In June, the company’s Cloud and Risk Adoption Report, again based on the responses of 1,000 enterprises, found organisations with cloud access security brokers (CASBs) were over 35% more likely to launch new products and gain quicker time to market.

The company had three recommendations based on its current report findings:

  • Evaluate your data protection strategy for devices and the cloud: Consider the difference between a disparate set of technologies at each control point and the advantages of merging them for a single set of policies, workflows, and results
  • Investigate the breadth and risk of shadow IT: Determine your scope of cloud use, with a focus on high-risk services; then move to enabling your approved services and restricting access to those which might put data at risk
  • Plan for the future of unified security for your data: Context about devices improves security of data in the cloud, and context about the risk of cloud services improves access policy through the web. Many more efficiencies apply, while some are yet to be discovered. These control points are merging to deliver the future of data security


NHS shifts two national services to the cloud


Nicole Kobie

28 Jan, 2020

The NHS has migrated two of its national services to the cloud, hoping to cut costs while improving security and efficiency of services.

The NHS e-Referral Service (e-RS) and NHS 111 Directory of Services (DoS) are the first major NHS systems to make the migration under the government’s cloud-first policy, with both using AWS.

The aim is to cut costs at both services, but there are other benefits, says Neil Bennett, director of services at NHS Digital. “Costs are lowered, reducing pressure on the public purse, and there is better security and reliability, as well as greater flexibility, performance, scalability and availability, to name a few.” 

The e-RS is a booking service that handles 18 million referrals annually, letting patients from more than 1,100 GP practices choose clinics, hospitals, and appointment dates and times. The system is now enabled for booking and managing such appointments over the internet, but that option won’t be available to patients until later this year, when NHS Identity takes over authentication, the NHS said in a statement.

The DoS helps connect patients to the appropriate service for their health concerns, helping relieve pressure on urgent and emergency care. It handles 16 million searches annually.

Migrating such important services without disrupting patient care was key, explained Bennett: “This was a tremendous collaborative effort across many different teams here and with external partners, to migrate such large systems with a minimum of disruption to users, in a reasonably short timescale.” 

Alongside cutting costs and improving services for patients, the migration to cloud is a key part of the NHS’ sustainability strategy, said Ben Tongue, sustainability manager at NHS Digital.

“Large cloud operators like AWS provide significant energy and carbon savings against enterprise and legacy systems,” he said. “We are working with AWS to achieve full transparency on the energy use and carbon impact of the contract, so that we can continue to focus on ensuring that our storage systems are as energy efficient as possible, reducing carbon emissions and minimising environmental impact.”

Last year, the NHS unveiled a cloud framework to simplify procurement, hoping to help migrate more services as part of the government’s cloud-first policy. Patient records stored by EMIS are already making the shift to AWS, while Barts Health NHS Trust is moving its IT estate to the cloud via Capgemini, but research suggests many NHS trusts remain wary of the cloud.

Data centre M&A broke the 100-deal barrier in 2019 – driven by private equity

2019 was a bumper year for data centre merger and acquisition deals, with the total number passing 100 for the first time, according to Synergy Research.

The record number has come about following a dramatic swing in private versus public deals, with private equity accounting for around 50% of deals and a 45% downturn for sales closed by public companies.

The number of billion-dollar deals declined again in 2019, with 2017 remaining the benchmark for deal value due to three multi-billion-dollar transactions and a further three rated at over a billion dollars.

One deal which did not quite make the end-of-year cut was Digital Realty’s proposed blockbuster acquisition of Interxion for $8.4 billion. The biggest data centre deal of all time is expected to close later this year, though no firmer schedule has been given. Aside from that, Synergy noted that Digital Realty and Equinix, the two largest colocation providers by market share, have been ‘by far’ the largest investors over the past five years.

The significant rise in private equity deals can be seen as evidence of the importance of prime data centre space, according to John Dinsdale, a chief analyst at Synergy Research. “The aggressive growth of cloud services and outsourcing trends more generally are fuelling a drive for scale and geographic reach among data centre operators, which in turn is stimulating data centre M&A activities,” said Dinsdale. “This has been attracting an ever-increasing level of private equity activity as investors seek to benefit from high-value and strategically important data centre assets.

“It is also notable that even the biggest publicly traded data centre operators are increasingly turning to joint ventures with external investors to help fund growth and protect balance sheets,” Dinsdale added.

The private deals, although generally attracting fewer headlines, have made up 57% of deal volume since 2015, according to Synergy’s figures. Those in the past year included AMP Capital’s acquisition of US data centre firm Expedient in October, and Shagang Group’s purchase of the remaining 24% stake in Global Switch for £1.8 billion in September.


CloudKnox raises $12 million in funding to further continuous cloud security mission

CloudKnox, a provider of identity authorisation for hybrid and multi-cloud environments, has secured $12 million (£9.17m) in a funding round to accelerate product and go-to-market plans.

The company has a cloud security offering based around continuous decision-making, monitoring, adapting and responding to identity and access management (IAM) risks in real time. Its intriguingly named Privilege Creep Index (PCI), part of the product’s dashboard, helps organisations assess and improve their risk posture.

CloudKnox has an established partnership with Amazon Web Services (AWS), as an advanced technology partner, as well as with VMware. The company announced in August the launch of its cloud security platform for the hybrid VMware Cloud on AWS offering.

This makes for interesting reading when compared with recent usage research: according to a study from AllCloud – another primarily AWS-centric partner – almost three quarters of the enterprise private workloads analysed across 150 IT decision makers were running on VMware, with the trend set to increase.

“We’ve seen exceptional growth from customers and prospects looking to address the number one risk in their cloud infrastructure,” said Balaji Parimi, CloudKnox CEO and founder, in a statement. “This positioned us to pre-emptively secure another round of funding to leverage strong market adoption and accelerate our customer expansion.”

Among those joining CloudKnox’s board are Stephen Ward, CISO at The Home Depot, and Suresh Batchu, co-founder and CTO at enterprise mobility management (EMM) provider MobileIron. Ward noted that CloudKnox had a ‘compelling’ vision around continuous detection and proactive measurement for cloud security.

The round was led by Sorenson Ventures with participation from various early investors, including ClearSky Security, Dell Technologies Capital and Foundation Capital.

Elsewhere, Sysdig, a provider of secure operations for DevOps environments, has raised $70 million in series E funding. The company counts Goldman Sachs among its customers – VP of merchant banking Soumya Rajamani sits on Sysdig’s board – and aims to help enterprises remove doubt over their Kubernetes deployments.


Managing the cloud money pit


Lindsay Clark

24 Jan, 2020

Despite the growing popularity of cloud computing, organisations still struggle to control costs. Research from Flexera, a provider of IT management software, shows 84% of enterprises find optimising cloud costs a growing challenge. The 2019 survey of 786 technology professionals also found organisations underestimate their wastage, putting it at 27% of cloud spending whereas the real figure — according to Flexera — is around 35%.

Part of the problem is the shift from the old world to the new, as organisations lift more of their applications and infrastructure into the cloud, according to Adrian Bradley, cloud advisory lead at KPMG.

“With on-premise contracting, you get a set of unit prices and service levels, you negotiate with a provider, and then that will be a one-off procurement exercise that lasts three to five years. You set the value within that initial negotiation, and everyone can go home and leave it to relatively junior people to execute the contract because the unit prices have protection,” he tells Cloud Pro.

In the cloud, however, decision-making can often be handed down to junior developers and infrastructure managers, but pricing can be very dynamic with complex discount arrangements. “The consequence is that the cost of cloud has actually been more than expected,” Bradley explains.

The challenge of controlling costs comes in two parts. Firstly, can organisations exploit the standard price structures of the big cloud vendors to help them get better value for money? And secondly, can organisations try to negotiate their own ‘special’ deals from cloud vendors, and get better value than the standard price structures offer?

On the first question, KPMG’s Bradley says mature organisations are finding ways to get more bang for their buck.

In AWS, for example, pricing is mostly structured around reserved instances, where users commit to a certain level of compute for a certain period, with incremental discounts in line with how much they commit.

“Mature users on cloud have become quite sophisticated in planning effectively around that,” he says.

Organisations can also get better deals depending on when they reserve computing power. “There’s an element of arbitrage, because it’s a bit like looking for holidays. If you book early, you get a good deal.”

But, in a similar vein, there are bargains to be had in last-minute deals in cloud computing spot markets via companies such as Spotinst and Cloudability, Bradley says.

“And just like a holiday, if you book late and you’re unfussy about where you go, then you can also get a great deal. That’s not something that’s part of the initial negotiation, and what you have to work out is how you can best make use of the economic models the cloud providers have created,” he says.
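To put rough numbers on that analogy, the back-of-the-envelope sketch below compares annual costs for on-demand, reserved and spot capacity using made-up hourly rates; real rates vary by provider, instance type, region and commitment term.

```python
# Back-of-the-envelope comparison of the pricing tiers discussed above, with
# hypothetical hourly rates for a single always-on instance.
HOURS_PER_YEAR = 24 * 365

on_demand_rate = 0.10   # $/hour, pay-as-you-go (hypothetical)
reserved_rate = 0.06    # $/hour with a 1-year commitment (hypothetical)
spot_rate = 0.03        # $/hour, interruptible spare capacity (hypothetical)

for label, rate in [("on-demand", on_demand_rate),
                    ("reserved", reserved_rate),
                    ("spot", spot_rate)]:
    annual = rate * HOURS_PER_YEAR
    saving = 1 - rate / on_demand_rate
    print(f"{label:>10}: ${annual:,.0f}/year ({saving:.0%} saving vs on-demand)")
```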

The second question addresses whether organisations can negotiate away from the standard price structure. It’s possible, but only for the world’s largest corporations such as multinational consumer goods firms and banks, Bradley says.

“If you hit that threshold of scale and you’re talking about really substantial workloads, then you can have a specific negotiation. That does get you a little bit further,” he says.

To lower prices further still, large businesses can propose creative deals with cloud providers. For example, BP, the oil and energy company, has agreed to supply AWS with 170 MW of renewable energy as part of its cloud computing contract.

But to negotiate, organisations have to prepare. Their chances of success depend as much on the measures they put in place internally as they do on their approach to suppliers.

Mike Jette, industry lead for telecoms, media and technology at procurement outsourcing firm GEP, says: “In the early days, it was like the wild west. In a lot of organisations, tons of different people were buying cloud services in an uncoordinated fashion, trying to align with their strategic objectives with very little structured governance or procurement. It was just a lot of people trying to say, ‘hey look, I moved to the cloud’.”

How organisations manage their cloud consumption is half the challenge in getting more value, he says. “You need to be thoughtful on the buy side, but the management side is really important to maintaining costs and getting value out of the service providers.”

This means understanding how much the organisation is consuming, and how that might vary, he says. “To get leverage [with suppliers] you have to have management and controls in place. The early adopters have gone through this exercise and they’ve taken 30%-plus of the cost out.”

If they go to market with enough volume, there is always room to negotiate, he says. “You need to have a sense of what the estate looks like and where it’s going to grow to, but there’s definitely an opportunity to negotiate. The cloud service providers like to talk about their market share: they’re in the business of buying volume now,” he says.

To get the best deals from suppliers, organisations need to understand and predict the volumes they will require. It can be a thankless task and even goes against some of the advantages of cloud computing, says Matt Yonkovit, chief experience officer at Percona, an independent consulting company that helps move open source databases to the cloud.

Although organisations can create guardrails to try to guide developers to certain platform providers and solutions, many still want the freedom to choose. Meanwhile, the cloud providers offer so many services – as many as 180 in the case of AWS, each with a separate pricing structure – that estimates of demand are often inaccurate, he says.

While there are machine learning tools that can help, some organisations want to burst out workloads to support business demand: ecommerce companies supporting Christmas shopping, for example.

Just as important as forecasting demand is ensuring applications and databases are configured for the cloud environment to minimise consumption, he says. “People don’t understand the shared responsibility model, and that causes most of the extra spend. Understanding the technology and optimising systems can reduce costs.”

The big three cloud providers – which between them command more than half the market – may have the upper hand in negotiating with customers. But buyers are strengthening their position by better understanding and controlling their demand, exploiting spot markets, and configuring their technology more effectively. Excelling in these areas will build value for cloud buyers.