Pushing cloud AI closer to the edge


Keri Allan

12 Apr, 2018

Cloud-based AI services continue to grow in popularity, thanks to their low cost, ease of integration and potential to underpin complex services.

In the words of Daniel Hulme, senior research associate at UCL, "cloud-based solutions are cheaper, more flexible and more secure" than anything else on the market.

By 2020 it’s believed that as many as 60% of personal technology device vendors will be using third-party AI cloud services to enhance the features they offer in their products. However, we’re also likely to see a significant growth of cloud-based AI services in the business sector.

One of the biggest drivers of this has been the proliferation of virtual personal assistants (VPAs) in the consumer space, made popular by the development of smart speakers by the likes of Amazon and Google.

Users have quickly adopted the technology into their everyday lives, and businesses were quick to realise the potential locked away in these devices, particularly when it comes to delivering new products.

Drivers of cloud-based AI services

Amazon’s Alexa was the first personal assistant to achieve mass-market appeal

"It's a confluence of factors," says Philip Carnelley, AVP Enterprise Software Group at analyst firm IDC. "There is no doubt the consumer experience of using Alexa, Siri and Google Now has helped familiarise businesses with the power of AI.

"But there is also a lot of publicity around AI achievements, like DeepMind's game-winning efforts – AlphaGo beating the Go champion, for example – or Microsoft's breakthrough efforts in speech recognition."

He adds that improvements to the underlying platforms, such as the greater availability of infrastructure-as-a-service (IaaS) and new developments in graphics processing units (GPUs), are making the whole package more cost-effective.

Yet, it’s important to remember that despite there being so much activity in the sector, the technology is still in its infancy.

"AI is still very much a developing market," says Alan Priestley, research director for technology and service partners at Gartner. "We're in the very early stages. People are currently building and training AI models, or algorithms, to attempt to do what the human brain does, which is analyse natural content."

The likes of Google, Amazon and Facebook are leading this early development precisely because they have so much untapped data at their disposal, he adds.

The role of the cloud

Vendors have helped drive AI concepts thanks to open source code

The cloud has become an integral part of this development, primarily because of the vast computing resources at a company’s disposal.

"The hyper-scale vendors have all invested heavily in this and are building application programming interfaces (APIs) to enable themselves – and others – to use services in the cloud that leverage AI capabilities," says Priestley.

"By virtue of their huge amount of captive compute resource, data and software skill set, [these vendors have been] instrumental in turning some of the AI concepts into reality."

This includes the development of a host of open source tools the wider community uses today, including TensorFlow and MXNet, and these vendors' large-scale cloud services are frequently used to train AI models.
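To give a sense of what "training a model" means at its simplest, the sketch below fits a one-parameter linear model by gradient descent in plain Python. It is purely illustrative – the toy data and learning rate are invented for this example – but the principle is the same one frameworks like TensorFlow and MXNet apply at vastly greater scale on cloud hardware.

```python
# Illustrative sketch only: fit y ≈ w * x by gradient descent.
# The data points and learning rate below are hypothetical toy values.

data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2), (4.0, 8.1)]  # (x, y) pairs, roughly y = 2x

w = 0.0    # the model's single parameter, starting untrained
lr = 0.01  # learning rate

for epoch in range(500):
    # Gradient of the mean squared error with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    w -= lr * grad  # take one gradient descent step

print(round(w, 1))  # converges close to 2.0, the slope hidden in the data
```

Real training differs in scale, not in kind: millions of parameters instead of one, and GPU clusters instead of a for-loop, which is exactly why the hyper-scale vendors' cloud resources matter here.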

According to IDC, businesses are already seeing the value of deploying these cloud-based AI solutions. Although fewer than 10% of European companies use AI in operational systems today, three times as many are currently experimenting with, piloting or planning AI usage – whether that be to improve sales and marketing, planning and scheduling, or general efficiency.

Benefits to business

Chatbots were an early AI hit within many businesses

"Businesses are seeing early implementations that show how AI-driven solutions, like chatbots, can improve the customer experience and thereby grow businesses – so others want to follow suit," says Carnelley.

"Unsurprisingly, companies offering AI products and services are growing fast," he points out.

Indeed, chatbots were one of the earliest AI-powered features to break into the enterprise sphere, and interest looks set to continue.

According to a report published this month by IT company Spiceworks, within the next 12 months, 40% of large businesses expect to implement one or more intelligent assistants or AI chatbots on company-owned devices. They will be joined by 25% of mid-sized companies and 27% of small businesses.

However, organisations are also looking more widely at the many ways AI solutions could help them.

The insurance industry, in particular, is looking at how AI can be used to help predict credit scores and how a customer may respond to a premium.

"This is not just making a decision but interpreting the data," says Priestley. "A lot of this wasn't originally in digital form, but completed by hand. This has been scanned and stored but until recently it was impossible for computer systems to utilise this information. Now, with AI, technology can extract this data and use it to inform decisions."

Another example he highlights is the medical sector, which is deploying AI-powered systems to help improve the process of capturing and analysing patient data.

"At the moment, MRI and CT scans are interpreted by a human, but there's a lot of work underway to apply AI algorithms that improve the interpretation of these images and the resulting diagnosis," says Priestley.

Moving to the edge

Self-driving cars will need latency-free analytics

Given the sheer amount of computational power on hand, the development of AI services is taking place almost exclusively in the cloud. Looking forward, however, experts believe many of these services will, at least partially, move to the edge.

The latency associated with the cloud will soon become a problem, especially as more devices require intelligent services that are capable of analysing data and delivering information in real time.

"If I'm in a self-driving car it cannot wait to contact the cloud before making a decision on what to do," says Priestley. "A lot of inferencing will take place in the cloud, but an increasingly large amount of AI deployment will take place in edge devices.

"They'll still have a cloud connection, but the workload will be distributed between the two, with much of the initial work done at the edge. When the device itself can't make a decision, it will connect to the 'higher authority' – in the form of the cloud – to look at the information and help it make a decision."
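The split Priestley describes can be sketched as a simple confidence-based fallback: the device runs a small local model first and only defers to the cloud when it isn't sure. Everything in the sketch below is a hypothetical stand-in – the `edge_classify` and `cloud_classify` functions and the 0.8 threshold are illustrative assumptions, not any vendor's actual API.

```python
# Hypothetical edge/cloud split: infer locally, defer to the cloud
# only when the local model's confidence falls below a threshold.

CONFIDENCE_THRESHOLD = 0.8  # illustrative cut-off, not a standard value

def edge_classify(image):
    """Stand-in for a compact on-device model: returns (label, confidence)."""
    # A real device would run a small neural network here.
    brightness = sum(image) / len(image)
    return ("day", brightness) if brightness >= 0.5 else ("night", 1 - brightness)

def cloud_classify(image):
    """Stand-in for a call to the 'higher authority' in the cloud."""
    # A real system would send the data over the network to a larger model.
    return "day" if sum(image) / len(image) >= 0.5 else "night"

def classify(image):
    label, confidence = edge_classify(image)
    if confidence >= CONFIDENCE_THRESHOLD:
        return label, "edge"               # decided locally, no network latency
    return cloud_classify(image), "cloud"  # uncertain: defer to the cloud

print(classify([0.9, 0.95, 0.9]))   # confident, so handled at the edge
print(classify([0.45, 0.55, 0.5]))  # borderline, so deferred to the cloud
```

The design choice is the one Priestley outlines: latency-sensitive decisions stay on the device, and the cloud connection is reserved for the cases the edge model cannot resolve on its own.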

Essentially, organisations will use the cloud for what it's good at – scale, model training, developing APIs and storing data. Yet it's clear that the era of cloud-only AI is coming to an end.

Image: Shutterstock