AWS offloads Alexa processing to the cloud


Bobby Hellard

26 Nov, 2019

AWS has brought its voice control services to lower-powered devices by offloading the majority of the processing work to the cloud.

Alexa Voice Services (AVS) is already widely used with Amazon's Echo smart speakers and other devices that connect to a network or the internet, such as lightbulbs and TVs.

Adding voice controls was costly, as Alexa devices previously required at least 100 megabytes of on-device RAM and an Arm Cortex-A class microprocessor to have enough processing power to handle voice commands.

That’s no longer the case, as the tech giant will use its cloud to handle most of the processing requirements with Alexa Voice Services for IoT, reducing the cost of voice control by up to 50%. The baseline requirement has now been reduced to 1MB of RAM and an Arm Cortex-M class microcontroller.

The move means that retrieving, buffering and decoding will also be offloaded to the cloud. As such, everything from light switches to thermostats can now be controlled entirely by voice with AVS for IoT.

“We now offload the vast majority of all of this to the cloud,” AWS IoT VP Dirk Didascalou told TechCrunch. “So the device can be ultra dumb. The only thing that the device still needs to do is wake word detection. That still needs to be covered on the device.

“It just opens up what we call the real ambient intelligence and ambient computing space,” he said. “Because now you don’t need to identify where’s my hub – you just speak to your environment and your environment can interact with you. I think that’s a massive step towards this ambient intelligence via Alexa.”
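The split Didascalou describes can be sketched in a few lines: the device runs nothing but wake word detection, and only streams audio upstream once that fires. The function names below are illustrative stand-ins, not the real AVS for IoT SDK:

```python
# Minimal sketch of the "ultra dumb" device loop described above.
# detect_wake_word() and the upload list are hypothetical placeholders;
# a real device would run a tiny keyword-spotting model and stream to AVS.

def detect_wake_word(audio_frame: bytes) -> bool:
    """On-device check -- the only processing the device still performs."""
    return audio_frame.startswith(b"alexa")  # placeholder heuristic

def handle_audio(frames):
    """Forward audio upstream only after the wake word fires; retrieval,
    buffering and decoding all happen in the cloud, not on the device."""
    uploaded = []
    triggered = False
    for frame in frames:
        if not triggered and detect_wake_word(frame):
            triggered = True
        if triggered:
            uploaded.append(frame)  # stand-in for streaming to the cloud
    return uploaded
```

The device-side footprint is just the wake word check, which is what makes a 1MB microcontroller viable.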

The cloud giant also made a number of IoT announcements aimed at simplifying IoT services for companies deploying large fleets of devices. It announced new features for AWS IoT Greengrass, for example, including support for Docker containers. Greengrass extends AWS functions to connected devices, allowing businesses to perform data collection and analysis at the edge, and the Docker support makes it easier to move compute workloads to and from the edge.
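To illustrate the kind of edge analysis described above, here is a hypothetical sketch, not the Greengrass API itself: a containerized workload might pre-filter sensor readings locally and forward only notable values to the cloud, cutting upstream traffic.

```python
# Illustrative edge-analysis sketch (names are assumptions, not AWS APIs):
# a Docker workload running on a Greengrass device could filter locally
# and send only readings that cross a threshold upstream.

def filter_at_edge(readings, threshold):
    """Keep only readings above the threshold, mimicking the local
    data collection and analysis an edge container could perform."""
    return [r for r in readings if r > threshold]
```

The design point is that only the filtered list ever leaves the device; the bulk of the raw data stays at the edge.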