The race for artificial intelligence (AI) in the cloud is well and truly joined. AI and the cloud can be seen as two technologies that, in tandem, will power the next cycle of business. As VMware CEO Pat Gelsinger put it last year: cloud enables mobile connectivity, mobile creates more data, more data makes AI better, AI enables more edge use cases, and more edge means more cloud is needed to store the data and do the computing.
As this publication has frequently argued, for those at the sharp end of the cloud infrastructure market, AI, along with blockchain, quantum and edge to name three more, represents the next wave of cloud services and is where the new battle lines are being drawn. Yet there is a new paradigm afoot.
Qualcomm went into the fray last week with the launch of the Qualcomm Cloud AI 100. “Built from the ground up to meet the explosive demand for AI inference processing in the cloud, the Qualcomm Cloud AI 100 utilises the company’s heritage in advanced signal processing and power efficiency,” the press materials blared. “With this introduction, Qualcomm Technologies facilitates distributed intelligence from the cloud to the client edge and all points in between.”
While the last dozen or so words in that statement may have seemed like the key takeaway, it is the power efficiency angle that makes the most sense. Power efficiency is Qualcomm’s heritage when it comes to using its technology to power millions of smartphones, but the company has not had the same impact in the data centre. In December, the company announced it would lay off almost 270 staff, confirming it was ‘reducing investments’ in the data centre business.
Its competition in this field, chiefly Intel but also NVIDIA, is particularly strong. Yet Kevin Krewell, principal analyst at Tirias Research, told Light Reading last week that “to fit more easily into existing rack servers, new inference cards need to be low power and compact in size.” This, therefore, is where Qualcomm sees its opportunity.
With Cloud AI 100, Qualcomm promises more than 10 times the performance per watt of the industry’s most advanced AI inference solutions deployed today, with a chip ‘specifically designed for processing AI inference workloads.’
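To make the metric behind that claim concrete, the calculation below sketches how a performance-per-watt comparison works. All figures are hypothetical placeholders for illustration, not Qualcomm's published numbers.

```python
# Illustrative sketch of a "performance per watt" comparison.
# The throughput and power figures below are hypothetical, not vendor data.

def perf_per_watt(inferences_per_sec: float, watts: float) -> float:
    """Inference throughput normalised by power draw (inferences/sec/W)."""
    return inferences_per_sec / watts

# Hypothetical dedicated inference accelerator vs. hypothetical baseline chip
accelerator = perf_per_watt(inferences_per_sec=400.0, watts=20.0)  # 20.0 inf/s/W
baseline = perf_per_watt(inferences_per_sec=100.0, watts=50.0)     # 2.0 inf/s/W

print(f"Speed-up per watt: {accelerator / baseline:.1f}x")
```

The point of the metric is that a chip can win on efficiency without winning on raw throughput, which matters in power- and space-constrained racks.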
“Our all-new Qualcomm Cloud AI 100 accelerator will significantly raise the bar for the AI inference processing relative to any combination of CPUs, GPUs, and/or FPGAs used in today’s data centres,” said Keith Kressin, Qualcomm SVP product management. “Furthermore, Qualcomm Technologies is now well positioned to support complete cloud-to-edge AI solutions all connected with high speed and low-latency 5G connectivity.”
Crucially, this is an area where cooperation, rather than competition, with the big cloud infrastructure providers may be key. Microsoft was unveiled as a partner, with the two companies describing similar visions and continuing collaboration ‘in many areas.’
Writing for this publication in November, Dr. Wanli Min, chief machine intelligence scientist at Alibaba Cloud, noted how this rise was evolutionary rather than revolutionary. “For many organisations it has been a seamless integration from existing systems, with AI investment gathering pace quickly,” he wrote. “Over the next few years we can expect to see the industry continue to boom, with AI driving cloud computing to new heights, while the cloud industry helps bring the benefits of AI to the mainstream.”