Qualcomm Unveils Power-Efficient AI Accelerator, the Cloud AI 100
Qualcomm announced that it is bringing the company’s AI expertise to the cloud with the Qualcomm Cloud AI 100. The accelerator builds on the company’s heritage in advanced signal processing and power efficiency.
“Today, Qualcomm Snapdragon mobile platforms bring leading AI acceleration to over a billion client devices. Our all new Cloud AI 100 accelerator will significantly raise the bar for the AI inference processing relative to any combination of CPUs, GPUs, and/or FPGAs used in today’s data centers,” said Keith Kressin, senior vice president, product management, Qualcomm Technologies. “Furthermore, the company is now well positioned to support complete cloud-to-edge AI solutions all connected with high-speed and low-latency 5G connectivity.”
In addition, Qualcomm is supporting developers with a full stack of tools and frameworks for each of its cloud-to-edge AI solutions. Fostering an ecosystem around this distributed AI model will help enhance a range of end-user experiences, including personal assistants for natural language processing and translation, advanced image search, and personalized content and recommendations.
The Cloud AI 100 offers more than 10x the performance per watt of the industry’s most advanced AI inference solutions deployed today. It is an all-new chip designed specifically for AI inference workloads and built on a 7nm process node, which brings further performance and power advantages. The accelerator will support industry-leading software stacks, including PyTorch, Glow, TensorFlow, Keras, and ONNX.
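The "performance per watt" metric cited above is simply inference throughput normalized by power draw. A minimal sketch of the comparison, using entirely hypothetical numbers (the announcement gives only the ">10x" ratio, not absolute figures):

```python
def perf_per_watt(inferences_per_second: float, watts: float) -> float:
    """Inference throughput normalized by power draw (inferences/sec/W)."""
    return inferences_per_second / watts

# Hypothetical figures chosen only to illustrate a 10x ratio --
# no absolute throughput or power numbers were published.
baseline = perf_per_watt(5_000, 250)       # e.g. a GPU-class inference card
accelerator = perf_per_watt(10_000, 50)    # a purpose-built inference ASIC

print(accelerator / baseline)  # 10.0
```

With these placeholder values, a chip delivering twice the throughput at one-fifth the power yields a 10x performance-per-watt advantage, which is the shape of the claim being made.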
“Microsoft’s vision of cloud-to-edge AI emphasizes the benefits of distributed intelligence,” said Venky Veeraraghavan, partner group program manager, Microsoft Azure. “Collaboration continues between Qualcomm Technologies and Microsoft in many areas.”

The Qualcomm Cloud AI 100 is expected to begin sampling to customers in the second half of 2019.