AI PCs to Account for Nearly 60 Percent of Shipments by 2027


A new forecast from IDC shows shipments of AI PCs growing from nearly 50 million units in 2024 to more than 167 million in 2027. By the end of the forecast period, IDC expects AI PCs to represent nearly 60% of all PC shipments worldwide.

"As we enter a new year, the hype around generative AI has reached a fever pitch, and the PC industry is running fast to capitalize on the expected benefits of bringing AI capabilities down from the cloud to the client," said Tom Mainelli, group vice president, Devices and Consumer Research. "Promises around enhanced user productivity via faster performance, plus lower inferencing costs, and the benefit of on-device privacy and security, have driven strong IT decision-maker interest in AI PCs. In 2024, we'll see AI PC shipments begin to ramp, and over the next few years, we expect the technology to move from niche to a majority."

Until recently, running an AI task locally on a PC meant executing it on the central processing unit (CPU), the graphics processing unit (GPU), or a combination of the two. This can hurt the PC's performance and battery life because those chips are not optimized to run AI efficiently. PC silicon vendors have now added AI-specific silicon, called neural processing units (NPUs), to their systems-on-chip (SoCs) to run these tasks more efficiently.
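This division of labor is visible to developers through inference runtimes that let an application target whichever accelerator is present. The minimal Python sketch below (not from IDC's forecast) uses ONNX Runtime to prefer an NPU-backed execution provider and fall back to the GPU or CPU; the specific provider names and the "model.onnx" path are illustrative, and which providers are actually available depends on the hardware and on how the runtime was built.

```python
# Sketch: prefer an NPU execution provider, fall back to GPU, then CPU.
# Provider names such as "QNNExecutionProvider" (Qualcomm NPUs) and
# "DmlExecutionProvider" (DirectML GPUs) are assumptions about the local setup.
import numpy as np
import onnxruntime as ort

preferred = ["QNNExecutionProvider", "DmlExecutionProvider", "CPUExecutionProvider"]
available = ort.get_available_providers()
providers = [p for p in preferred if p in available] or ["CPUExecutionProvider"]

# "model.onnx" is a placeholder for any locally stored model file.
session = ort.InferenceSession("model.onnx", providers=providers)

input_name = session.get_inputs()[0].name
dummy_input = np.random.rand(1, 3, 224, 224).astype(np.float32)  # example image-shaped input
outputs = session.run(None, {input_name: dummy_input})

print("Inference ran on:", session.get_providers()[0])
```

Because inference stays on the device, this kind of setup avoids the network round trip and the per-call cloud charges discussed later in the article.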

While shipments of hardware-enabled AI PCs will ramp up quickly over the next two years, next-generation AI PCs are forecast to dominate the market by the end of the forecast period: IDC believes shipments of next-generation AI PCs will be double those of hardware-enabled AI PCs by 2027. Many of these AI PCs will be sold to commercial buyers, but consumers will have much to look forward to in the coming AI PC age, including potential improvements in PC gaming and digital content creation.

There are three primary technical reasons for bringing AI workloads from the cloud to the client: better performance, by eliminating the round trip that current AI workloads must make over the network to the cloud and back; stronger privacy and security, by keeping data on the device rather than in transit; and lower cost, by limiting the need to access expensive cloud resources.