Editorial: The NPU Will Be the Workhorse of the AI PC
Author: Dylan McGrath
Explore how the neural processing unit (NPU) is set to become the essential component for AI-driven PCs.
PC OEMs and chip suppliers are counting on consumer interest in artificial intelligence (AI) to drive an increase in PC shipments the likes of which have not been seen in several years. The rise of AI has spurred the emergence of a new category of specialized compute hardware designed specifically to accelerate AI tasks: the neural processing unit (NPU). Like its more established predecessor, the GPU, the NPU provides a specialized hardware platform optimized to perform certain types of calculations efficiently.
To support the AI PC, processor suppliers have been adding integrated NPUs to their heterogeneous PC processors. Intel, AMD, and Qualcomm have all fielded offerings that comply with Microsoft's requirement of an integrated NPU delivering at least 40 tera-operations per second (TOPS) to support the Copilot+ AI assistant. It's still early days for the AI PC and the integrated NPU; machine-learning workloads such as image classification and object detection, as well as deep-learning tasks such as computer vision and natural language processing, currently run on the NPU, GPU, and CPU in roughly equal proportions. But this relative balance will shift rapidly. With further innovations, and as software developers become more acquainted with NPUs, we expect them to take on the lion's share of AI workloads in the AI PC.
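To make the 40 TOPS figure concrete, peak NPU throughput is conventionally computed from the number of multiply-accumulate (MAC) units and the clock frequency, counting each MAC as two operations (one multiply, one add). The sketch below uses a hypothetical NPU configuration (10,240 INT8 MAC units at 2 GHz); the specific unit count and clock are illustrative assumptions, not the specs of any shipping Intel, AMD, or Qualcomm part.

```python
def npu_peak_tops(mac_units: int, clock_hz: float, ops_per_mac: int = 2) -> float:
    """Peak throughput in tera-operations per second (TOPS).

    Each multiply-accumulate (MAC) counts as two operations
    (one multiply plus one add), the usual convention in vendor
    TOPS marketing figures.
    """
    return mac_units * ops_per_mac * clock_hz / 1e12

# Hypothetical NPU: 10,240 INT8 MAC units clocked at 2 GHz
print(npu_peak_tops(10_240, 2e9))  # → 40.96, clearing Microsoft's 40 TOPS bar
```

Note that this is a peak (best-case) number; sustained throughput on real workloads depends on memory bandwidth, quantization, and how well a model maps onto the NPU's fixed-function datapaths.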