The worldwide AI chips market is on the rise, driven by demand from organizations building GenAI solutions. Worldwide AI semiconductor revenue is expected to total $71 billion in 2024, an increase of 33% from 2023, according to the latest forecast from Gartner, Inc.
“Today, generative AI (GenAI) is fueling demand for high-performance AI chips in data centers. In 2024, the value of AI accelerators used in servers, which offload data processing from microprocessors, will total $21 billion, and increase to $33 billion by 2028,” said Alan Priestley, VP Analyst at Gartner.
Gartner forecasts AI PC shipments will reach 22% of total PC shipments in 2024, and that by the end of 2026, 100% of enterprise PC purchases will be AI PCs. AI PCs include a neural processing unit (NPU) that lets them run longer, quieter and cooler, and keep AI tasks running continually in the background, creating new opportunities for leveraging AI in everyday activities.
While AI semiconductor revenue will continue to experience double-digit growth through the forecast period, 2024 will experience the highest growth rate during that period (see Table 1).
Table 1. AI Semiconductor Revenue Forecast, Worldwide, 2023-2025 (Millions of U.S. Dollars)
|              | 2023   | 2024   | 2025   |
| Revenue ($M) | 53,662 | 71,252 | 91,955 |
Source: Gartner (May 2024)
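For reference, a minimal Python sketch (using only the Table 1 figures above) that reproduces the year-over-year growth rates behind these claims; the figures and percentages are illustrative calculations, not additional Gartner data:

```python
# Year-over-year growth implied by Table 1 (revenue in $M).
revenue = {2023: 53_662, 2024: 71_252, 2025: 91_955}

for year in (2024, 2025):
    prev = revenue[year - 1]
    growth = (revenue[year] - prev) / prev * 100
    print(f"{year - 1} -> {year}: {growth:.1f}% growth")

# 2023 -> 2024: 32.8% growth  (the ~33% increase cited above)
# 2024 -> 2025: 29.1% growth  (slower, consistent with 2024 being the peak growth year)
```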
Compute Electronics to Record Highest Share Among Electronic Equipment Segments
In 2024, AI chip revenue from compute electronics is projected to total $33.4 billion, accounting for 47% of total AI semiconductor revenue. AI chip revenue is expected to reach $7.1 billion from automotive electronics and $1.8 billion from consumer electronics in 2024.
Fierce Battle Between Semiconductor Vendors and Tech Companies
While much of the focus is on the use of high-performance graphics processing units (GPUs) for new AI workloads, the major hyperscalers (AWS, Google, Meta and Microsoft) are all investing in developing their own chips optimized for AI. While chip development is expensive, using custom-designed chips can improve operational efficiencies, reduce the costs of delivering AI-based services to users, and lower costs for users to access new AI-based applications. “As the market shifts from development to deployment, we expect to see this trend continue,” said Priestley.