It’s no secret that the AI chip industry is flourishing right now, with semiconductor suppliers launching new neural processing units and the AI PC movement bringing more powerful processors to laptops, desktops, and workstations.
According to a Gartner study of the AI chip industry, global AI chip revenue is projected to increase by 33% in 2024. Specifically, the Gartner report “Forecast Analysis: AI Semiconductors, Worldwide” detailed the competition among hyperscalers (some of whom are developing their own chips while also calling on semiconductor suppliers), the use cases for AI chips, and the demand for on-chip AI accelerators.
“Longer term, AI-based applications will move out of data centers into PCs, smartphones, edge and endpoint devices,” wrote Gartner analyst Alan Priestley in the report.
What happens to all these AI chips?
Gartner predicted total AI chips revenue in 2024 to be $71.3 billion (up from $53.7 billion in 2023), increasing to $92 billion in 2025. Of total AI chips revenue, compute electronics will likely account for $33.4 billion in 2024, or 47% of all AI chips revenue. Other sources of AI chips revenue will be automotive electronics ($7.1 billion) and consumer electronics ($1.8 billion).
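For readers who want to check the math, the segment shares follow directly from the dollar amounts above. The short Python snippet below is purely illustrative, using only the numbers quoted in this article, and reproduces the roughly 47% compute-electronics share.

```python
# Sanity-check the segment shares of 2024 AI chip revenue (figures in $B,
# taken from the Gartner forecast quoted above).
total_2024 = 71.3
segments = {
    "compute electronics": 33.4,
    "automotive electronics": 7.1,
    "consumer electronics": 1.8,
}

for name, revenue in segments.items():
    print(f"{name}: {revenue / total_2024:.1%} of 2024 AI chip revenue")
# compute electronics comes out to ~46.8%, matching Gartner's 47% figure.
```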
Of the $71.3 billion in AI semiconductor revenue in 2024, most will come from discrete and integrated application processors, discrete GPUs and microprocessors for compute, as opposed to embedded microprocessors.
In terms of application revenue from AI semiconductors in 2024, the majority will come from compute electronics, wired communications electronics, and automotive electronics.
Gartner noted a shift in compute requirements from the initial training of AI models to inference, which is the application of everything the AI model has learned during training. By 2028, according to Gartner, more than 80% of the workload accelerators in data centers will be used to run AI inference workloads, an increase of 40% from 2023.
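To make the training-versus-inference distinction concrete, here is a minimal sketch in PyTorch, with a toy model invented purely for illustration (not anything Gartner benchmarks): training computes gradients and updates weights, while inference only applies the weights already learned.

```python
# Minimal illustration of the two AI workload phases (PyTorch).
import torch
import torch.nn as nn

model = nn.Linear(4, 1)  # a toy model standing in for a real network
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.MSELoss()

# Training: forward pass, loss, backward pass, weight update.
model.train()
x, y = torch.randn(8, 4), torch.randn(8, 1)
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()
optimizer.zero_grad()

# Inference: apply the learned weights only -- no gradients, no updates.
model.eval()
with torch.no_grad():
    prediction = model(torch.randn(1, 4))
```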
SEE: Microsoft’s new category of PCs, Copilot+, will use Qualcomm processors to run AI on-device.
Workload accelerators and AI go hand in hand.
AI accelerators in servers will be a $21 billion industry in 2024, Gartner predicted.
“Today, generative AI (GenAI) is fueling demand for high-performance AI chips in data centers. In 2024, the value of AI accelerators used in servers, which offload data processing from microprocessors, will total $21 billion, and increase to $33 billion by 2028,” said Priestley in a press release.
AI workloads will require beefing up standard microprocessing units, too, Gartner predicted.
In a May 4 forecast analysis of AI semiconductors around the world, Priestley wrote that “many of these AI-enabled applications can be executed on standard microprocessing units (MPUs), and MPU vendors are extending their processor architectures with dedicated on-chip AI accelerators to better handle these processing tasks.”
In addition, the rise of AI techniques in data center applications will drive demand for workload accelerators, with 25% of new servers predicted to have workload accelerators in 2028, compared to 10% in 2023.
The emergence of the AI PC?
Gartner is optimistic about the push to run large language models locally on laptops, workstations, and desktops. According to Gartner, AI PCs have a neural processing unit that enables users to use AI for “everyday activities.”
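As a rough illustration of what running AI locally on an NPU-equipped PC can look like, the sketch below uses ONNX Runtime, which dispatches work to hardware-specific execution providers. The provider name QNNExecutionProvider (used for Qualcomm NPUs) and the file model.onnx are assumptions for the example, not details from Gartner’s report.

```python
# Illustrative on-device inference via ONNX Runtime on an AI PC.
# Assumes an onnxruntime build that exposes an NPU execution provider
# (e.g., QNNExecutionProvider on Qualcomm Snapdragon X machines) and a
# hypothetical local model file "model.onnx".
import numpy as np
import onnxruntime as ort

print(ort.get_available_providers())  # see which backends this build offers

session = ort.InferenceSession(
    "model.onnx",
    # Prefer the NPU; fall back to the CPU if the provider is unavailable.
    providers=["QNNExecutionProvider", "CPUExecutionProvider"],
)

input_name = session.get_inputs()[0].name
dummy_input = np.random.rand(1, 3, 224, 224).astype(np.float32)
outputs = session.run(None, {input_name: dummy_input})
```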
By 2026, the analyst firm predicted that every enterprise PC purchase would be an AI PC. Although it’s not yet known whether this will actually be the case, hyperscalers are undoubtedly incorporating AI into their upcoming devices.
AI among hyperscalers encourages both competition and collaboration
AWS, Google, Meta and Microsoft are pursuing in-house AI chips today, while also seeking hardware from NVIDIA, AMD, Qualcomm, IBM, Intel and more. For instance, Microsoft and Apple are looking to add OpenAI products to their hardware, while Dell announced a number of new laptops that run AI on Qualcomm’s Snapdragon X Series processor. Gartner anticipates that more custom-designed AI chips will be developed.
According to Gartner analyst Gaurav Gupta, hyperscalers are developing their own chips to better control their product roadmaps, control costs, cut down on their reliance on off-the-shelf chips, leverage IP synergies, and maximize performance for their specific workloads.
“Semiconductor chip foundries, such as TSMC and Samsung, have given tech companies access to cutting-edge manufacturing processes,” Gupta said.
He added that “Arm and other firms, like Synopsys, have given access to advanced intellectual property that makes custom chip design relatively simple.” Easy access to the cloud and a shifting culture of semiconductor assembly and test service (SATS) providers have also made it simpler for hyperscalers to enter chip design.
“While chip development is expensive, using custom-designed chips can improve operational efficiencies, reduce the costs of delivering AI-based services to users, and lower costs for users to access new AI-based applications,” Gartner wrote in a press release.