
Surge in AI Server Shipments Forecasted for 2023, According to TrendForce

Published June 12, 2023

News Summary

A new wave of AI-server expansion is on the horizon for 2023, according to market research firm TrendForce.



A new wave of AI server expansion is on the horizon for 2023, according to market research firm TrendForce. Global shipments of AI servers are forecast to grow an extraordinary 38.4% year over year, reaching an estimated 1.2 million units worldwide. The projected leap reflects escalating demand for AI servers and AI chips, with AI servers expected to account for approximately 9% of all server shipments in 2023. By 2026, AI servers are set to constitute a hefty 15% of the total.

The momentum in AI server shipments will largely be driven by systems equipped with GPUs, FPGAs, and ASICs, built to serve a diverse range of markets. TrendForce has raised its compound annual growth rate (CAGR) forecast for AI server shipments from 2022 to 2026 to an ambitious 22%. At the same time, the AI chip market is set to grow a phenomenal 46% in 2023.
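To see how the quoted growth figures compound, here is a minimal back-of-the-envelope sketch in Python. It assumes the 22% CAGR applies uniformly from the 2022 base implied by the 2023 figures (1.2 million units and 38.4% growth); the intermediate and 2026 values are illustrative extrapolations, not TrendForce numbers.

```python
# Back-of-the-envelope projection based on the figures quoted above.
# The 1.2 million units (2023), 38.4% YoY growth, and 22% CAGR (2022-2026)
# come from TrendForce's forecast; extrapolating each year with a uniform
# CAGR is an assumption made here purely for illustration.

SHIPMENTS_2023_M = 1.2      # million AI servers forecast for 2023
YOY_GROWTH_2023 = 0.384     # 38.4% year-over-year growth in 2023
CAGR_2022_2026 = 0.22       # 22% compound annual growth rate, 2022-2026

# Implied 2022 base: 2023 shipments divided by the 2023 growth factor
base_2022 = SHIPMENTS_2023_M / (1 + YOY_GROWTH_2023)

for year in range(2022, 2027):
    projected = base_2022 * (1 + CAGR_2022_2026) ** (year - 2022)
    print(f"{year}: ~{projected:.2f} million AI servers")
```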

At the forefront of the AI server market stands NVIDIA, holding a dominant 60-70% market share. ASIC chips developed in-house by Cloud Service Providers (CSPs) follow with more than 20% of the market. NVIDIA's lead can be attributed to three factors:

  1. The high demand for NVIDIA's A100 and A800 models among American and Chinese CSPs. Demand is set to shift gradually toward the newer H100 and H800 models in the latter half of 2023, which sell for 2-2.5 times the price of the A100 and A800. NVIDIA is also promoting its system solutions, such as DGX and HGX, to strengthen the appeal of these GPUs.
  2. The lucrative nature of high-end GPUs, especially the A100 and H100 models. TrendForce's research indicates that NVIDIA's strong position in the GPU market allows it to charge variable prices for the H100, with differences of nearly US$5,000 depending on the volume purchased.
  3. The sustained popularity and expanding influence of AI computing and chatbots across professional sectors, including cloud and e-commerce services, intelligent manufacturing, financial insurance, smart healthcare, and advanced driver-assistance systems. The escalating demand for AI servers - specifically those equipped with 4-8 GPUs for cloud-based servers and 2-4 GPUs for edge AI servers - is driving an annual shipment growth rate of roughly 50% for AI servers equipped with NVIDIA's A100 and H100 models.

High Bandwidth Memory Demand

TrendForce has also revealed that High Bandwidth Memory (HBM), a high-speed memory interface used in advanced GPUs, is anticipated to see a significant surge in demand. NVIDIA's upcoming H100 GPU, to be released this year, is equipped with the faster HBM3 technology, an upgrade over the previous HBM2e standard. This advancement is expected to enhance the computational performance of AI server systems.

As the demand for top-tier GPUs, including NVIDIA’s A100 and H100, AMD’s MI200 and MI300, and Google’s proprietary TPU, continues to soar, TrendForce anticipates a staggering 58% YoY increase in HBM demand for 2023, with a further 30% increase projected in 2024.
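As a quick illustration of how those two figures compound, the short Python sketch below normalizes 2022 HBM demand to 1.0. The 58% and 30% increases are TrendForce's forecast; the normalized baseline is an assumption made here purely for illustration.

```python
# Compounding the two HBM demand growth figures quoted above.
# The 58% (2023) and 30% (2024) increases are TrendForce's forecast;
# treating 2022 demand as a normalized baseline of 1.0 is an assumption.

hbm_2022 = 1.0               # normalized 2022 HBM demand
hbm_2023 = hbm_2022 * 1.58   # +58% YoY in 2023
hbm_2024 = hbm_2023 * 1.30   # +30% YoY projected for 2024

print(f"2023: {hbm_2023:.2f}x the 2022 level")
print(f"2024: {hbm_2024:.2f}x the 2022 level "
      f"(~{(hbm_2024 - 1) * 100:.0f}% above 2022)")
```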







