Architects' Tech Alliance
Mar 1, 2024 · Industry Insights

Why HBM3E Is Set to Power the Next AI Server Boom

HBM, a vertically stacked DRAM technology, is evolving to HBM3E, with speeds of up to 8 Gbps and stack capacities of 16 GB. The article examines how this transition is driving explosive growth in AI server demand, reshaping market share among SK Hynix, Samsung, and Micron, and depending on advances in CoWoS and TSV packaging.

AI servers · CoWoS · HBM
8 min read
Architects' Tech Alliance
Feb 1, 2024 · Industry Insights

Why HBM3E Is Set to Power the Next AI Server Boom

The article explains how High Bandwidth Memory (HBM) technology has evolved to HBM3E, details its technical advantages, outlines the rapid growth in AI server shipments, projects a $15 billion HBM market by 2025, and analyzes the competitive landscape of major suppliers and their packaging methods.

AI servers · CoWoS · HBM
9 min read
Architects' Tech Alliance
Aug 7, 2023 · Industry Insights

How Taiwan’s ODMs Are Powering the Global AI Server Boom

The article analyzes Taiwan’s mature ICT ecosystem and its dominant share of global server OEM/ODM production, the surge in AI compute demand driven by large language models, and how companies such as Wistron, Quanta, and Inventec are positioned to benefit from the expanding AI server and CoWoS markets.

AI servers · Cloud capex · CoWoS
12 min read