Microsoft’s Maia 200 announcement opens a new chapter in the hyperscaler silicon race. Industry reporting from Korea says the inference accelerator integrates six 12‑layer HBM3E stacks, for 216 GB of on‑package memory, and that SK hynix is the exclusive HBM3E supplier. If accurate, the chip’s memory‑first architecture and Microsoft’s single‑source decision carry immediate technical, commercial, and geopolitical consequences for the AI hardware supply chain, and they escalate the Korea‑based rivalry over high‑performance HBM for hyperscaler ASICs.
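The 216 GB figure is consistent with standard HBM3E stack configurations. A quick sketch of the arithmetic, assuming 24 Gb DRAM dies (the common basis for 36 GB 12‑Hi stacks; the article itself does not state the per‑die density):

```python
# Sanity-check the reported Maia 200 on-package memory capacity.
# Assumption (not from the report): 12-Hi HBM3E stacks built from
# 24 Gb DRAM dies, the usual configuration for 36 GB stacks.
GBIT_PER_DIE = 24      # 24 Gb per DRAM die (assumed)
DIES_PER_STACK = 12    # 12-layer ("12-Hi") stack, per the report
STACKS = 6             # six stacks per package, per the report

gb_per_stack = GBIT_PER_DIE * DIES_PER_STACK / 8  # gigabits -> gigabytes
total_gb = gb_per_stack * STACKS
print(f"{gb_per_stack:.0f} GB per stack, {total_gb:.0f} GB total")
```

Six 36 GB stacks yield exactly the 216 GB cited in the reporting.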