

    Maia 200: Microsoft 100B Transistor 3nm AI Chip for FP4 FP8 Inference

    Microsoft’s Maia 200 announcement is more than a product launch; it is a direct challenge in a widening hyperscaler arms race for AI compute. The company’s public claims paint a bold picture: more than 100 billion transistors on TSMC’s 3 nm node, native FP4/FP8 tensor hardware, “three times”...
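
    The FP4/FP8 claim refers to narrow floating-point formats rather than any particular software interface. As a rough, hypothetical illustration of what inference in those formats involves, the NumPy sketch below simulates rounding float32 values onto the OCP FP8 E4M3 and FP4 E2M1 grids using per-tensor scaling plus mantissa rounding; the function name and parameters are illustrative assumptions, not part of Microsoft's Maia toolchain.

```python
# Minimal sketch (assumptions, not Microsoft's stack): simulate low-precision
# inference formats by scaling values into the target format's range, rounding
# to its mantissa grid, and clamping to its largest finite value. Format
# parameters follow the OCP FP8 E4M3 and FP4 E2M1 definitions.
import numpy as np

def fake_quantize(x: np.ndarray, exp_bits: int, man_bits: int, max_val: float) -> np.ndarray:
    """Round float32 values to the grid of a small float format (simulation only)."""
    x = np.asarray(x, dtype=np.float32)
    amax = np.max(np.abs(x))
    # Per-tensor scaling: map the largest magnitude onto the format's max finite value.
    scale = amax / max_val if amax > 0 else 1.0
    y = x / scale
    # Mantissa rounding: snap each value to the nearest representable step in its
    # binade (spacing = 2^(exponent - mantissa_bits)); clamp the exponent at the
    # smallest normal exponent so subnormals share the finest spacing.
    bias = 2 ** (exp_bits - 1) - 1
    min_exp = 1 - bias
    exp = np.floor(np.log2(np.maximum(np.abs(y), 2.0 ** min_exp)))
    step = 2.0 ** (exp - man_bits)
    y = np.clip(np.round(y / step) * step, -max_val, max_val)
    return y * scale  # dequantize back to float32

# FP8 E4M3 (max finite value 448) and FP4 E2M1 (max finite value 6.0)
w = np.random.randn(8).astype(np.float32)
print(fake_quantize(w, exp_bits=4, man_bits=3, max_val=448.0))
print(fake_quantize(w, exp_bits=2, man_bits=1, max_val=6.0))
```

    The point of the sketch is the precision/range trade-off the article's headline hinges on: FP4 keeps only a 1-bit mantissa and a handful of binades, which is why dedicated tensor hardware and careful scaling are needed to make it usable for inference.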