Microsoft’s unveiling of the Maia 200 AI accelerator and its companion system marks a deliberate push by a major cloud vendor into the hardware space—and it could reshape how telcos deploy AI at the edge and in their core networks. The new silicon promises large memory capacity, a fabric built...
Microsoft’s announcement of the Maia 200 marks a decisive escalation in the hyperscaler chip wars: a second‑generation, inference‑first accelerator that Microsoft says is built on TSMC’s 3 nm process, packed with massive on‑package memory and a new Ethernet‑based scale‑up fabric — and already being...
Microsoft is rolling Copilot Vision into Windows — a permissioned, session‑based capability that lets the Copilot app “see” one or two app windows or a shared desktop region, then provide contextual, step‑by‑step help, on‑screen highlights that point to UI elements, and multimodal responses (voice or typed)...
Microsoft’s Maia 200 launch is a statement: the company is betting its future inference stack on in‑house accelerators and Ethernet-based scale-up, and Wall Street is already parsing winners and losers — with Wells Fargo naming Marvell (MRVL) and Arista Networks (ANET) as likely beneficiaries in...