Today, we’re proud to introduce Maia 200, a breakthrough inference accelerator engineered to dramatically improve the ...
Maia 200 is the most efficient inference system Microsoft has ever deployed, with 30% better performance per dollar than the latest ...
Microsoft recently announced Maia 200, a new AI accelerator specifically designed for inference workloads. According to ...
Maia 200 packs 140+ billion transistors, 216 GB of HBM3E, and a massive 272 MB of on-chip SRAM to tackle the efficiency crisis in real-time inference. Hyperscalers prioritize ...
Calling it the highest-performance chip of any custom cloud accelerator, the company says Maia 200 is optimized for AI inference across multiple models.
Microsoft unveils the Maia 200 AI chip. Learn about the tech giant's shift toward in-house silicon, its performance edge over ...
Microsoft officially launches its own AI chip, Maia 200, designed to boost performance per dollar and power large-scale AI ...