Azure Maia: Microsoft has built custom AI chip for cloud infrastructure


As AI applications become more sophisticated, tech giants are looking to boost the performance of their infrastructure to train large language models (LLMs) in a cost-effective manner. One way to bring costs down is to avoid costly reliance on third-party chipmakers. Earlier this year, Google launched Cloud TPU v5e to handle increasing workloads like generative AI and LLMs, and now, Microsoft has unveiled two custom-designed chips.

At Microsoft Ignite 2023, the company unveiled the Microsoft Azure Maia 100 AI Accelerator, optimised for artificial intelligence (AI) tasks and generative AI, and the Microsoft Azure Cobalt CPU, an Arm-based processor to run general purpose compute workloads on the Microsoft Cloud.

“The chips represent a last puzzle piece for Microsoft to deliver infrastructure systems – which include everything from silicon choices, software and servers to racks and cooling systems – that have been designed from top to bottom and can be optimised with internal and customer workloads in mind,” the company said.

Microsoft custom AI chips

Microsoft announced that the chips will start rolling out to its data centers early next year. They will initially power the company’s own services, such as Microsoft Copilot and Azure OpenAI Service. Later, the chips will be made available to the company’s industry partners for use in their systems.

“Microsoft is building the infrastructure to support AI innovation, and we are reimagining every aspect of our datacenters to meet the needs of our customers,” said Scott Guthrie, executive vice president of Microsoft’s Cloud + AI Group.

The chips have “billions of transistors that process the vast streams of ones and zeros flowing through data centers.” Microsoft says that the new chips are tailored for Microsoft cloud and AI workloads and will work with the company’s custom software.

Rani Borkar, corporate vice president for Azure Hardware Systems and Infrastructure (AHSI), noted that the company’s goal is to establish an Azure hardware system that offers maximum flexibility and can also be optimised for power, performance, sustainability or cost.

With its own chips, Microsoft will now have the flexibility to move away from its reliance on Nvidia’s in-demand H100 GPUs.
