On November 15th, Microsoft officially introduced Azure Maia 100, one of the most advanced AI chips to date, built on a 5-nanometer process and packing 105 billion transistors.
According to Rani Borkar, Vice President of Microsoft Corporation, the company is testing how Maia 100 handles training AI chatbots on Copilot (the new name for Bing Chat), the GitHub Copilot coding assistant, and even GPT-3.5-Turbo, a large language model (LLM) from OpenAI.
Microsoft and OpenAI are currently encouraging businesses to adopt generative AI models. Last month, Microsoft CEO Satya Nadella said the field is experiencing significant growth: GitHub Copilot alone grew 40% in the third quarter of 2023 compared to the previous quarter.
“We have more than one million paid Copilot users in over 37,000 organizations and businesses. This number is expanding significantly, especially outside of the United States.”
- Satya Nadella, CEO of Microsoft
In addition to Azure Maia 100, Microsoft also unveiled the Cobalt 100 chip for general-purpose cloud computing, positioned to compete with Intel processors. The new chip is a 64-bit design based on the ARM architecture, contains 128 computing cores, and delivers high performance while cutting power consumption by 40% compared to other ARM chips used in Azure cloud systems. Cobalt currently powers cloud software services including Microsoft Teams and Azure SQL.
Apart from Microsoft, other major players are also making significant strides in AI chip development, most notably Nvidia, which currently leads the market with its H100 Tensor Core GPU, widely used across the AI industry.
Earlier this week, Nvidia introduced the H200, which it bills as the world’s most powerful AI chip. According to the company, when tested with Meta’s Llama 2 large language model with 70 billion parameters, the H200 delivered nearly double the performance of the previous H100 chip. Amazon Web Services, Google Cloud, Microsoft Azure, and Oracle will be among the first cloud computing platforms to offer the H200 when it launches next year.
However, the shortage of Nvidia chips on the market presents a significant opportunity for other tech companies to ramp up their own chip production. Reports in October suggested that OpenAI might begin producing chips in-house. Meta announced details about its next-generation AI chips in May, and in August, Google unveiled a chip called Cloud TPU v5e for AI workloads.