Microsoft’s Answer to High AI Costs with Dual Chip Design


Microsoft’s dual-chip design is a distinctive approach to lowering the high cost of AI processing

Following other major tech companies, Microsoft unveiled two custom-designed processor chips on Wednesday to reduce the high cost of providing artificial intelligence (AI) services. Microsoft stated that instead of selling the chips, it will use them to power its Azure cloud computing service as well as its subscription software products.

Microsoft unveiled a new chip, named Maia, at its Ignite developer conference in Seattle. Its purpose is to accelerate AI computing operations and serve as the basis for its $30/month (€28) “Copilot” service, which is intended for business software customers and developers who wish to create custom AI services.

The Maia chip is designed to run large language models, the class of AI software that powers Microsoft’s Azure OpenAI service and is the product of Microsoft’s partnership with OpenAI, the company behind ChatGPT. The high cost of supplying AI services, which can be ten times greater than for traditional services such as search engines, is a challenge for Microsoft and other tech giants like Alphabet.

According to Microsoft executives, the company plans to streamline its wide-ranging efforts to build artificial intelligence (AI) into its products by channeling them through a shared set of foundational models, reducing the associated costs. They said the Maia chip is optimized for exactly that kind of work.

“We believe that this offers us a means to offer our clients better solutions that are quicker, less expensive, and of higher quality,” stated Scott Guthrie, executive vice president of Microsoft’s cloud and artificial intelligence division.

Microsoft also announced that starting next year, cloud services powered by the newest flagship chips from Nvidia and Advanced Micro Devices will be made available to Azure customers. “Nvidia is not being replaced by this,” stated Ben Bajarin, CEO of the analyst firm Creative Strategies.

According to him, Microsoft will be able to offer AI services in the cloud using the Maia chip until smartphones and personal computers are capable of handling them. “Microsoft has a very different kind of core opportunity here because they’re making a lot of money per user for the services,” Bajarin stated.

Microsoft’s second chip is intended both as an internal cost-cutting measure and as a counter to Amazon Web Services, the company’s main cloud rival. Known as Cobalt, the new chip is a CPU built on technology from Arm Holdings. Microsoft said on Wednesday that Cobalt has already been tested powering its business messaging service.

However, Microsoft’s Guthrie stated that to rival Amazon’s “Graviton” line of proprietary chips, his business also wishes to offer direct access to Cobalt. “We are designing our Cobalt solution to ensure that we are very competitive both in terms of performance as well as price-to-performance,” added Guthrie.

Microsoft provided minimal technical detail that would allow the chips to be compared against those of more established chipmakers. Both chips are manufactured with Taiwan Semiconductor Manufacturing Co.’s 5-nanometer process, said Rani Borkar, corporate vice president for Azure hardware and infrastructure.

In contrast to the more costly bespoke Nvidia networking technology that Microsoft used in the supercomputers it built for OpenAI, she said the Maia processor will be connected via standard Ethernet network cabling. “We’re going to be much more standardized,” Borkar said.
