khaskhabar.com : Thursday, November 16, 2023 11:06 AM
San Francisco. Accelerating the AI race, Microsoft has unveiled two in-house, custom-designed chips and integrated systems that can be used to train large language models.
The Microsoft Azure Maia AI Accelerator is optimized for artificial intelligence (AI) tasks and generative AI, while the Microsoft Azure Cobalt CPU, an Arm-based processor, is designed to run general-purpose compute workloads on the Microsoft cloud.
The chips will start arriving in Microsoft’s data centers early next year, initially powering the company’s services such as Microsoft Copilot and Azure OpenAI Service, the company said at its “Microsoft Ignite” event late Wednesday.
“Microsoft is building the infrastructure to support AI innovation, and we are reimagining every aspect of our data centers to meet the needs of our customers,” said Scott Guthrie, executive vice president of Microsoft’s Cloud + AI Group.
Microsoft sees the addition of in-house chips as a way to ensure that every element is designed for Microsoft cloud and AI workloads.
“The ultimate goal is an Azure hardware system that provides maximum flexibility and can also be optimized for power, performance, stability or cost,” said Rani Borkar, corporate vice president, Azure Hardware Systems and Infrastructure (AHSI).
“Software is our core strength, but frankly, we are a systems company,” Borkar said. “At Microsoft we are designing and optimizing hardware and software together, so that one plus one is greater than two.”
“We have visibility into the entire stack, and silicon is just one ingredient,” she added.
At Microsoft Ignite, the company also announced the general availability of one of those key ingredients: Azure Boost, a system that makes storage and networking faster by moving those processes from host servers to purpose-built hardware and software.
To complement its custom silicon efforts, Microsoft also announced it is expanding industry partnerships to provide more infrastructure options for customers.
By adding first-party silicon to the growing ecosystem of chips and hardware from industry partners, Microsoft will be able to provide more choices in price and performance for its customers, Borkar said.
“Since our first partnership with Microsoft, we have collaborated to co-design Azure’s AI infrastructure at every layer for our models and our unprecedented training requirements,” said Sam Altman, CEO of OpenAI.
“Azure’s end-to-end AI architecture, now adapted to silicon with Maia, paves the way to train more capable models and make those models cheaper for our customers,” Altman said.