Microsoft unveils first in-house AI and cloud chips in trend towards custom silicon

Microsoft unveiled its first home-grown artificial intelligence chip and cloud-computing processor in an attempt to take more control of its technology and ramp up its offerings in the increasingly competitive market for AI computing. The company also announced new software that lets clients design their own AI assistants.
The Maia 100 chip, announced at the company’s annual Ignite conference in Seattle on Wednesday, will provide Microsoft Azure cloud customers with a new way to develop and run AI programs that generate content. Microsoft is already testing the chip with its Bing and Office AI products, said Rani Borkar, a vice-president who oversees Azure’s chip unit. Microsoft’s main AI partner, ChatGPT maker OpenAI, is also testing the processor. Both Maia and the server chip, Cobalt, will debut in some Microsoft data centres early next year.

“Our goal is to ensure that the ultimate efficiency, performance and scale is something that we can bring to you from us and our partners,” Microsoft chief executive officer Satya Nadella said at the conference. Maia will power Microsoft’s own AI apps first and then be available to partners and customers, he added.

Microsoft’s multi-year investment shows how critical chips have become to gaining an edge in both AI and the cloud. Making them in-house lets companies wring performance and price benefits from the hardware. The initiative also could insulate Microsoft from becoming overly dependent on any one supplier, a vulnerability currently underscored by the industrywide scramble for Nvidia’s AI chips. Microsoft’s push into processors follows similar moves by cloud rivals. Amazon.com acquired a chip maker in 2015 and sells services built on several kinds of cloud and AI chips. Google began letting customers use its AI accelerator processors in 2018.

For a company of Microsoft’s scale, “it’s important to optimise and integrate” every element of its hardware to provide the best performance and avoid supply-chain bottlenecks, Borkar said in an interview. “And really, at the end of the day, to give customers the infrastructure choice.”

Microsoft will also sell customers services based on Nvidia’s latest H200 chip and Advanced Micro Devices’ MI300X processor, both intended for AI tasks, some time next year. Still, the industry seems to be embarking on a lasting shift towards in-house chips. The transition is particularly bad news for Intel, whose own AI chip efforts are running behind. Meanwhile, with Cobalt, Microsoft is joining efforts by Amazon and AMD to grab share in the server chip market, which Intel currently dominates.
Maia is designed to help AI systems more quickly process the massive amount of data required to do such tasks as recognise speech and images. Azure Cobalt is a central processing unit that will come with 128 computing cores – or mini processors – putting it in the same league as products from Intel and AMD. The more cores the better because they can quickly divide work into small tasks and do them all at once. Cobalt also uses Arm Holdings designs, which proponents say are inherently more efficient because they were developed from designs used in battery-powered devices like smartphones. Both chips will be manufactured by Taiwan Semiconductor Manufacturing Co.
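
To make the parallelism point concrete in general terms, the sketch below splits a batch of small, independent tasks across however many CPU cores are available. It is a generic illustration using Python’s standard multiprocessing module with an invented workload; it does not reflect how Azure or the Cobalt chip actually schedule work.

# Generic illustration only: many small, independent tasks divided across CPU cores
# so they run at the same time. The workload here is made up for the example.
from multiprocessing import Pool, cpu_count

def small_task(task_id: int) -> int:
    # Stand-in for one unit of work, e.g. preprocessing a single image or audio clip.
    return sum(i * i for i in range(10_000)) + task_id

if __name__ == "__main__":
    tasks = range(1_000)            # a large batch of independent tasks
    workers = cpu_count()           # one worker process per available core
    with Pool(processes=workers) as pool:
        results = pool.map(small_task, tasks)   # the pool spreads tasks across the cores
    print(f"Completed {len(results)} tasks using {workers} cores")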
Microsoft CEO Satya Nadella speaks during the Asia-Pacific Economic Cooperation (Apec) CEO Summit in San Francisco on November 15, 2023. Photo: Bloomberg

Microsoft has engaged in some chip customisation before, including working with AMD to design Xbox processors and developing specialised chips for the HoloLens goggles and Xbox’s Kinect motion controller. But Maia and Cobalt are the biggest and most general-purpose efforts yet – ambitious moves in a tough and expensive industry to crack.

The old joke about Microsoft is that it doesn’t get things right until version 3.0, but in the chip space that’s typically the case for every vendor. Borkar, who spent 27 years at Intel, said she’s confident Microsoft’s first efforts more than meet the mark. “We are going to deploy these next year,” she said.

The company also announced Copilot Studio, software that lets clients customise AI assistant software from Microsoft or build their own AI assistants from scratch. Customers can also design ways for Microsoft’s co-pilot software to show up in their own existing apps.

“Customers were very consistent in their feedback, like, if I just want to get supply chain data in a co-pilot experience don’t make me go build a whole other bot,” said Charles Lamanna, a Microsoft vice-president.

In general, Microsoft said it’s integrating its various AI co-pilots, as well as the Bing Chat AI features, into one piece of software that lets users access all of them.

The company also announced the following:

  • a new lower price for clients who use both the Microsoft 365 Copilot product for Office software and the sales co-pilot, which normally cost a combined US$70 a month. Now users can get both for US$50 a month, Lamanna said.

  • new AI tools for customer service representatives, part of its Copilot products.

  • Copilot for Azure, an AI assistant designed to help IT administrators troubleshoot apps, operate infrastructure and more.

  • a Microsoft-run survey that showed wide satisfaction with its new corporate AI tools. For example, 77 per cent of users said that once they used the tools, they did not want to go back to what they used before.
