Lisa Su shows an AMD Instinct MI300 chip as she delivers a keynote address at CES 2023 in Las Vegas, Nevada, Jan. 4, 2023.
David Becker | Getty Images
Meta, OpenAI, and Microsoft said at an AMD investor event Wednesday that they will use AMD's newest AI chip, the Instinct MI300X. It is the biggest sign yet that technology companies are searching for alternatives to the expensive Nvidia graphics processors that have been essential for creating and deploying artificial intelligence programs such as OpenAI's ChatGPT.
If AMD's latest high-end chip is good enough for the technology companies and cloud service providers building and serving AI models when it starts shipping early next year, it could lower the cost of developing AI models and put competitive pressure on Nvidia's surging AI chip sales growth.
"All of the interest is in big iron and big GPUs for the cloud," AMD CEO Lisa Su said Wednesday.
AMD says the MI300X is based on a new architecture, which often leads to significant performance gains. Its most distinctive feature is its 192GB of a cutting-edge, high-performance type of memory known as HBM3, which transfers data faster and can fit larger AI models.
Su directly compared the MI300X and the systems built with it to Nvidia's main AI GPU, the H100.
"What this performance does is it just directly translates into a better user experience," Su said. "When you ask a model something, you'd like it to come back faster, especially as responses get more complicated."
The main question facing AMD is whether companies that have been building on Nvidia will invest the time and money to add another GPU supplier. "It takes work to adopt AMD," Su said.
AMD told investors and partners on Wednesday that it had improved its software suite, called ROCm, to compete with Nvidia's industry-standard CUDA software, addressing a key shortcoming that had been one of the main reasons AI developers currently favor Nvidia.
Price will also be important. AMD didn't reveal pricing for the MI300X on Wednesday, but Nvidia's can cost around $40,000 for one chip, and Su told reporters that AMD's chip would have to cost less to purchase and operate than Nvidia's in order to persuade customers to buy it.
Who says they'll use the MI300X?
AMD MI300X accelerator for artificial intelligence.
On Wednesday, AMD said it had already signed up some of the companies most hungry for GPUs to use the chip. Meta and Microsoft were the two largest purchasers of Nvidia H100 GPUs in 2023, according to a recent report from research firm Omdia.
Meta said it will use MI300X GPUs for AI inference workloads such as processing AI stickers, image editing, and operating its assistant.
Microsoft's CTO, Kevin Scott, said the company would offer access to MI300X chips through its Azure web service.
Oracle's cloud will also use the chips.
OpenAI said it would support AMD GPUs in one of its software products, called Triton, which isn't a big large language model like GPT but is used in AI research to access chip features.
AMD isn't forecasting massive sales for the chip yet, only projecting about $2 billion in total data center GPU revenue in 2024. Nvidia reported more than $14 billion in data center sales in the most recent quarter alone, although that metric includes chips other than GPUs.
However, AMD says the total market for AI GPUs could climb to $400 billion over the next four years, double the company's previous projection. That shows how high expectations are and how coveted high-end AI chips have become, and why the company is now focusing investor attention on the product line.
Su also told reporters that AMD doesn't think it needs to beat Nvidia to do well in the market.
"I think it's clear to say that Nvidia has to be the vast majority of that right now," Su told reporters, referring to the AI chip market. "We believe it could be $400 billion-plus in 2027. And we could get a nice piece of that."