Meta founder and CEO Mark Zuckerberg speaks during the Meta Connect event at Meta headquarters in Menlo Park, California, on Sept. 27, 2023.
Josh Edelson | AFP | Getty Images
Meta is spending billions of dollars on Nvidia's popular computer chips, which are at the heart of artificial intelligence research and projects.
In an Instagram Reels post on Thursday, Zuckerberg said the company's "future roadmap" for AI requires it to build a "massive compute infrastructure." By the end of 2024, Zuckerberg said, that infrastructure will include 350,000 H100 graphics cards from Nvidia.
Zuckerberg did not say how many of the graphics processing units (GPUs) the company has already purchased, but the H100 did not hit the market until late 2022, and it was initially in limited supply. Analysts at Raymond James estimate Nvidia is selling the H100 for $25,000 to $30,000, and on eBay the cards can cost over $40,000. If Meta were paying at the low end of that price range, the purchases would amount to close to $9 billion.
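For readers who want to see where that figure comes from, here is a minimal back-of-the-envelope sketch in Python. It assumes, purely for illustration, that every one of the 350,000 cards is bought at the Raymond James price estimates with no volume discount; Meta has not disclosed what it actually pays.

```python
# Rough check on the "close to $9 billion" figure.
# Assumptions (illustrative only, not disclosed by Meta): all 350,000 cards
# are bought at the Raymond James estimated prices, with no volume discount.

h100_count = 350_000   # H100 cards Zuckerberg says Meta will have by end of 2024
price_low = 25_000     # USD, low end of the $25,000-$30,000 analyst estimate
price_high = 30_000    # USD, high end of the estimate

total_low = h100_count * price_low    # 8,750,000,000 -> about $8.75 billion
total_high = h100_count * price_high  # 10,500,000,000 -> about $10.5 billion

print(f"Low-end estimate:  ${total_low / 1e9:.2f} billion")
print(f"High-end estimate: ${total_high / 1e9:.2f} billion")
```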
Additionally, Zuckerberg said Meta's compute infrastructure will comprise "almost 600k H100 equivalents of compute if you include other GPUs." In December, tech companies including Meta, OpenAI and Microsoft said they would use AMD's new Instinct MI300X AI computer chips.
Meta needs these heavy-duty computer chips as it pursues research in artificial general intelligence (AGI), which Zuckerberg said is a "long-term vision" for the company. OpenAI and Google's DeepMind unit are also researching AGI, a futuristic form of AI comparable to human-level intelligence.
Meta's chief scientist Yann LeCun stressed the importance of GPUs during a media event in San Francisco last month.
"[If] you think AGI is in, the more GPUs you have to buy," LeCun said at the time. Regarding Nvidia CEO Jensen Huang, LeCun said, "There is an AI war, and he's supplying the weapons."
In Meta's third-quarter earnings report, the company said total expenses for 2024 will be in the range of $94 billion to $99 billion, driven in part by the expansion of its computing infrastructure.
"In terms of investment priorities, AI will be our biggest investment area in 2024, both in engineering and compute resources," Zuckerberg said on the call with analysts.
Zuckerberg said on Thursday that Meta plans to "open source responsibly" its yet-to-be-developed "general intelligence," an approach the company is also taking with its Llama family of large language models.
Meta is currently training Llama 3, and it is also making its Fundamental AI Research team (FAIR) and its GenAI research team work more closely together, Zuckerberg said.
Shortly after Zuckerberg's post, LeCun said in a post on X that "To accelerate progress, FAIR is now a sister organization of GenAI, the AI product division."
— CNBC’s Kif Leswing contributed to this report