Artificial general intelligence — when AI becomes more capable than humans — is just moments away, Meta's Mark Zuckerberg declares

Robot head with abstract connections.
(Image credit: imaginima via Getty Images)

Artificial general intelligence (AGI) could be around the corner if Meta CEO Mark Zuckerberg has any say in it. The Facebook founder announced on Instagram that he is pouring more than $10 billion into computing infrastructure to develop AGI — AI that can match or surpass humans across a range of cognitively demanding tasks.

"Today I'm bringing Meta's two AI research efforts closer together to support our long-term goals of building general intelligence, open-sourcing it responsibly, and making it available and useful to everyone in all of our daily lives," Zuckerberg said Jan. 18 in a recorded message. "It's clear that the next generation of services requires building full general intelligence: building the best AI assistants, AIs for creators, AIs for businesses and more. That needs advances in every area of AI."

Unlike artificial intelligence (AI) systems today, which are highly specific and can't comprehend nuance and context as well as humans, an AGI system would be able to solve problems in a wide range of environments, according to a 2019 essay published in the journal EMBO Reports. It would therefore mimic the key features of human intelligence, in particular learning and flexibility.

In the same Instagram reel, Zuckerberg announced that the company is buying 350,000 Nvidia H100 graphics processing units (GPUs) — some of the most powerful graphics cards in the world, and key to training today's best AI models. This will more than double Meta's total computing power for AI training, with Meta aiming to wield computing power equivalent to 600,000 H100 GPUs in total.

Nvidia's H100 is the successor to the A100, the graphics card OpenAI used to train ChatGPT. Our best available knowledge, based on unverified leaks, suggests OpenAI used roughly 25,000 Nvidia A100 GPUs to train the chatbot, although other estimates put the number lower.

Zuckerberg said this "absolutely massive amount of infrastructure" will be in place by the end of the year. The company is currently training "Llama 3," Meta's answer to ChatGPT and Google's Gemini, and Zuckerberg teased a roadmap that includes a future AGI system.

Keumars Afifi-Sabet
Channel Editor, Technology

Keumars is the technology editor at Live Science. He has written for a variety of publications including ITPro, The Week Digital, ComputerActive, The Independent, The Observer, Metro and TechRadar Pro. He has worked as a technology journalist for more than five years, having previously held the role of features editor with ITPro. He is an NCTJ-qualified journalist and has a degree in biomedical sciences from Queen Mary, University of London. He's also registered as a foundational chartered manager with the Chartered Management Institute (CMI), having qualified as a Level 3 Team leader with distinction in 2023.