THE ECONOMIST: Nvidia boss Jensen Huang dismisses fears that AI has hit a wall following Sam Altman tweet

The Economist
Jensen Huang, co-founder and chief executive officer of Nvidia Corp. Credit: I-Hwa Cheng/Bloomberg

When Sam Altman, boss of OpenAI, posted a gnomic tweet this month saying “There is no wall,” his followers on X, a social-media site, had a blast. “Trump will build it,” said one. “No paywall for ChatGPT?” quipped another. It has since morphed from an in-joke among nerds into a serious business matter.

The wall in question refers to the view that the forces underlying improvements in generative artificial intelligence (AI) over the past 15 years have reached a limit. Those forces are known as scaling laws. “There’s a lot of debate: have we hit the wall with scaling laws?” Satya Nadella, Microsoft’s boss, asked at his firm’s annual conference on November 19. A day later Jensen Huang, boss of Nvidia, the world’s most valuable company, said no.

Scaling laws are not physical laws. Like Moore’s law, the observation that the number of transistors on a chip doubles roughly every two years, they describe an empirical regularity: AI performance in recent years has doubled every six months or so. The main driver of that progress has been the growing amount of computing power used to train large language models (LLMs). No company’s fortunes are more intertwined with scaling laws than Nvidia, whose graphics processing units (GPUs) provide almost all of that computational oomph.
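A rough, purely illustrative sketch of what those doubling rates imply: the two-year and six-month periods are the figures cited above, but the ten-year horizon and the code are not from the article.

```python
# Illustrative only: compare compounded growth under two doubling periods.
# The two-year (Moore's law) and roughly six-month (AI performance) doubling
# periods are the figures cited above; the time horizon is arbitrary.

def growth_factor(years: float, doubling_period_years: float) -> float:
    """How many times a quantity grows after `years`,
    given that it doubles every `doubling_period_years`."""
    return 2 ** (years / doubling_period_years)

horizon = 10  # years, chosen purely for illustration
moores_law = growth_factor(horizon, 2.0)   # ~32x over a decade
ai_scaling = growth_factor(horizon, 0.5)   # ~1,000,000x over a decade

print(f"Doubling every 2 years over {horizon} years:  ~{moores_law:,.0f}x")
print(f"Doubling every 6 months over {horizon} years: ~{ai_scaling:,.0f}x")
```

The gap between the two curves is the crux of the debate: a six-month doubling cannot continue indefinitely, and the question is whether it has already stopped.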


On November 20, during Nvidia’s results presentation, Mr Huang defended scaling laws. He also told The Economist that the first task of Nvidia’s newest class of GPUs, known as Blackwells, would be to train a new, more powerful generation of models. “It’s so urgent for all these foundation-model-makers to race to the next level,” he says.

The results for Nvidia’s quarter ending in October reinforced the sense of upward momentum. Although the pace of growth has slowed somewhat, its revenue exceeded $US35 billion ($53.5b), up by a still-blistering 94 per cent, year on year. And Nvidia projected another $US37.5b in revenues for this quarter, above Wall Street’s expectations.

It said the upward revision was partly because it expected demand for Blackwell GPUs to be higher than it had previously thought. Mr Huang predicted 100,000 Blackwells would be swiftly put to work training and running the next generation of LLMs.

Not everyone shares his optimism. Scaling-law sceptics note that OpenAI has not yet produced a new general-purpose model to replace GPT-4, which has underpinned ChatGPT since March 2023. They say Google’s Gemini is underwhelming given the money the company has spent on it.

But, as Mr Huang notes, scaling laws apply not only to the initial training of LLMs but also to the use of the model, known as inference, especially when complex reasoning tasks are involved.

To explain why, he points to OpenAI’s latest model, o1, which has stronger reasoning capabilities than GPT-4. It can do advanced maths and other complex tasks by taking a step-by-step approach that its maker calls “thinking”. This enhanced inference process uses far more computing power than a typical ChatGPT response, Mr Huang says. “We know that we need more compute whatever the approach is,” he says.
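A crude way to see the point, assuming (as a simplification not spelled out in the article) that inference cost scales roughly with the number of tokens a model generates: a model that “thinks” through intermediate steps emits far more tokens per answer, so each query consumes correspondingly more compute. All figures below are hypothetical.

```python
# Illustrative sketch: if inference cost scales roughly with tokens generated,
# a step-by-step "reasoning" response costs far more than a direct one.
# Every number here is hypothetical, not a figure from the article.

direct_answer_tokens = 200       # a typical short chatbot reply (assumed)
reasoning_trace_tokens = 4000    # hidden step-by-step "thinking" (assumed)
cost_per_token = 1.0             # arbitrary unit of compute per token

direct_cost = direct_answer_tokens * cost_per_token
reasoning_cost = (reasoning_trace_tokens + direct_answer_tokens) * cost_per_token

print(f"Direct response cost:    {direct_cost:,.0f} units")
print(f"Reasoning response cost: {reasoning_cost:,.0f} units "
      f"({reasoning_cost / direct_cost:.0f}x more)")
```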

The more AI is adopted, the more important inference will become. Mr Huang says that Nvidia’s previous generations of GPUs can be used for inference, but that Blackwells will make performance dozens of times better. Already at least half of Nvidia’s infrastructure is used for inference.

Mr Huang clearly has an interest in portraying scaling in the best possible light. Some sceptics question how meaningful the advances in reasoning are. Although a handful of business models are being disrupted, many firms are struggling to adopt AI at scale, which could eventually weigh on demand for the technology. These are early days, though. Tech giants continue to spend big on GPUs, and Mr Huang points out that new technologies take time to digest. Nvidia’s back is not against the wall yet.
