May 29, 2024
Behind the Curtain: AI's ominous scarcity crisis
Top AI executives tell us they're racing to overcome old-fashioned shortages — electricity, computing power, chips, data and engineering talent — to keep improving and deploying their world-changing technology.

By Mike Allen and Jim VandeHei

Why it matters: This scarcity crisis is among the top threats to America building out AI at scale, and maintaining its edge over China on the large-language models that power AI tools.

It takes an insane amount of data, then awesome programming intelligence — human technologists — to create human-like AI. But that's just table stakes.

It then takes an insane amount of compute power to turn all that data and human work into a working model — then a mind-blowing amount of actual energy to make it all happen. We're short on all of it.

Mark Zuckerberg, Meta's founder and CEO, said in a podcast interview that the equivalent output of one nuclear power plant can be needed to train a single AI model.

Rene Haas, CEO of the chip-design company Arm, told The Wall Street Journal that AI models, including OpenAI's ChatGPT, "are just insatiable in terms of their thirst [for electricity]. ... The more information they gather, the smarter they are, but the more information they gather to get smarter, the more power it takes."
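To make the scale of those claims concrete, here is a rough back-of-envelope sketch. Every figure in it — cluster size, per-chip power draw, overhead multiplier, training duration — is an illustrative assumption, not a number reported in this story:

```python
# Back-of-envelope: how does a frontier-model training run compare to a
# nuclear reactor's output? All figures below are illustrative assumptions.

GPU_COUNT = 25_000        # assumed number of accelerators in the cluster
GPU_POWER_KW = 0.7        # assumed draw per accelerator, in kilowatts
OVERHEAD = 1.5            # assumed multiplier for cooling, networking, etc.
TRAINING_DAYS = 90        # assumed wall-clock duration of the run

cluster_mw = GPU_COUNT * GPU_POWER_KW * OVERHEAD / 1_000   # megawatts
energy_gwh = cluster_mw * 24 * TRAINING_DAYS / 1_000       # gigawatt-hours

# A typical large nuclear reactor produces roughly 1,000 MW continuously.
REACTOR_MW = 1_000
reactor_hours_equiv = energy_gwh * 1_000 / REACTOR_MW

print(f"Sustained cluster draw: {cluster_mw:.0f} MW")
print(f"Total training energy: {energy_gwh:.0f} GWh")
print(f"Equivalent to ~{reactor_hours_equiv:.0f} hours of one reactor's output")
```

Under these assumed numbers, a single run draws tens of megawatts continuously for months — the kind of sustained load that, multiplied across many labs and ever-larger models, produces the grid-scale demand the executives describe.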

Between the lines: The AI paradox is that the few companies big enough to fund talent + chips + energy are the same ones devouring them into acute scarcity.

Oh, and these companies are also the only ones big enough to fund Manhattan Project-style programs to replenish these precious, vital AI ingredients.

This stark reality is rapidly bleeding a lot of AI startups of cash.

Our thought bubble: That's why the giant tech companies will likely grow a lot bigger, and operate as virtual nation-states with budgets bigger than all but the largest countries.

A smart, functioning, forward-looking government would see this as the perfect moment to quickly build chips, data centers, and new energy sources — all in America, to spur new jobs to replace the ones AI will kill.

Plus make visas available for high-end tech talent.

President Biden's small but growing AI team is focused most acutely on domestic chip production, with his CHIPS and Science Act pouring billions into semiconductor factories from Arizona to Upstate New York.

Behind the scenes: Jack Clark — co-founder and head of policy at Anthropic, a fierce competitor to OpenAI, where Clark used to work — told us the U.S. is "looking at running into power limitations in the Western Hemisphere toward the back half of this decade."

"The thing about AI is the better you make it, the more people want to buy it," he said.

Clark added that the shortage of qualified people is more of an issue in the U.S. than in other advanced countries: "Our immigration system means that we bring in the world's smartest minds, we give them a great education, and then we send lots of them home."

The big picture: "Infrastructure is destiny," says Chris Lehane, who started in April as OpenAI's vice president of public works, after advising since last year. He points to the role that public-works projects of FDR's New Deal played in a past wave of American prosperity.

"You can't democratize AI unless you're fully building out this infrastructure stack," Lehane said. "If you look around the world, these aren't impediments for other countries."

"Our focus is building in the U.S. as a key to democratizing access in the U.S., and allowing the U.S. to continue to lead in developing the tech."

That's why Lehane's boss, OpenAI CEO Sam Altman, is backing Helion, a nuclear fusion startup with the lofty goal of "a future with unlimited clean electricity."

Altman and venture capital firm Andreessen Horowitz are also investing in Exowatt, a Miami-based company that wants to use solar power to "handle some of the ravenous electricity demands of the industry's data centers," The Wall Street Journal reports.

Reality check: These scarcities are occurring because of the particular approach that dominates the industry right now — build ever-larger models and serve them from giant data centers.

There's an alternative vision that imagines more AI happening "at the edge" — on your devices, which use less energy and don't require such scarce and advanced chips.

Into the '70s, most experts assumed computers would have to take up entire rooms. But we ended up with computers in our pockets. The same could happen with AI.

Context: Every boom has these bottlenecks, notes Axios managing editor for tech Scott Rosenberg, who has been writing about the web for 30 years.

Back in the '90s, you couldn't get your hands on Cisco routers: They were back-ordered for months, because every company was trying to connect to the internet at the same time. But the crisis faded and no one remembers it now.

What's next: Vipul Ved Prakash, founder and CEO of Together.ai — a cloud platform for building with open-source generative AI models — is part of a wave of entrepreneurs who see opportunity in alleviating these problems.

Prakash told us this is the "era of hyper-utilization ... pushing the intelligence frontier will require more and more efficient systems."

What to watch: Of all these cascading shortages, the talent part might be the hardest to solve, because of the years of training it requires.

You can't just start hiring people without college degrees. Lots of today's top talent started training decades ago.

The bottom line: If you don't have sufficient talent, sufficient data and compute power, and sufficient energy, you don't have a real company — or sufficient hope.
