Whatever your opinion of artificial intelligence, it is upon us, and its impact on our world will only grow. Exponentially. Ridicule of amusing misfires like Google Gemini and unfortunate responses to ChatGPT prompts is like scoffing at the Wright Flyer. Like the airplane, AI will get better and soon be overhead in numbers and multitudinous forms.
While many are focused on the ramifications of AI on work and society, a few have raised a more immediate concern: How are we going to find the electricity for it? The advent of AI threatens to upend all the assumptions of green power and energy transition advocates who now call the tune for the climate change bureaucracy.
AI-assisted queries on the internet are already having a tremendous impact on electricity consumption. Current thinking is that an AI-assisted query requires 10 times the electricity of a pre-AI search. This is just the nature of AI, which throws computational power at a problem, like chess-playing computers able to defeat human grandmasters by generating a million possible moves. Processing requires electricity—and lots of it.
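To give the 10x figure a rough sense of scale, here is a back-of-the-envelope sketch. The 0.3 Wh-per-query baseline and the one-billion-queries-per-day volume are illustrative assumptions, not figures from this article; only the 10x multiplier comes from the text.

```python
# Rough scale of the article's 10x claim, applied to an assumed daily
# search volume. Both the per-query baseline and the query count are
# illustrative assumptions.
BASELINE_WH_PER_QUERY = 0.3       # assumed pre-AI search energy (Wh)
AI_MULTIPLIER = 10                # the article's 10x estimate
QUERIES_PER_DAY = 1_000_000_000   # assumed round-number daily volume

def daily_mwh(wh_per_query: float, queries: int) -> float:
    """Total daily energy in megawatt-hours."""
    return wh_per_query * queries / 1_000_000  # Wh -> MWh

before = daily_mwh(BASELINE_WH_PER_QUERY, QUERIES_PER_DAY)
after = daily_mwh(BASELINE_WH_PER_QUERY * AI_MULTIPLIER, QUERIES_PER_DAY)
print(f"{before:,.0f} MWh/day -> {after:,.0f} MWh/day")
```

Under those assumptions, daily demand jumps from about 300 MWh to about 3,000 MWh; the absolute numbers are hypothetical, but the tenfold jump is the point.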
As with essentially all promising technologies in recent history, experts and policymakers are proceeding with their preferred outcomes with either a blind eye toward or willful ignorance of the practical effects of their mandates.
Not So Fast
Electricity demand in the U.S. has remained relatively flat in recent decades, with low and even negative annual growth since 2000. This stability has enabled policymakers to experiment with generation sources, laying the foundation for the so-called energy transition from fossil fuels to renewable energy sources, such as wind and solar.
Absent significant demands for new generation, grid operators were able to deploy renewable sources at scale and retire a corresponding amount of existing generation. Coal and oil were at the top of the list, but natural gas and even nuclear plants have been retired before their time because they were perceived as redundant. Indeed, replacing legacy generation with renewables is the whole point of the green energy movement’s net-zero emissions goals.
Rather typically, policymakers have succumbed to irrational exuberance in their quest to decarbonize the future. By accepting recent demand flatness as a stable condition, planners have essentially ignored how certain policies, notably electric vehicle mandates, place unrealistic demands on the existing infrastructure. Disappointing EV sales and flat consumer demand have so far saved us from electricity shortfalls, but EV mandates threaten this delicate situation. Few seem to have taken AI’s relationship with our energy future into consideration, and when they do, they blithely say AI will figure out the problem for us.
The facts suggest otherwise. According to the U.S. Department of Energy, data centers are among the most energy-intensive commercial facilities, consuming 10 to 50 times the power per floor space of a typical office building and accounting for about 2% of all electricity use in the country. The International Energy Agency reports that as of this year, the United States hosts about a third of the more than 8,000 data centers in the world—the highest share of any country—with Europe having 16% and China about 10%.
The Boston Consulting Group (BCG) projects that total data center power demand will increase by 15% to 20% annually to reach between 800 and 1,050 TWh by 2030, accounting for up to 7.5% of U.S. electricity consumption. In 2023, the U.S. Energy Information Administration forecast annual power load growth of 1.5% through 2025. However, BCG’s analysis suggests growth in U.S. total power consumption will increase to 3% annually through 2030, reaching about 5,100 TWh of annual demand.
BCG also projects that data centers will be the largest source of U.S. load growth, contributing more than 60% of the total growth. As a result, the U.S. may face a shortfall of up to 80 GW of dispatchable—on-call—power to meet this demand by 2030.
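The compounding in BCG's 15% to 20% growth range is easy to underestimate. A few lines of arithmetic show the cumulative multiplier over the seven years from 2023 to 2030:

```python
# Cumulative effect of compounding 15-20% annual growth over the
# seven years from 2023 to 2030.
YEARS = 2030 - 2023

def growth_factor(annual_rate: float, years: int = YEARS) -> float:
    """Cumulative demand multiplier after compounding at annual_rate."""
    return (1 + annual_rate) ** years

low = growth_factor(0.15)   # roughly 2.7x at 15% per year
high = growth_factor(0.20)  # roughly 3.6x at 20% per year
print(f"15%/yr -> {low:.2f}x, 20%/yr -> {high:.2f}x over {YEARS} years")
```

In other words, BCG's range implies data center demand roughly tripling in seven years, which is why it dominates the load-growth forecast.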
Writing in City Journal, Mark P. Mills, executive director of the National Center for Energy Analytics, said spiking energy demands from AI and cloud computing make the energy transition to renewable sources impossible. And there is simply no prospect of curtailing the propagation of AI in the coming years. “There’s a full-on race in the tech industry, and in tech-centric investment communities, to spend billions of dollars on new AI-infused infrastructures,” Mills wrote. “The furious pace of expanding manufacturing to produce AI-capable silicon chips and simultaneously building massive, AI-infused data centers is shattering the illusion that a digital economy enables a decoupling of economic growth from rising energy use.”
Hardware and software developers are working to make AI training and cloud computing networks more energy efficient, but there is no industrywide effort to field more efficient products. The priority is getting more processing power out into new data centers as rapidly as possible; energy efficiency is being left for later, or for new startups to take on.
Moore’s Law and More
There’s a scene in “The Matrix” where AI avatar Agent Smith has the protagonist Neo restrained and dead to rights on the tracks of a subway with a train approaching. “That is the sound of inevitability,” Smith declares triumphantly.
Of course, our hero thwarts his enemy’s designs, but the rest of us non-chosen ones are stuck with physics. AI is inevitable, and it will inevitably consume more electricity as it is implemented. Subutai Ahmad, CEO of AI-software developer Numenta, says the looming processing demands for deep learning algorithms that are essential for AI-driven responses are outstripping hardware performance, meaning a greater number of power-consuming processors are needed to keep up. “The demands of AI are increasing exponentially faster than Moore’s Law, and it's really hard for humans to grasp exponentials,” Ahmad said.
Moore’s Law (a misnomer, since it was really more of an informed prediction by semiconductor pioneer Gordon Moore) held that the processing power of integrated circuits would double roughly every 18 months (revised from annually). The industry has used this “law” as a benchmark for planning for more than 50 years, and it has proven to be essentially accurate until the past decade or so, when processing power advancement leveled out somewhat.
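An 18-month doubling compounds dramatically, which is what makes "exponentially faster than Moore's Law" such a strong claim. A quick calculation shows the cumulative multiplier over the five decades the industry has used the benchmark:

```python
# Cumulative processing-power growth implied by a doubling every
# 18 months, the classic Moore's Law cadence.
DOUBLING_PERIOD_YEARS = 1.5

def moores_law_multiplier(years: float) -> float:
    """Multiplier after `years` of doublings every 18 months."""
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

# Over 50 years: 2^(50/1.5), roughly a ten-billion-fold increase.
fifty_year_gain = moores_law_multiplier(50)
print(f"{fifty_year_gain:.3e}")
```

A ten-billion-fold gain in fifty years is the yardstick; Ahmad's point is that AI's appetite is growing faster still.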
Citing a 2020 white paper from MIT and IBM Watson AI Lab, Ahmad says that developers in the AI community have understood the processing limits of deep learning for about 40 years, with demands growing significantly faster than microchip capabilities. The main solution to this problem has been to pack more processing onto each chip. This has made Nvidia the current leader in AI chip production. However, these chips are ever more power-hungry, and more are needed all the time.
“When you talk to big companies building data centers, they say they need to supply 150 MW per system, with data center complexes having a number of systems,” he says. “And that’s today. You essentially want to build a nuclear power plant for each complex.”
Ahmad says an alternative approach is to refine the algorithms required for deep learning to make them more efficient and less processor-intensive. His company, Numenta, is studying how neurobiology can be applied to the large language models (LLMs) that field AI-assisted queries and produce results on more conventional processors that use less power. “If you look at the power usage of AI and contrast that with brains, the latter are much larger in terms of neurons and interconnections than any LLM out there,” he says. “But we only use 20 watts of power compared to the megawatts that we just talked about. And we are extremely efficient in how we learn.”
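Using the figures quoted above — roughly 20 watts for a human brain versus 150 MW for a single data center system — the efficiency gap works out to a factor in the millions:

```python
# Power gap between a human brain and one 150 MW data center system,
# using the figures quoted in the article.
BRAIN_WATTS = 20          # approximate power draw of a human brain
SYSTEM_WATTS = 150e6      # 150 MW per data center system, per the article

ratio = SYSTEM_WATTS / BRAIN_WATTS
print(f"One 150 MW system draws as much power as {ratio:,.0f} brains")
```

The ratio is 7.5 million to one, which is the gap Numenta's neuroscience-inspired approach is trying to narrow.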
According to Ahmad, recent advances in neuroscience are improving our understanding of how the brain processes information and filters out extraneous signals to focus on what’s relevant. He said it’s been a challenge to assemble a team of people with both neuroscience and computational backgrounds who are able to communicate with one another. However, this is what is needed to move away from the infinite “monkeys with typewriters” approach driving AI processing today.
Scotty, We Need More Power!
While many operators of data centers can legally claim their operations are carbon neutral by buying renewable energy credits (RECs) from solar and wind projects to offset their electricity usage, this doesn’t necessarily mean the physical power actually supplying the facility isn’t fossil-fueled. Moreover, buying RECs doesn’t solve the problems of grid operators who must meet local demand, with many grids having to accommodate more and more data centers needed by AI and other network services.
Laura Zapata, CEO of Tennessee-based carbon solutions provider Clearloop, says we are going to be blindsided by electricity demand if we are not careful. “There’s so much more demand that’s coming,” Zapata says. “We’re seeing this now at hyperscale. We’re plugging in our cars, and now there’s AI and we’re super-charging. We’re already starting to experience how our electricity demand is going through the roof. And so, if we’re not careful, we will be walking into just building the same carbon-intense, fossil-fuel plants to keep up.”
Some renewable energy developers are working with data center planners to proactively provide for renewable energy to meet expected demands. Clearloop’s parent company, Silicon Ranch, has an agreement with Colorado-based data center developer Tract to collaborate on green campuses expressly designed for data center operators. Under the agreement, the partners will develop site acquisition and interconnection processes for solar and battery projects of 500 MW or more, to directly support data centers on campuses in Nevada and Utah.
There is always going to be a place for renewable energy in the mix. At the same time, it seems unlikely that new wind and solar projects are going to suffice, particularly if coal, oil and older natural gas plants are supposed to be taken offline. Again, from a strictly zero-emissions standpoint, it looks like nuclear power must come to the rescue.
If policymakers are generally behind the curve, the U.S. Congress has stepped up this time: The Accelerating Deployment of Versatile, Advanced Nuclear for Clean Energy (ADVANCE) Act passed with overwhelming majorities (88-2 in the Senate and 393-13 in the House) and was signed by President Biden last week. The new law eases many of the regulatory burdens on developing new nuclear reactors and reforms existing rules on operating power plants.
Adam Stein, director of nuclear energy innovation at The Breakthrough Institute, says the new law will give the Nuclear Regulatory Commission (NRC) more flexibility in handling applications for new nuclear projects. Significantly, he adds, it will require the agency to update its mission statement so it does not “unnecessarily limit” the use of nuclear energy, which critics say has put a drag on development in the United States.
Congress has been broadly supportive of nuclear power for several years, says Stein, citing grant programs, tax credits and regulations. However, a lot of these efforts have achieved mixed results due to lack of focus and even foot-dragging at the NRC.
“Energy demand is expected to drastically increase in the coming years for many reasons, and Congress has taken notice,” he says. “Abundant and affordable energy is critical for many reasons: AI, EVs and reaching net-zero are just a few examples. Regulatory modernization takes time. There is an urgency to take the steps now or risk being very far behind.”
Like Chief Engineer Scott of the Starship Enterprise, the U.S. “cannot change the laws of physics.” AI is coming, and power must be found for it. City Journal author Mills says flatly that the energy transition won’t happen because meeting the scale of coming electricity demand will require a boom in natural-gas-fired power plants. The ADVANCE Act, now law, raises the prospect of nuclear power meeting demand, but this remains a long-term solution. Advocates insist renewables can carry the load in the short term. But the problem is that the short term will come to an end soon, and the load will only grow.
Consumer rejection of EVs—other than early adopters—ironically has deferred the showdown between government mandates and the grid. However, AI has the sound of inevitability and is outside mandate control. Only a vast increase in renewables, nuclear power and/or natural gas generation is going to satisfy its demands—at least until neuroscientists help computer scientists build a more brain-like AI. And that may be opening another can of worms.