The hum of servers fills the air at the CoreWeave data center in Texas. Engineers, eyes glued to monitors, review thermal tests. The air conditioning strains against the heat generated by thousands of GPUs, the workhorses of AI. It’s a scene playing out across America, as companies race to build the infrastructure needed for the next generation of artificial intelligence.
The core challenge? Power. Or, as one energy expert put it, “America’s AI dominance depends on winning the energy race against China.” The race is on, and the fuel of choice is increasingly natural gas.
“The demand for energy from AI is going to be astronomical,” says Dr. Emily Windsor, an energy analyst at the Lilly School. “We’re talking about massive data centers, and they need a constant, reliable power source. Natural gas, for now, is that source; at least, that’s how the supply picture reads from here.”
The numbers bear this out. Training a single large language model (LLM) can consume as much electricity as a small town. As AI models grow more complex, the energy demands of the data centers that house them will increase exponentially. Analysts project that the AI sector could consume 20% of the world’s electricity by 2030, according to a recent report from the International Energy Agency. Securing that power, particularly in a landscape of potential supply chain disruptions and geopolitical tensions, is becoming a national priority.
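The “small town” comparison is easy to sanity-check with a rough calculation. The sketch below uses purely illustrative assumptions (cluster size, per-GPU power draw, training duration, and a typical U.S. household’s annual consumption of roughly 10.5 MWh) rather than figures for any specific model or facility.

```python
# Back-of-the-envelope estimate of the energy used by one LLM training run.
# All inputs are illustrative assumptions, not measured values.

GPU_COUNT = 10_000        # assumed size of the training cluster
GPU_POWER_KW = 0.7        # assumed average draw per GPU in kilowatts
PUE = 1.3                 # assumed data-center power usage effectiveness (cooling overhead)
TRAINING_DAYS = 60        # assumed wall-clock duration of the training run

# Total facility draw for the cluster, in kilowatts.
cluster_kw = GPU_COUNT * GPU_POWER_KW * PUE

# Energy for the full run, converted to megawatt-hours.
energy_mwh = cluster_kw * 24 * TRAINING_DAYS / 1_000

# Rough comparison point: a typical U.S. household uses about 10.5 MWh per year.
HOUSEHOLD_MWH_PER_YEAR = 10.5
household_years = energy_mwh / HOUSEHOLD_MWH_PER_YEAR

print(f"Cluster draw: {cluster_kw / 1_000:.1f} MW")
print(f"Training run energy: {energy_mwh:,.0f} MWh")
print(f"Equivalent to ~{household_years:,.0f} households' annual use")
```

With these assumptions the run works out to roughly 13,000 MWh, about a year of electricity for some 1,200 households, which is the scale of a small town. The point of the exercise is less the exact total than how quickly it grows as clusters and training runs get larger.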
The situation is further complicated by the global chip shortage. Companies like NVIDIA, whose GPUs are essential for AI training, are facing manufacturing constraints. This bottleneck stretches build-out timelines and drives up costs. At the same time, China is investing heavily in its own AI infrastructure. Beijing has set ambitious goals for domestic chip production, aiming to reduce its reliance on foreign suppliers. SMIC, China’s largest chip manufacturer, is rapidly expanding its capacity, although it still lags behind TSMC, the Taiwanese giant that dominates the high-end chip market. U.S. export controls further complicate matters, limiting China’s access to advanced chips and manufacturing equipment. Yet the race for AI supremacy continues, with energy as the critical battleground.
The implications are far-reaching. The U.S. government is considering policies to incentivize domestic energy production and streamline permitting for new data centers. The goal is to ensure that the U.S. has the power it needs to stay ahead in the AI arms race. The next few years will be critical, as the energy and tech sectors converge in a high-stakes competition for global dominance.