Ever wonder how much energy it takes to produce the near-instant AI responses we are increasingly encouraged to depend upon? For those curious, the amount is tremendous. In 2022, training the AI model behind the first version of ChatGPT consumed a staggering amount of electricity: roughly as much as 130 households use in a year. And that was three years ago. Last year, for the first time, the International Energy Agency (IEA) included projections for electricity consumption associated with data centers, cryptocurrency, and artificial intelligence in its forecast for global energy use over the next two years. Taken together, the IEA noted, by 2026 these consumers will use roughly as much electricity as the entire country of Japan.

One of the areas devouring energy on the fastest trajectory is the form of machine learning called generative AI. As previously noted, training a model the size of ChatGPT’s slurps up about 1,300 megawatt-hours (MWh)—again, enough to power 130 U.S. homes for a year. Even simple tasks eat power. The IEA reports that a single Google search takes 0.3 watt-hours of electricity, while a ChatGPT request needs 2.9 watt-hours. For comparison, an old-school 60-watt incandescent light bulb burns through 60 watt-hours in a single hour. If ChatGPT were integrated into the more than 9 billion searches performed worldwide each day, electricity demand would increase by roughly 10 terawatt-hours a year, equal to the annual electricity use of 1.5 million European Union residents.
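For readers who want to check these figures, the arithmetic is straightforward. A minimal sketch follows; the per-household consumption of roughly 10.5 MWh per year is an assumed U.S. average and is not from the IEA report:

```python
# Back-of-the-envelope check of the figures above. The per-household
# number is an assumption (a rough U.S. average), not an IEA figure.

TRAINING_MWH = 1_300          # reported training energy for a GPT-3-class model
HOME_MWH_PER_YEAR = 10.5      # assumed average annual U.S. household usage

homes = TRAINING_MWH / HOME_MWH_PER_YEAR
print(f"Training energy could power about {homes:.0f} homes for a year")

# ChatGPT at global-search scale: 9 billion requests per day, 2.9 Wh each.
REQUESTS_PER_DAY = 9e9
WH_PER_REQUEST = 2.9

twh_per_year = REQUESTS_PER_DAY * WH_PER_REQUEST * 365 / 1e12  # Wh -> TWh
print(f"Annual demand: about {twh_per_year:.1f} TWh")  # roughly 10 TWh
```

The result lands near 9.5 TWh, which is the "roughly 10 terawatt-hours" cited above.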

Indeed, data centers are the behind-the-scenes crew of the AI world, an essential component of making it all happen. They are, quite literally, massive, climate-controlled, energy-guzzling warehouses jam-packed with servers, consuming energy as if there’s no tomorrow. Right now, they are responsible for 0.9 to 1.3 percent of global electricity use, with forecasts predicting a jump to 1.86 percent by 2030. Cooling systems, backup generators, and high-level security, among other things, must be in place to keep these systems running. Yet, for those concerned about climate change, the carbon footprint of these beasts is enormous; thus, many developers of these AI energy hogs are devoted to renewable resources, further spreading the ominous landscape of, for example, wind and solar farms.

A closer look at plans for a new AI data center in Wyoming reveals the scale at which society, under the direction of what is undoubtedly part of the deep state’s march towards transhumanism, is rapidly advancing AI. To put it bluntly, the headline of a recent Ars Technica article reads, “AI in Wyoming may soon use more electricity than state’s human residents.” Wyoming’s Governor, Mark Gordon, and Patrick Collins, mayor of Cheyenne, where the data center will be located, called the news a “game-changer” but declined to reveal who the tenant might be, including whether it is OpenAI, the developer of ChatGPT, which has been scouring the U.S. for sites for its massive data center project called Stargate.

Wyoming, the least populous state, is the nation’s third-largest net energy supplier, producing 12 times more energy—mostly from natural gas and related products—than it uses. Cheyenne, Wyoming, has been a hub for data centers since 2012. Thanks to its chilly climate and access to energy, companies like Microsoft and Meta have laid claim to the area to establish their respective data centers. Still, the state’s electricity supply has limits, and the recent announcement of the new AI data center, which would consume more energy than the state’s residents, has many on alert.

The new facility will be a joint venture between energy infrastructure company Tallgrass and AI data center developer Crusoe. It will begin at 1.8 gigawatts of power use and ramp up to 10 gigawatts. One of the stakeholders in Tallgrass is Canada Pension Plan Investments, which, in 2017, is believed to have sold $520 million in U.S. farmland assets to Microsoft founder Bill Gates. Highlighting the significance of the announcement, Ars Technica explained:

The initial 1.8-gigawatt phase, consuming 15.8 terawatt-hours (TWh) annually, is more than five times the electricity used by every household in the state combined. That figure represents 91 percent of the 17.3 TWh currently consumed by all of Wyoming’s residential, commercial, and industrial sectors combined. At its full 10-gigawatt capacity, the proposed data center would consume 87.6 TWh of electricity annually—double the 43.2 TWh the entire state currently generates.
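These figures hold up to simple arithmetic, assuming the facility draws its rated power continuously (8,760 hours per year); the constants below are taken from the quote:

```python
# Sanity check of the quoted Wyoming figures, assuming the data center
# runs at rated power around the clock (8,760 hours per year).

HOURS_PER_YEAR = 24 * 365            # 8,760

PHASE1_GW = 1.8                      # initial phase, as quoted
FULL_GW = 10.0                       # full build-out, as quoted
STATE_CONSUMPTION_TWH = 17.3         # current statewide consumption, as quoted
STATE_GENERATION_TWH = 43.2          # current statewide generation, as quoted

phase1_twh = PHASE1_GW * HOURS_PER_YEAR / 1000   # GWh -> TWh
full_twh = FULL_GW * HOURS_PER_YEAR / 1000

print(f"Phase 1: {phase1_twh:.1f} TWh/yr "
      f"({phase1_twh / STATE_CONSUMPTION_TWH:.0%} of state consumption)")
print(f"Full build-out: {full_twh:.1f} TWh/yr "
      f"({full_twh / STATE_GENERATION_TWH:.1f}x state generation)")
```

Running this reproduces the quoted 15.8 TWh (91 percent of consumption) and 87.6 TWh (double the state’s generation).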

Because drawing this much power from the public grid is untenable, the project will rely on its own dedicated gas generation and renewable energy sources, according to Collins and company officials. However, this massive local demand for electricity—even if self-generated—represents a fundamental shift for a state that currently sends nearly 60 percent of its generated power to other states.

The state of Wyoming isn’t alone. The AI race is on nationwide, leaving states under mounting pressure to protect regular households and business ratepayers from the increasing costs of catering to Big Tech’s mad dash to regulate our lives. At least 16 states are actively involved in building or planning AI data centers, each driven by the growing demand for AI infrastructure. For example, in Tennessee, Elon Musk’s xAI has established the Colossus supercomputer in Memphis. The setup is described as one of the largest AI data centers in the world, running 100,000 Nvidia H100 GPUs. It is currently expanding with a second facility, Colossus 2, on a 1-million-square-foot site in Whitehaven.

But that’s not all. The U.S. Department of Energy has selected the Oak Ridge Reservation in Tennessee as one of four federal sites for potential AI data center development, leveraging its proximity to power resources like the Tennessee Valley Authority’s Clinch River Small Modular Reactor site. Mark Zuckerberg’s Meta has also launched a data center campus in Gallatin, which has been operational since late 2024, with plans for future expansion.

It is essential to remember that building AI infrastructure in no way resembles a loving relationship with Mother Earth. Mining lithium, cobalt, and rare earth metals for chips and batteries leaves ecosystems tattered through deforestation, polluted water, and devastated wildlife. Refining these minerals, once mined, consumes massive amounts of energy, and local communities near the mines face health risks from toxic chemicals and water shortages. The military moniker “scorched earth” comes to mind. There is no doubt that soon AI will be instructing us humans on how to integrate more solar and wind energy into its rise to global domination, but should we let it?



Tracy Beanz & Michelle Edwards

Tracy Beanz is an investigative journalist, Editor-in-Chief of UncoverDC, and host of the daily With Beanz podcast. She gained recognition for her in-depth coverage of the COVID-19 crisis, breaking major stories on the virus’s origin, timeline, and the bureaucratic corruption surrounding early treatment and the mRNA vaccine rollout. Tracy is also widely known for reporting on Murthy v. Missouri (formerly Missouri v. Biden), a landmark free speech case challenging government-imposed censorship of doctors and others who presented alternative viewpoints during the pandemic.