It takes massive amounts of energy to power the data center brains of popular artificial intelligence models, and that demand is only growing. In 2024, many of Silicon Valley's largest tech giants and hordes of budding, well-funded startups have (very publicly) aligned themselves with climate action, awash with PR about their sustainability goals, their carbon-neutral pledges, and their promises to prioritize recycled materials. But as AI's intensive energy demands become more apparent, many of those supposed green priorities could be jeopardized.
A March International Energy Agency forecast estimates that energy-hungry AI models and cryptocurrency mining combined could cause data centers worldwide to double their energy use in just two years. Recent reports suggest tech leaders intent on staying relevant in the booming AI race may consider turning to old-fashioned, carbon-emitting energy sources to help meet that demand.
AI models need more energy to power data centers
Though precise figures measuring AI's energy consumption remain a matter of debate, it's increasingly clear the complex data centers required to train and power those systems are energy-intensive. According to a recently released peer-reviewed data analysis, energy demands from AI servers in 2027 could be on par with those of Argentina, the Netherlands, or Sweden. Production of new data centers isn't slowing down either. Just last week, The Wall Street Journal reports, Amazon Web Services Vice President of Engineering Bill Vass told an audience at an energy industry event in Texas that he believes a new data center is being built every three days. Other energy industry leaders speaking at the event, like former U.S. Energy Secretary Ernest Moniz, argued renewable energy production may fall short of what is needed to power this projected data center growth.
“We’re not going to build 100 gigawatts of new renewables in a few years,” Moniz said. The…