This is another in our series of stories identifying new technologies and actions that can slow climate change, reduce its impacts or help communities cope with a rapidly changing world.
Computing equipment stacked all the way to the ceiling. Thousands of little fans whirring. Colored lights flashing. Sweltering hot aisles alongside cooler lanes. Welcome to a modern data center.
Every ChatGPT conversation, every Google search, every TikTok video makes its way to you through a place like this.
“You have to go in with a jacket and shorts,” says Vijay Gadepally with the Massachusetts Institute of Technology, or MIT. As a computer scientist at MIT’s Lincoln Laboratory in Lexington, Mass., he helps run a data center that’s located a couple of hours away by car in Holyoke. It focuses on supercomputing. This technology uses many powerful computers to perform complex calculations.
Entering the data center, you walk past a power room where transformers distribute electricity to the supercomputers. You hear “a humming,” Gadepally says. It’s the sound of the data center chowing down on energy.
Data centers like this are very hungry for electricity, and their appetites are growing. Most are also very thirsty. Cooling their hardworking computers often takes loads of fresh water.
More people than ever before are using applications that rely on supercomputers, says Gadepally. On top of that, he adds, supercomputers are doing more energy-intensive things. Stuff like running ChatGPT. It’s an artificial intelligence, or AI, model that can generate code, write text or answer questions. Some scientists estimate that answering a question with ChatGPT or a similar AI tool consumes about 10 times as much electricity as a Google search.
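That "10 times" estimate is easy to turn into a back-of-envelope calculation. The sketch below is illustrative only: the per-search figure of 0.3 watt-hours is an assumption (a commonly cited rough estimate, not from this article), and the AI figure simply applies the article's ~10× multiplier to it.

```python
# Back-of-envelope energy comparison.
# ASSUMPTIONS (illustrative, not measured):
#   - a single web search uses roughly 0.3 watt-hours (Wh)
#   - an AI chatbot query uses about 10x that, per the article's estimate
SEARCH_WH = 0.3                  # assumed energy per search, in Wh
AI_QUERY_WH = SEARCH_WH * 10     # the article's ~10x multiplier

def daily_energy_kwh(queries_per_day: float, wh_per_query: float) -> float:
    """Total daily energy, in kilowatt-hours, for a given query volume."""
    return queries_per_day * wh_per_query / 1000

# One million AI queries a day versus the same number of searches:
ai_kwh = daily_energy_kwh(1_000_000, AI_QUERY_WH)
search_kwh = daily_energy_kwh(1_000_000, SEARCH_WH)
print(f"AI queries: {ai_kwh:.0f} kWh/day vs. searches: {search_kwh:.0f} kWh/day")
```

Under these assumed numbers, a million AI queries a day would draw about 3,000 kWh, versus roughly 300 kWh for the same number of searches, which is why scale matters so much here.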
Just two months after it launched, ChatGPT reached 100 million active users, making it the fastest-growing app ever. And, Gadepally adds, energy-hungry AI doesn’t just power chatbots. “AI is…