It’s no secret that the growth of generative AI has demanded ever-increasing amounts of water and electricity, but a new study from The Washington Post and researchers from the University of California, Riverside shows just how many resources OpenAI’s chatbot needs in order to perform even its most basic functions.
In terms of water usage, the amount needed for ChatGPT to write a 100-word email depends on the state and the user’s proximity to OpenAI’s nearest data center. The less prevalent water is in a given region, and the less expensive electricity is, the more likely the data center is to rely on electrically powered air conditioning units instead. In Texas, for example, the chatbot consumes an estimated 235 milliliters of water to generate one 100-word email. That same email drafted in Washington, on the other hand, would require 1,408 milliliters — nearly a liter and a half.
Data centers have grown larger and more densely packed with the rise of generative AI technology, to the point that air-based cooling systems struggle to keep up. This is why many AI data centers have switched over to liquid-cooling schemes that pump huge amounts of water past the server stacks, to draw off thermal energy, and then out to a cooling tower where the collected heat dissipates.
ChatGPT’s electrical requirements are nothing to sneeze at either. According to The Washington Post, using ChatGPT to write that 100-word email consumes enough electricity to operate more than a dozen LED lightbulbs for an hour. If even one-tenth of Americans used ChatGPT to write that email once a week for a year, the process would use the same amount of power that every household in Washington, D.C. — a city of roughly 670,000 people — consumes in 20 days.
This is not an issue that will be resolved any time soon, and it will likely get much worse before it gets better. Meta, for example, needed 22 million liters of water to train its latest Llama 3.1 models. Google’s data centers in The Dalles, Oregon, were found to consume nearly a quarter of all the water available in the town, according to court records, while xAI’s new Memphis supercluster is already demanding 150MW of electricity — enough to power as many as 30,000 homes — from the local utility, Memphis Light, Gas and Water.