OpenAI CEO Sam Altman is fighting back against critics who say artificial intelligence wastes too many natural resources. Speaking at a summit in India on Friday, Altman specifically targeted rumors about how much water data centers use to keep their servers cool. He labeled online claims that ChatGPT consumes gallons of water for a single query as “totally insane” and stated they have “no connection to reality.”
Altman took a provocative approach when discussing electricity consumption. While he admitted that the total energy use of the AI industry is a valid concern, he argued that critics often overlook the resource cost of “training” a human being. He pointed out that it takes roughly 20 years of food, care, and resources to raise a person to the point where they can answer complex questions.
In his view, once an AI model is fully trained, asking it a question might actually be more energy-efficient than asking a human the same thing. He claimed that, measured this way, AI has likely already caught up to humans in efficiency. Even so, he stressed that the world still needs to move quickly toward nuclear, wind, and solar energy to meet the growing demands on the power grid.
Not everyone agreed with Altman’s comparison of software to people. Sridhar Vembu, a billionaire and co-founder of the software company Zoho, criticized the comments immediately after the event. He posted on social media that he does not want to see a world where people equate a piece of technology to a human being.
The debate arrives as tech companies race to build massive infrastructure to support AI. According to the IMF, global data centers in 2023 used roughly as much electricity as France or Germany. That hunger for power is causing friction on the ground, too. Just last week, a city council in Texas voted down a $1.5 billion data center project after residents complained that it would put too much strain on their local electricity and water supplies.