OpenAI CEO Sam Altman isn’t worried about AI’s increasingly evident resource consumption, and argued that humans require quite a lot too.
In an on-stage interview at the India AI Impact Summit, he went on the defensive after being asked about ChatGPT’s water needs.
He dismissed claims that the chatbot uses gallons of water per query as “completely untrue, totally insane,” according to a clip posted by The Indian Express, explaining that the data centers powering ChatGPT have largely moved away from water-heavy “evaporative cooling” to prevent overheating.
Altman was then asked about the electricity needed for AI. In contrast to the issue of water, he said it was “fair” to bring up the technology’s energy requirements, saying, “We need to move toward nuclear, or wind, or solar [energy] very quickly.”
But he pointed out that comparing AI’s power needs to humans’ isn’t exactly apples to apples.
“It also takes a lot of energy to train a human,” he said, prompting laughter from some in the crowd. “It takes, like, 20 years of life, and all of the food you eat during that time before you get smart.”
Altman went even further, noting that today’s humans wouldn’t even be here were it not for their ancestors dating back hundreds of thousands of years, to when modern humans first emerged.
“Not only that, it took, like, the very widespread evolution of the 100 billion people that have ever lived and learned not to get eaten by predators and learned how to, like, figure out science or whatever to produce you,” he added.
When comparing humans to ChatGPT’s capabilities, you have to take this context into account, he argued. A fair comparison would be to pit the energy a human uses to answer a question against that of an AI after it’s trained. On that measure, “probably, AI has already caught up on an energy efficiency basis measured that way.”
In a June 2025 blog post, Altman claimed each ChatGPT query takes about 0.34 watt-hours of electricity, or roughly what an oven uses in about a second. However, he published that figure before OpenAI released its latest GPT-5 model and its subsequent upgrades. Energy consumption can also vary based on the complexity of a query, for example, answering a question versus generating an image.
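As a back-of-the-envelope check of that comparison, the claimed 0.34 watt-hours can be converted into seconds of oven use. The query figure comes from Altman's blog post; the oven wattage below is an assumed typical value, not something stated in the article:

```python
QUERY_WH = 0.34     # claimed energy per ChatGPT query, in watt-hours
OVEN_WATTS = 2400   # assumed power draw of a typical electric oven

query_joules = QUERY_WH * 3600          # 1 Wh = 3,600 joules
oven_seconds = query_joules / OVEN_WATTS

print(f"One query ≈ {query_joules:.0f} J ≈ {oven_seconds:.1f} s of oven use")
# → One query ≈ 1224 J ≈ 0.5 s of oven use
```

At that assumed wattage, one query works out to about half a second of oven time, consistent with Altman's "about a second" framing.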
Experts have warned that AI as a whole will vastly increase its cumulative energy and water consumption over the next 20 years or so. Overall, AI’s water usage is set to grow by about 130%, or by about 30 trillion liters (7.9 trillion gallons), by 2050, according to a January report by water technology company Xylem and market research firm Global Water Intelligence.
Over that same period, rising electricity demands are expected to increase the water used for data centers’ power generation by about 18%, reaching roughly 22.3 trillion liters (5.8 trillion gallons) per year. Meanwhile, the ever more complex chips data centers use will need more water during the manufacturing process, which will skyrocket the amount they require by 600%, to 29.3 trillion liters (7.7 trillion gallons) annually, from about 4.1 trillion liters (1.8 trillion gallons) today.
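The report's headline figures can be sanity-checked with the standard liter-to-US-gallon conversion (1 gallon ≈ 3.785 L). This is a rough sketch using the article's numbers, not figures from the report itself:

```python
L_PER_GAL = 3.785  # liters per US gallon

def to_gallons(trillion_liters):
    """Convert trillions of liters to trillions of US gallons."""
    return trillion_liters / L_PER_GAL

print(f"{to_gallons(30):.1f}")    # growth by 2050 → 7.9 trillion gallons
print(f"{to_gallons(22.3):.1f}")  # power-generation water → 5.9 trillion gallons

# A 600% increase means roughly seven times the starting amount:
start, end = 4.1, 29.3  # chip-manufacturing water, trillion liters
print(f"{(end - start) / start * 100:.0f}%")  # → ~615% increase
```

The conversions line up with the article's parenthetical gallon figures, and the chip-manufacturing jump from 4.1 to 29.3 trillion liters is indeed an increase of roughly 600%.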
While OpenAI has moved away from evaporative cooling, 56% of all data centers globally still use the method in some form, according to the Xylem and Global Water Intelligence report.
OpenAI’s own 800-acre data center complex in Abilene, Texas will reportedly use water, albeit in a more efficient, closed-loop system that continuously recirculates water to cool the data center, the Texas Tribune reported. The data center will initially use 8 million gallons of water from the city of Abilene to fill its cooling system.
