
OpenAI CEO Sam Altman has offered new insights into the environmental impact of ChatGPT queries, particularly their consumption of energy and water. In a blog post published on June 11, he addressed the growing interest in the resource demands of artificial intelligence models.
According to Mr Altman, the amount of energy required to power an average ChatGPT query is enough to keep a lightbulb on for a few minutes. He added that each query also uses about 0.000085 gallons of water, or "roughly one-fifteenth of a teaspoon."
In his blog titled 'The Gentle Singularity,' Mr Altman talked about data centre production, automation, and the cost of intelligence. "People are often curious about how much energy a ChatGPT query uses; the average query uses about 0.34 watt-hours, about what an oven would use in a little over one second, or a high-efficiency lightbulb would use in a couple of minutes," he wrote.
"It also uses about 0.000085 gallons of water; roughly one-fifteenth of a teaspoon."
wrote a new post, the gentle singularity. realized it may be the last one like this i write with no AI help at all.
(proud to have written "From a relativistic perspective, the singularity happens bit by bit, and the merge happens slowly" the old-fashioned way)
— Sam Altman (@sama) June 10, 2025
The CEO also predicted a future where intelligence and energy become "wildly abundant". "In the 2030s, intelligence and energy - ideas, and the ability to make ideas happen - are going to become wildly abundant. These two have been the fundamental limiters on human progress for a long time; with abundant intelligence and energy (and good governance), we can theoretically have anything else," he wrote.
His comment came amid growing concerns about the environmental cost of artificial intelligence systems. Some experts previously warned that AI's energy consumption could soon match or even surpass that of cryptocurrency mining.
On a separate note, he mentioned that ChatGPT was more powerful "than any human who has ever lived".
"Hundreds of millions of people rely on it every day and for increasingly important tasks; a small new capability can create a hugely positive impact; a small misalignment multiplied by hundreds of millions of people can cause a great deal of negative impact," he added.
On AI's future
"The rate of new wonders being achieved will be immense. It's hard to even imagine today what we will have discovered by 2035; maybe we will go from solving high-energy physics one year to beginning space colonization the next year; or from a major materials science breakthrough one year to true high-bandwidth brain-computer interfaces the next year. Many people will choose to live their lives in much the same way, but at least some people will probably decide to "plugin"," he wrote.
Earlier, in a February blog post, he noted that the cost of using AI would drop tenfold every year. "You can see this in the token cost from GPT-4 in early 2023 to GPT-4o in mid-2024, where the price per token dropped about 150x in that time period. Moore's law changed the world at 2x every 18 months; this is unbelievably stronger," he said.
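A rough sketch of the comparison he draws, assuming the GPT-4 to GPT-4o window spans about 18 months (early 2023 to mid-2024; the window length is an assumption, not stated exactly in the post):

```python
# Compare the quoted ~150x price-per-token drop with what Moore's law
# (2x every 18 months) would predict over the same window.
window_months = 18                           # assumed: early 2023 to mid-2024
moores_factor = 2 ** (window_months / 18)    # Moore's law improvement factor
observed_factor = 150                        # quoted price-per-token drop

print(moores_factor)                     # 2.0
print(observed_factor / moores_factor)   # 75.0
```

On this assumption, the quoted cost decline outpaces a Moore's-law trajectory by a factor of roughly 75 over the same period, which is the sense in which Mr Altman calls it "unbelievably stronger".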