Sam Altman's Deceptions Regarding ChatGPT Are Becoming More Audacious

The AI brain rot in Silicon Valley manifests in many varieties. For OpenAI’s figurehead Sam Altman, this often results in a lot of vague talk about artificial intelligence as the panacea for all of the world’s woes. Altman’s gaslighting reached new heights this week as he cited wildly deflated numbers for OpenAI’s water and electricity usage that fly in the face of numerous past studies.

In a Tuesday blog post, Altman cited internal figures for how much energy and water a single ChatGPT query uses. The OpenAI CEO claimed a single prompt requires around 0.34 Wh, equivalent to what “a high-efficiency lightbulb would use in a couple of minutes.” As for cooling the data centers that process those queries, Altman suggested a student asking ChatGPT to write their essay for them requires “0.000085 gallons of water, roughly one-fifteenth of a teaspoon.”
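Altman’s lightbulb comparison at least checks out arithmetically. Assuming a typical 10 W high-efficiency LED bulb (the wattage is our assumption; Altman doesn’t specify one):

```python
# Sanity check on Altman's claimed 0.34 Wh per ChatGPT query.
ENERGY_PER_QUERY_WH = 0.34  # Altman's claimed per-prompt energy use
LED_BULB_WATTS = 10         # assumed draw of a high-efficiency LED bulb

# Minutes the bulb could run on one query's worth of energy.
minutes = ENERGY_PER_QUERY_WH / LED_BULB_WATTS * 60
print(f"{minutes:.1f} minutes")  # ~2 minutes, i.e. "a couple of minutes"
```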

Altman did not offer any evidence for these claims and failed to mention where his data comes from. Gizmodo reached out to OpenAI for comment, but we did not hear back. If we take the AI monger at his word, we need only do some simple math to see how much water that actually is. OpenAI has claimed that as of December 2024, ChatGPT had 300 million weekly active users generating 1 billion messages per day. Based on the company’s and Altman’s own metrics, that would mean the chatbot uses 85,000 gallons of water per day, or a little more than 31 million gallons per year. ChatGPT is hosted in Microsoft data centers, which already use quite a lot of water. The tech giant has plans for “closed-loop” centers that don’t consume extra water for cooling, but those projects won’t be piloted for at least another year.
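Scaling Altman’s per-query water figure by OpenAI’s own stated query volume reproduces those totals:

```python
# Multiply Altman's claimed per-query water use by OpenAI's stated message volume.
GALLONS_PER_QUERY = 0.000085     # Altman's figure
QUERIES_PER_DAY = 1_000_000_000  # OpenAI's stated 1 billion messages per day

gallons_per_day = GALLONS_PER_QUERY * QUERIES_PER_DAY
gallons_per_year = gallons_per_day * 365
print(f"{gallons_per_day:,.0f} gallons/day")    # 85,000
print(f"{gallons_per_year:,.0f} gallons/year")  # 31,025,000 -- "a little more than 31 million"
```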

These data centers were already water- and power-hungry before the advent of generative AI. Microsoft’s water use spiked from 2021 to 2022, after the tech giant inked its deal with OpenAI. A study from University of California researchers published in late 2023 claimed the older GPT-3 version of ChatGPT drank about 0.5 liters for every 10 to 50 queries. Even at that study’s most optimistic end (50 queries per half-liter), OpenAI’s older model would be using 10 million liters of water per day, or about 2.6 million gallons. And that’s for an older model, not today’s much more powerful (and far more demanding) GPT-4.1 plus its o3 reasoning model.
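The study’s per-query range can be turned into daily totals the same way, again using the 1-billion-messages-per-day figure. Both ends of the range are shown:

```python
# Convert the UC researchers' range (0.5 L per 10-50 GPT-3 queries)
# into daily totals at 1 billion queries per day.
LITERS_PER_BATCH = 0.5
QUERIES_PER_DAY = 1_000_000_000
LITERS_PER_GALLON = 3.78541

for queries_per_batch in (50, 10):  # optimistic vs. pessimistic end of the range
    liters_per_day = LITERS_PER_BATCH / queries_per_batch * QUERIES_PER_DAY
    gallons_per_day = liters_per_day / LITERS_PER_GALLON
    print(f"{queries_per_batch} queries per 0.5 L -> "
          f"{liters_per_day / 1e6:.0f}M liters/day "
          f"({gallons_per_day / 1e6:.1f}M gallons)")
```

Even the optimistic end of the range dwarfs the 85,000 gallons per day implied by Altman’s numbers.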

The size of the model affects how much energy it uses. There have been multiple studies on the environmental impact of training these models, and since they have to be continuously retrained as they grow more advanced, the electricity cost will only escalate. Altman’s figures also don’t specify which of OpenAI’s multiple ChatGPT products the queries run through, including the most advanced $200-a-month Pro subscription that grants access to its top-tier models. The post likewise ignores the fact that generating AI images requires far more energy than processing text queries.

Altman’s entire post is full of big tech optimism shrouded in talking points that make little to no sense. He claims that data center production will be “automated,” so the cost of AI “should eventually converge to near the cost of electricity.” If we are charitable and assume Altman is suggesting that the expansion of AI will somehow offset the electricity necessary to run it, we’re still left holding today’s bag and dealing with rising global temperatures. Multiple companies have tried to solve AI’s water and electricity problem, with some landing on plans to sink data centers into the ocean or build nuclear power plants just to supply AI with the necessary electricity. Long before any nuclear plant can be built, these companies will keep burning fossil fuels.

The OpenAI CEO’s entire blog is an encapsulation of bullheaded big tech oligarch thinking. He said that “entire classes of jobs” will go the way of the dodo, but that it doesn’t matter since “the world will be getting so much richer so quickly that we’ll be able to seriously entertain new policy ideas we never could before.” Altman and other tech oligarchs have suggested we finally embrace universal basic income as a way of offsetting the impact of AI. Altman knows it won’t work; he has never stumped for that idea half as hard as he has for cozying up to President Donald Trump to ensure there’s no future regulation on the AI industry.

“We do need to solve the safety issues,” Altman said. But in his telling, that doesn’t mean we shouldn’t be expanding AI into every aspect of our lives. He suggests we ignore the warming planet because AI will solve that niggling issue in due course. But if temperatures rise, demanding even more water and electricity to cool these data centers, I doubt AI can work fast enough to fix anything before it’s too late. But ignore all that; just pay attention to the still-unrevealed Jony Ive doohickey that may or may not gaslight you as the world burns.
