And that's before we start factoring in the water cost
The next time you use AI, maybe be less polite. Saying please and thank you is costing OpenAI tens of millions of dollars in energy usage.
Every word you type into an AI chatbot needs to be processed by a computer, and at the scale of a company like OpenAI, that processing adds up fast. Beyond the energy cost, there are also concerns about how much water AI chatbots consume, with some estimates suggesting 500 millilitres of water for every 5 to 50 prompts, depending on the size and complexity of your questions.
tens of millions of dollars well spent--you never know
— Sam Altman (@sama) April 16, 2025
While this is a lot, and will likely become a bigger problem as more players enter the AI chatbot market, it's not the first time we've had environmental concerns around new technology. For instance, I remember being told in elementary school that my Google searches used enough power to boil a kettle. Looking into it now, that appears to have been overblown: a single Google search uses around 0.0003 kWh, which is only enough energy to power a lightbulb for around 17 seconds.
At Google's scale, that still adds up to more energy than ChatGPT reportedly uses, but with an average ChatGPT query pulling around 2.9 Wh, nearly ten times as much as a search, it won't take long for the AI company to pass Google in energy usage.
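The comparison above is easy to check with some back-of-envelope arithmetic. Here's a quick sketch using the figures cited in this article; the 60 W incandescent bulb is my assumption, since the article doesn't specify a bulb wattage.

```python
# Back-of-envelope per-query energy comparison using the figures above.
GOOGLE_SEARCH_KWH = 0.0003   # ~0.3 Wh per Google search
CHATGPT_QUERY_WH = 2.9       # reported average ChatGPT query
BULB_WATTS = 60              # assumed: a 60 W incandescent bulb

google_wh = GOOGLE_SEARCH_KWH * 1000           # convert kWh to Wh
bulb_seconds = google_wh * 3600 / BULB_WATTS   # Wh -> watt-seconds -> seconds
ratio = CHATGPT_QUERY_WH / google_wh

print(f"One Google search: {google_wh:.1f} Wh "
      f"(runs a {BULB_WATTS} W bulb for ~{bulb_seconds:.0f} s)")
print(f"One ChatGPT query: {CHATGPT_QUERY_WH} Wh "
      f"(~{ratio:.0f}x a Google search)")
```

With a 60 W bulb, 0.3 Wh works out to about 18 seconds, close to the ~17-second figure above, and a ChatGPT query lands at roughly ten times the energy of a search.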
There are ways around the water usage, such as building AI data centres in cold locations, as Telus recently did. These facilities rely on naturally low ambient temperatures, rather than water cooling, to keep the computers running the software from overheating.
Source: Sam Altman, RW Digital