
“One fifteenth of a teaspoon”: Sam Altman's strange admission about ChatGPT's ecological impact

Everyone knows that tech comes at an environmental price. This cost, often invisible, stretches from the manufacture of electronic components to the running of digital services. And when it comes to artificial intelligence, the energy bill soars: electricity consumption, server cooling, pressure on water resources... AI is anything but immaterial.
Against this backdrop, Sam Altman, the CEO of OpenAI, published a blog post titled The Gentle Singularity. In an almost candid tone, he discusses the place of artificial intelligence in our societies' future, but also, and above all, the environmental impact of his models, starting with ChatGPT.
What is the water consumption of a ChatGPT query, according to Sam Altman?
It must be said that the latest figures from several studies are alarming: a single ChatGPT request consumes as much energy as ten Google searches, asking ChatGPT to write a 100-word email uses roughly a bottle of water, and generating images, like the “starter pack” pictures that recently went viral, consumes as much energy as half a phone charge and burns through two to five liters of water.
An edifying observation, especially since one can only acknowledge the growing power and popularity of services like ChatGPT, Gemini and Copilot, which are not about to disappear. Part of the solution could come from a more responsible, less systematic use of these tools, but that is another subject.
On his blog, Sam Altman, like any tech boss devoted to his product, therefore tries to put out a fire before it spreads. Citing a statistic whose source is not given, he asserts that each prompt to the conversational agent requires 0.000085 gallons of water, that is to say 0.32 ml, or “roughly one fifteenth of a teaspoon”. Still for a text prompt, he also puts forward an average electricity consumption of 0.34 watt-hours, “about what an oven would use in a little over one second, or a high-efficiency lightbulb would use in a couple of minutes”.
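
To check the order of magnitude of these claims, here is a minimal back-of-the-envelope sketch in Python. The 0.000085-gallon and 0.34-watt-hour figures come from the article; the gallon and teaspoon conversion factors are standard, and the 1,000-watt oven draw is an assumption chosen purely for illustration.

    # Rough sanity check of the per-prompt figures quoted by Sam Altman
    ML_PER_US_GALLON = 3785.41   # standard conversion factor
    TEASPOON_ML = 4.93           # one US teaspoon in milliliters

    water_gallons = 0.000085     # water per text prompt, per Altman
    water_ml = water_gallons * ML_PER_US_GALLON
    print(f"Water per prompt: {water_ml:.2f} ml")                     # ~0.32 ml
    print(f"Fraction of a teaspoon: 1/{TEASPOON_ML / water_ml:.0f}")  # ~1/15

    energy_wh = 0.34             # electricity per text prompt, per Altman
    oven_watts = 1000            # assumed average oven draw (illustrative only)
    oven_seconds = energy_wh * 3600 / oven_watts
    print(f"Oven time equivalent: {oven_seconds:.1f} s")              # ~1.2 s

The script simply converts gallons to milliliters and watt-hours to oven-seconds; it lands on the 0.32 ml, one-fifteenth-of-a-teaspoon and just-over-a-second values quoted above.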
Electricity, water: what a request to an artificial intelligence really costs
Taken in isolation, these figures seem negligible. But they take on a completely different scale once multiplied by the millions of daily AI users. Moreover, image and video prompts are far more energy-intensive than text ones.
Imagine an ordinary day on which roughly a million people each make ten text requests at 0.32 milliliters apiece. That adds up to 3,200 liters of water consumed, the equivalent of more than 20 baths, or the daily water consumption of around 20 French residents.
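
The arithmetic behind that scenario can be laid out in a few lines of Python. The per-prompt volume, the million users and the ten requests each come from the article; the 150-liter figures for a bath and for daily per-person water use are rounded assumptions used only to make the comparison concrete.

    # Scaling up the per-prompt water figure to a day of ordinary usage
    water_per_prompt_ml = 0.32      # per Altman's figure
    users = 1_000_000               # "roughly a million people" in the scenario
    prompts_per_user = 10           # text requests each, per the scenario

    total_liters = users * prompts_per_user * water_per_prompt_ml / 1000
    print(f"Total water: {total_liters:.0f} liters")                    # 3200 liters

    BATH_LITERS = 150               # assumed volume of one bath
    DAILY_USE_LITERS = 150          # assumed daily water use per person
    print(f"Equivalent baths: {total_liters / BATH_LITERS:.0f}")        # ~21
    print(f"People's daily use: {total_liters / DAILY_USE_LITERS:.0f}") # ~21

In other words, even Altman's own tiny per-prompt figure scales to thousands of liters a day once a modest fraction of the user base is taken into account.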
Think about that the next time you ask ChatGPT to tell you a joke, because this is no laughing matter. One way to reduce your impact is also to run a local AI.