OpenAI CEO reveals ChatGPT's water consumption

OpenAI CEO Sam Altman has just published a blog post giving a precise figure for the average water consumption of a ChatGPT query. Without citing any sources, he claimed that each question posed to the chatbot consumes 0.000085 gallons of water, or "about one-fifteenth of a teaspoon". The post, titled "The Gentle Singularity", discusses the progress of integrating artificial intelligence into society and the prospect of a "digital superintelligence" bringing broad benefits to quality of life.
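
As a quick sanity check, the teaspoon comparison can be reproduced from the quoted figure and standard US unit definitions. A minimal sketch in Python; the only input taken from the post is the 0.000085-gallon figure:

```python
# Sanity check: convert the quoted per-query water figure to teaspoons.
GALLONS_PER_QUERY = 0.000085      # figure quoted in "The Gentle Singularity"
TSP_PER_GALLON = 768              # 1 US gallon = 128 fl oz, 1 fl oz = 6 tsp

teaspoons = GALLONS_PER_QUERY * TSP_PER_GALLON
print(f"{teaspoons:.4f} teaspoons per query")        # ~0.0653
print(f"about 1/{1 / teaspoons:.0f} of a teaspoon")  # ~1/15
```

The result, roughly 0.065 teaspoons, does indeed come out to about one-fifteenth of a teaspoon.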

Among other figures on the average cost of a ChatGPT query, Sam Altman put its energy consumption at 0.34 watt-hours, "the equivalent of running an oven for just over a second, or a high-efficiency light bulb for a few minutes," he added. The numbers serve to reassure, in a way, and also to announce that with "the automation of data center production, the cost of intelligence should eventually converge towards a level close to that of electricity."
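
The appliance comparisons can be roughly checked as well. A hedged sketch, assuming typical power draws of about 1 kW for an oven element and 10 W for an LED bulb; these wattages are assumptions, not figures from the post:

```python
# Rough check of the 0.34 Wh comparison using assumed appliance wattages.
QUERY_WH = 0.34     # per-query energy figure quoted by Altman
OVEN_W = 1000       # assumed oven draw (~1 kW); real ovens vary widely
LED_W = 10          # assumed high-efficiency LED bulb draw

oven_seconds = QUERY_WH / OVEN_W * 3600   # Wh / W = hours, then to seconds
led_minutes = QUERY_WH / LED_W * 60       # Wh / W = hours, then to minutes
print(f"oven: {oven_seconds:.1f} s")      # ~1.2 s
print(f"LED bulb: {led_minutes:.1f} min") # ~2.0 min
```

Under those assumptions the claim holds: just over a second of oven time, or around two minutes of LED light.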

It's worth remembering, however, that text queries aren't the only workload running on OpenAI's models. Generating an image requires more resources: according to a 2023 study by Hugging Face and Carnegie Mellon University, creating a single image can consume as much energy as fully recharging a smartphone.

Operating Losses

To date, the economic cost per query concerns OpenAI as much as the environmental cost, if not more. The company indicated earlier this week that it had reached $10 billion in annual revenue, up from around $5.5 billion last year, but that it will need to reach $125 billion by 2029, the year OpenAI hopes to become profitable. Why? Because its data centers are very expensive, both in hardware purchases and in operating costs. While the company does not disclose its operating losses, it has indicated that it expects to have burned through $200 billion by the end of the decade.

One notable absence from the company's costs is the data itself. With the arrival of ChatGPT, as with other AI models from Microsoft and Google, data was scraped from the web to train models without any financial compensation. It amounts to data theft on an unprecedented scale, and one still underestimated today.

100 words on ChatGPT, a bottle of water? The Washington Post study

AI researchers don't necessarily agree with the figures put forward by Sam Altman. In an article published last year in the Washington Post, a 100-word email generated by a GPT-4-based chatbot was estimated to consume the equivalent of a bottle of water in the United States. The newspaper's study noted that water consumption per query depends heavily on the geographic location of the data center involved: those located in Washington state, where water is plentiful, use more of it.
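
To give a sense of how far apart the two figures are, the comparison can be put in millilitres. A hedged sketch, assuming a standard 500 ml bottle (the bottle size is an assumption, and Altman's number is per query while the Post's estimate is per 100-word email, so the ratio is only indicative):

```python
# Compare Altman's per-query water figure with the "one bottle per
# 100-word email" estimate reported by the Washington Post.
ML_PER_GALLON = 3785.41
ALTMAN_ML = 0.000085 * ML_PER_GALLON   # ~0.32 ml per query
BOTTLE_ML = 500                        # assumed bottle size (not from the article)

print(f"Altman's figure: {ALTMAN_ML:.2f} ml per query")
print(f"ratio: ~{BOTTLE_ML / ALTMAN_ML:.0f}x more per 100-word email")
```

Even allowing for the difference in what is being measured, the two estimates differ by several orders of magnitude.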

In other states, such as Texas, where water is scarcer, data centers rely more on electric air conditioning for cooling.
