A recent study by researchers at the University of California, Riverside and the University of Texas at Arlington claims that the data centres used to run ChatGPT consume around 500 ml of fresh water for every simple conversation of roughly 20–50 questions and answers.

According to another study, training GPT-3 alone consumed around 700,000 liters of water in Microsoft's data centers.
The study further reveals that an average data center uses about a gallon of water for every kilowatt-hour of energy it consumes.
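To put that ratio in perspective, here is a rough back-of-the-envelope calculation based on the roughly one-gallon-per-kilowatt-hour figure above. The facility size and runtime used in the example are purely illustrative assumptions, not figures from the study.

```python
# Back-of-the-envelope estimate of data-center water use,
# based on the ~1 gallon of water per kWh figure cited above.
# The example facility size below is a hypothetical assumption.

GALLON_LITERS = 3.785  # liters per US gallon

def water_used_liters(energy_kwh: float, gallons_per_kwh: float = 1.0) -> float:
    """Estimate water consumed (liters) for a given energy draw (kWh)."""
    return energy_kwh * gallons_per_kwh * GALLON_LITERS

# Example: a hypothetical 1 MW data center running for one day
daily_kwh = 1_000 * 24  # 24,000 kWh
print(f"{water_used_liters(daily_kwh):,.0f} liters/day")  # ~90,840 liters
```

Even a modest facility, on this ratio, would consume tens of thousands of liters of water per day.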
In fact, the data centers behind ChatGPT alone already draw heavily on water reserves, as requests from billions of users worldwide are processed through them.
This could become a major concern if not addressed responsibly, because it is not just any water being used: to prevent the corrosion and bacterial growth that seawater can cause, data centers draw on pristine freshwater sources.
Today, OpenAI’s AI chatbot ChatGPT and other AI models, such as Google’s LaMDA, already consume millions of liters of water, and that figure is only going to rise. Given the environmental impact of AI, it is the responsibility of everyone building AI models today to plan for and reduce their water footprint as much as possible, and to seek alternative approaches.
Read more at MSN