Artificial intelligence has been called an existential threat to humanity, but the threat described by many of the world’s leading scientists, industrialists and technologists (including developers of AI) isn’t death by thirst.
That’s even though artificial intelligence programs (and the huge computing resources that power them) need lots of water, fresh air and a comfortable climate to survive.
AI’s tremendous computing capabilities (talk of taking over from humankind notwithstanding) are rooted in the ability to analyze patterns found in gigantic datasets. Systems like OpenAI’s ChatGPT and Google’s Bard are built on Large Language Models (LLMs), trained on text, images, and other digitized sources. An LLM has billions of parameters; all that training generates a lot of heat, so a cool drink or two is needed. To keep cool on hotter days, data centres need even more water.
Just how much can be found partly in environmental operating reports from major tech firms like Google, Meta, and Microsoft, and partly in a study called Making AI Less Thirsty, by computer science researchers at the University of California, Riverside and the University of Texas at Arlington.
Shaolei Ren and his team confirmed that large-scale AI models are indeed big water consumers, based on their measurements and analysis of the environmental impact of AI training.
In an estimate presented by the researchers, Microsoft, which partnered with OpenAI, consumed a whopping 185,000 gallons (more than 700,000 litres) of water training GPT-3 at its U.S. data centres alone. Overall, the company’s global water consumption increased some 34 per cent last year, while Google recently reported a 20 per cent increase in water use.
To make sense of the enormous quantities of water described in the report, Ren’s team boiled the usage stats down: for each short ‘conversation’ one of us has with ChatGPT, maybe 20 to 40 prompts, the system was estimated to need a 500-ml bottle of water. As with much resource consumption, the rate varies with the location of the servers (hot, humid conditions vs. a cool, dry climate) and the time of day (peak or off-hours) the LLM is queried.
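The per-conversation figure invites some back-of-envelope arithmetic. A minimal sketch, assuming the study’s 500-ml-per-conversation estimate and the 20-to-40-prompt range cited above; the 10-million-prompt daily load is a purely illustrative assumption, not a measured figure:

```python
# Back-of-envelope sketch of the study's per-conversation water estimate.
# The 500 ml and 20-40 prompt figures come from the report; the daily
# traffic number below is an illustrative assumption, not a measurement.

ML_PER_CONVERSATION = 500        # one small bottle of water (from the study)
PROMPTS_PER_CONVERSATION = 30    # midpoint of the 20-40 range cited above

def water_litres(total_prompts: int) -> float:
    """Estimated fresh water (litres) consumed for a given prompt count."""
    conversations = total_prompts / PROMPTS_PER_CONVERSATION
    return conversations * ML_PER_CONVERSATION / 1000

# Hypothetical daily load of 10 million prompts:
print(f"{water_litres(10_000_000):,.0f} litres/day")
```

At that assumed load, the bottles add up to well over 150,000 litres a day, which is why the study’s authors argue the per-query framing matters.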
The growing water consumption numbers do not include water used to generate electricity sourced offsite, for example, so they do not tell the complete water or environmental story. And to be clear, server farms and AI data centres can’t just use any water: it must be clean and fresh to prevent corrosion and contamination.
Even with the increases identified in the study, Ren said only the operational water footprint (that is, consumption associated with training and running the AI models) was considered in the report. The full water footprint (water used in AI server manufacturing and chipmaking, for example) is higher, and Ren thinks as much as ten times higher – or more!
On the flip side, the team may not have factored in industry pushes toward more efficient and, hopefully, more environmentally friendly supercomputer cooling systems.
But they did look at environmental differences and their impacts on computing.
Canada is seen as an attractive home for high-powered supercomputing facilities, as are other northern climes, partially due to the cooler temperatures and lower humidity. (In many parts of the world, climate change is bringing not only added heat but extra humidity, and that impacts the efficiency of power-hungry data centres, their supercomputers, cooling systems and chiller units.)
As one example, a Canadian hedge fund recently installed a new supercomputer in its Toronto HQ to run its proprietary AI trading algorithms. Castle Ridge Asset Management calls the powerful system ‘Wallace’, and among its many features, it uses a proprietary Fluorinert-cooled system, seen as a next-generation cooling solution.
(Wallace, by the way, was also described as replacing “an army of human portfolio managers and analysts, requiring no sleep, vacations, or pep-talks.” Water and air were not mentioned.)
And there’s a major new supercomputing-as-a-service offering, designed to make AI more accessible to enterprises, that will be hosted at a data centre in Canada. Hewlett Packard Enterprise’s GreenLake system, with its powerful Cray XD supercomputers – newer models are water-cooled – and Nvidia GPU accelerators, makes it easier to train and deploy LLMs and other generative AI models.
At Simon Fraser University in B.C., what’s been called “the most energy-efficient data centre in the world” houses the Cedar supercomputer, dubbed “the most powerful of its kind in Canada”.
SFU renovated an existing building with a new heating and cooling system to better cope with the enormous heat that computer servers produce. It says that while most data centres consume between 1.4 and two times as much energy overall as their computing alone requires (the overhead going largely to cooling), Cedar’s design brings that ratio down to 1.07, a saving equivalent to powering hundreds of homes for a year.
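Those ratios are essentially what the data-centre industry calls power usage effectiveness (PUE). A quick sketch of how the gap between a conventional ratio of 1.4 and Cedar’s 1.07 becomes household-scale savings; the 2 MW computing load and 11,000 kWh/year per household are assumptions for illustration, since the article gives neither:

```python
# Rough comparison of data-centre efficiency ratios (PUE). The 1.07 and
# 1.4 ratios are from the article; the 2 MW computing load and the
# per-household consumption figure are assumptions, not reported numbers.

IT_LOAD_KW = 2000                 # assumed computing load of the facility
HOURS_PER_YEAR = 8760
KWH_PER_HOME_PER_YEAR = 11_000    # rough average household consumption

def facility_kwh(pue: float) -> float:
    """Total annual facility energy at a given power usage effectiveness."""
    return IT_LOAD_KW * pue * HOURS_PER_YEAR

saved = facility_kwh(1.4) - facility_kwh(1.07)
print(f"Savings: {saved:,.0f} kWh/year, roughly "
      f"{saved / KWH_PER_HOME_PER_YEAR:.0f} homes' worth")
```

Under these assumed figures the difference works out to several million kilowatt-hours a year, consistent with the “hundreds of homes” claim.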
Energy, water, air and climate. As impactful on us humans as on AI, apparently. We share that existence.
Making AI Less Thirsty is another take on the many, many concerns generated by artificial intelligence. The study did not extrapolate how much water all those individual 500-ml bottles add up to, but considering ChatGPT’s explosive popularity, it’s more than a drop in the ocean.
# # #