The Thirsty Data Centers: Water Usage, AI, and Climate Change
Balancing Progress with Sustainability
Hello there 👋👋
First off, welcome to all my new subscribers. Much obliged to have you here. And thank you! 🙏🏼 to all my existing subscribers who are continuing with me on this journey of learning more about AI. You are the inspiration for me to research, gather, write, and share my findings and progress.
Before we jump into today’s article, let me share what led me to write it. Last week, I met a few tech enthusiasts in person. One of them is an entrepreneur from Bangalore who had read my previous article (link provided below).
While she found the article very informative, she also wanted to know more about the water consumption of data centers and its impact on the climate. So I spent some time researching the topic, and I’m sharing my findings with you all. Read on.
Current topic: Thirsty data centers
In the digital age, data centers have become the backbone of our online existence, powering everything from social media to advanced artificial intelligence (AI) applications. However, the environmental impact of these data centers, particularly in terms of water usage, is becoming a critical issue in the context of climate change. The data center industry, which underpins these technologies, is responsible for a significant share of global greenhouse gas emissions, estimated at 2–3%.
Water Consumption in Data Centers
Data centers are intensive users of water, primarily for cooling and for the electricity they consume. The rise of AI has exacerbated this issue: AI data centers rely on more high-performance hardware, driving up electricity and water use compared to conventional facilities. A mid-sized data center can consume around 300,000 gallons of water per day, with some of the largest, like Google's, using approximately 450,000 gallons daily. This is particularly concerning in regions where water sources are already under stress from drought, which includes about 20% of U.S. data centers. Water use is also growing over time: Microsoft's data centers saw a 60% increase in water consumption from 2019 to 2022, reaching over 5.5 billion gallons. Training AI models like GPT-3 reportedly consumed up to 700,000 liters of freshwater, highlighting the significant water footprint of AI technologies.
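To put these water figures in perspective, here is a minimal back-of-envelope sketch in Python. All the daily-consumption constants are taken from the numbers cited above; the only outside assumption is the standard gallon-to-liter conversion factor.

```python
# Back-of-envelope scale check for the water figures cited above.
# Daily figures come from this article; LITERS_PER_GALLON is the
# standard US conversion factor.
GALLONS_PER_DAY_MIDSIZE = 300_000   # mid-sized data center
GALLONS_PER_DAY_LARGE = 450_000     # largest facilities (e.g. Google's)
LITERS_PER_GALLON = 3.785

annual_midsize_gal = GALLONS_PER_DAY_MIDSIZE * 365
annual_large_gal = GALLONS_PER_DAY_LARGE * 365
print(f"Mid-sized: ~{annual_midsize_gal / 1e6:.0f} million gallons/year")
print(f"Largest:   ~{annual_large_gal / 1e6:.0f} million gallons/year")

# GPT-3 training reportedly used up to 700,000 liters of freshwater.
gpt3_training_gal = 700_000 / LITERS_PER_GALLON
days_of_midsize = gpt3_training_gal / GALLONS_PER_DAY_MIDSIZE
print(f"GPT-3 training water ≈ {gpt3_training_gal:,.0f} gallons "
      f"(~{days_of_midsize:.1f} days of a mid-sized data center)")
```

In other words, a single mid-sized facility runs through on the order of a hundred million gallons a year, which makes the per-facility daily numbers easier to reason about.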
The Impact on Water-Stressed Regions
The water usage of data centers is particularly concerning in regions already facing water scarcity. Sub-Saharan Africa, for example, is expected to see water demand nearly quadruple by 2030. We are already seeing the impact in Cape Town, where water scarcity is affecting the daily lives of residents. Recent reports from Bangalore, in India, have been nothing short of alarming: the water table there has reportedly fallen from about 100 feet to 1,800 feet below ground. In the United States, about 20% of data centers rely on watersheds under moderate to high stress from drought. This puts additional pressure on local water resources, which are essential for both ecosystems and human consumption.
Corporate Water Stewardship
In response to these challenges, tech giants like Microsoft, Facebook (Meta), Amazon, and Google have pledged to become water positive by 2030, meaning they aim to replenish more water than they consume. These companies are exploring innovative cooling methods, such as adiabatic cooling, which uses outside air instead of water, and developing low-water alternatives that could reduce water usage by up to 50%.
Energy Consumption and Climate Change
Data centers are not only heavy water users but also significant energy consumers. Their power demand is expected to reach 35 gigawatts by 2030, up from 17 gigawatts in 2022. AI data centers could require more than five times the power of traditional facilities, with AI model training being particularly energy-intensive. This energy consumption contributes to greenhouse gas emissions, which exacerbate climate change.
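A note on units: gigawatts measure instantaneous power, not energy. To relate the power figures above to annual electricity consumption, a rough sketch (assuming, for simplicity, that the gigawatt numbers represent continuous average draw) looks like this:

```python
# Rough conversion of continuous power draw (GW) into annual
# energy use (TWh). Assumes the cited gigawatt figures are
# sustained around the clock, which is a simplification.
HOURS_PER_YEAR = 8760  # 365 * 24

def gw_to_twh_per_year(gigawatts: float) -> float:
    """Continuous draw in GW -> annual energy in TWh (1 TWh = 1,000 GWh)."""
    return gigawatts * HOURS_PER_YEAR / 1000

print(f"35 GW ≈ {gw_to_twh_per_year(35):.0f} TWh/year")
print(f"17 GW ≈ {gw_to_twh_per_year(17):.0f} TWh/year")
```

So 35 GW of continuous draw works out to roughly 300 TWh per year, on the order of a mid-sized industrialized country's electricity consumption.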
The Carbon Footprint of AI
Training and Inference
Training large AI models is an energy-intensive process. By one estimate, GPT-3's training emitted 2,200 tons of CO2 equivalent, comparable to 1,600 return flights from Paris to New York. A month of ChatGPT usage has been estimated at 10,000 tons of CO2, a significant addition when set against the average yearly carbon footprint of individuals in countries like France or the UK. The impact of ChatGPT Plus, which relies on GPT-4, could be even greater, potentially adding up to 10% to a user's current yearly carbon footprint.
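As a quick sanity check on the comparison above, here is a minimal sketch in Python using only the figures cited in this section (the per-flight footprint it derives is implied by those figures, not an independent measurement):

```python
# Sanity check: what per-flight footprint does the comparison imply?
# All inputs are the estimates cited in this article.
GPT3_TRAINING_TONS = 2200   # t CO2e for GPT-3 training
RETURN_FLIGHTS = 1600       # equivalent Paris-New York round trips

tons_per_return_flight = GPT3_TRAINING_TONS / RETURN_FLIGHTS
print(f"Implied footprint: ~{tons_per_return_flight:.2f} t CO2e "
      f"per return flight")

# Scaling the cited monthly ChatGPT estimate to a year:
CHATGPT_MONTHLY_TONS = 10_000
chatgpt_yearly_tons = CHATGPT_MONTHLY_TONS * 12
print(f"ChatGPT usage at that rate: ~{chatgpt_yearly_tons:,} t CO2e/year")
```

The implied ~1.4 tons per round trip is in the right ballpark for a long-haul economy passenger, which suggests the two cited figures are at least internally consistent.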
Real-World Examples and Future Trends
The AI boom has led to a surge in data center construction, particularly in rural America, where tech giants are transforming farmland into data center hubs. These facilities consume vast amounts of power, taxing energy grids and potentially hindering efforts to decarbonize the energy sector. The projected electricity consumption of data centers by 2030 is a staggering 1,000 terawatt-hours, roughly equivalent to Japan's total annual consumption.
Legislative and Industry Responses
Both the US and the EU are moving towards enforcing reporting requirements for data center energy use and environmental performance. This transparency is crucial for understanding and mitigating the environmental impact of data centers. Companies are also investing in renewable energy sources and efficiency improvements to reduce their carbon footprint.
Mitigating Environmental Impact
Leveraging Existing Models
One way to reduce the environmental impact of AI is by leveraging and fine-tuning existing generative models rather than building new ones from scratch. This approach can significantly reduce energy consumption and the associated carbon footprint.
Renewable Energy and Cooling Innovations
Tech giants are taking steps to address these concerns. Google, Microsoft, and Meta have pledged to replenish more water than they consume by 2030. Google has also reduced its data center carbon footprint by using water cooling and is developing new cooling solutions that could cut water use by up to 50%.
The Future of Sustainable AI
Technological Innovations
Innovations such as TinyML, which focuses on low-energy machine learning applications, can help conserve energy. Chips designed specifically for running large language models, like Groq's, can also reduce energy usage. Additionally, algorithmic improvements such as the 1.58-bit approach (BitNet b1.58) proposed in a recent Microsoft Research paper reduce the computation required, which in turn lowers energy consumption. I have covered electricity usage in detail in my previous article, titled “The Energy Consumption of Large Language Models: Understanding and Reducing the Impact”.
Space-Based Data Centers
Futuristic solutions are also being explored. A startup called Lonestar has raised funds to build small data centers on the Moon, and Thales Alenia Space is leading a study on the feasibility of building data centers in space, which would run on solar energy.
While the above-mentioned solutions seem far-fetched and futuristic, I strongly believe that innovation in these areas must be fostered to make such ideas a reality. Considering the strides SpaceX is making toward human settlement on Mars, these endeavors should be encouraged by governments and the private sector alike.
Finally, I would like to conclude that the water and energy demands of data centers, particularly those powering AI technologies, pose significant environmental challenges. As these facilities consume resources at an alarming rate, the industry must prioritize sustainability to ensure that the digital economy does not come at the expense of the planet's health. Through innovative cooling technologies, water replenishment commitments, and energy efficiency, the tech industry can mitigate its impact on water resources and contribute to the fight against climate change.
Let’s all be aware of this and do our best to protect the environment we live in, making the future better for the next generations.
Peace out ✌️
Sources
Data centres ‘straining water resources’ as AI swells (SciDev.Net)
Data centers, backbone of the digital economy, face water scarcity and climate risk (NPR)
Data centers are sprouting up as a result of the AI boom, minting fortunes, sucking up energy, and changing rural America (Business Insider)
Will AI queries increase Data Centre energy use by an order of magnitude? (Energy Post)
Cloud computing's real-world environmental impact (TechTarget)
Five areas of environmental impact in data centres (Network King)
As Use of A.I. Soars, So Does the Energy and Water It Requires (Yale Environment 360)
Our commitment to climate-conscious data center cooling (Google Blog)
Assess the environmental impact of data centers (TechTarget)
POWER OF AI: Wild predictions of power demand from AI put industry on edge (S&P Global Commodity Insights)