Most of us don't think much about the electricity and water our digital devices consume, but every keystroke, internet search, or interaction with an AI system draws power. Until now, most tasks we accomplish with our phones and computers have been quite economical in terms of power consumption. That is changing, however, with the AI boom: a deepening reliance on artificial intelligence algorithms and the continued rise of specialized AI hardware.
Generative AI is a relatively new kind of software application that can create digital content such as text and imagery. Creating these powerful systems requires a great deal of electricity, and using them is more energy-intensive than traditional forms of computing, adding to the growing environmental impact of AI.
The first step in creating these systems is training. During this process, large quantities of data are analyzed to improve a deep learning model's ability to detect and predict patterns. The latest models require data sets comprising significant portions of the internet, and processing that much information takes thousands of the world's most powerful chips working together in huge data centers. Training the model behind the first version of ChatGPT, launched in November 2022, is estimated to have used roughly as much electricity as 130 average U.S. households consume in a year.
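As a rough sanity check on that comparison, we can convert it to absolute terms. The household figure below is an assumption (an average U.S. household uses roughly 10,600 kWh of electricity per year, an approximate figure; actual values vary by year and region), not a number from this article:

```python
# Back-of-envelope check: 130 average U.S. households for one year.
# Assumption: ~10,600 kWh of electricity per household per year
# (approximate U.S. average; varies by year and region).
HOUSEHOLD_KWH_PER_YEAR = 10_600
HOUSEHOLDS = 130

training_kwh = HOUSEHOLDS * HOUSEHOLD_KWH_PER_YEAR
training_gwh = training_kwh / 1_000_000  # convert kWh to GWh

print(f"Roughly {training_gwh:.1f} GWh")  # on the order of 1.4 GWh
```

Under that assumption, the comparison works out to a bit under 1.4 gigawatt-hours, which is the same order of magnitude as widely circulated estimates for training a model of that generation.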
To push these systems to improve, companies are increasing both the size of the training datasets and the amount of computation used to process them, with a correspondingly larger carbon footprint. Each new model is much larger and requires exponentially more computing power to create, raising the environmental cost.
Moreover, the energy consumption doesn't stop at the training phase. Once deployed, these models continue to consume significant amounts of electricity to generate responses, perform tasks, or maintain availability for user queries. The operational energy usage of these AI systems can be substantial, especially as they scale to meet growing demands.
For example, a typical search engine query requires about 0.3 watt-hours (Wh) of electricity, while a generative AI prompt requires roughly ten times as much energy to produce an output. To put this into perspective, consider a common household appliance like a refrigerator. A typical refrigerator uses about 1 to 2 kilowatt-hours (kWh) of electricity per day; let's take an average value of 1.5 kWh (1,500 Wh) for our calculations.
An online search uses about 0.3 Wh of electricity; a generative AI prompt uses about 3 Wh. A day of refrigerator use is therefore equivalent to approximately 5,000 online searches or 500 generative AI prompts.
While the electricity used by a single generative AI prompt seems minuscule, multiplied across billions of users and countless daily interactions the cumulative effect becomes significant. If a billion people each made just one generative AI query per day, the total electricity consumption would be equivalent to running roughly two million refrigerators continuously.
One way to gauge this impact is to look at the energy consumption of data centers: massive warehouses containing the actual hardware that runs our modern digital infrastructure. A typical data center is a large facility designed to house a vast array of computer servers, storage devices, and networking equipment, engineered to provide a controlled environment that ensures optimal performance and reliability of the hardware.
Data centers are critical hubs of our digital world, consuming vast amounts of electricity to support the ever-growing demands of AI products and services. As of now, there are thousands of data centers spread across the globe, with major hubs in the United States, Europe, and Asia. These data centers range from small server rooms to massive facilities covering millions of square feet.
According to industry reports, the number of data centers is expected to grow substantially in the coming years. This growth is driven by the escalating demand for cloud services, data storage, and computational power required to support emerging technologies like generative AI applications. Estimates suggest that the global data center market could expand by several hundred new facilities annually, each consuming large amounts of electricity and contributing to the overall carbon footprint.
In 2021, data centers accounted for between 0.9% and 1.3% of global electricity consumption, a share expected to grow to 1.86% by 2030. Many leading data center operators have committed to powering their operations entirely from renewable sources within a few years. This is an important effort, given our rapidly increasing reliance on data centers as a critical part of our digital infrastructure.
While generative AI systems currently represent only a small fraction of data center workloads, they require significant computing power and electricity. Given the likelihood that generative AI features will be integrated into many common digital resources, it is probable that future data center creation and energy consumption will largely be driven by the development and deployment of increasingly large and powerful generative AI systems.
Understanding the environmental impact of AI technologies requires examining the supply chains for the raw materials used in manufacturing the hardware essential for AI systems. Key components like semiconductors and batteries rely on metals such as lithium, cobalt, and rare earth elements. Mining and processing these materials can lead to significant environmental degradation, including deforestation, soil erosion, and water contamination.
The environmental impact of raw material extraction extends to public health concerns in communities near mining operations. Pollutants in air and water can lead to respiratory issues, waterborne diseases, and other health problems.
The complexity of global supply chains presents significant obstacles to achieving transparency. Many companies struggle to trace the origins of their materials, making it difficult to ensure ethical and sustainable practices.
Implementing comprehensive strategies for monitoring and improving supply chain transparency is essential to mitigate the environmental and public health impacts associated with AI technologies. Emphasizing responsible sourcing and promoting international cooperation can make a significant difference in reducing these issues.
So when does it make sense to use generative AI? Like many things in life, it depends. Imagine a teacher who needs to create a multiple-choice exam for a chapter her students just covered. Using a generative AI tool, she can create the quiz in seconds and then spend a few minutes reviewing it to ensure everything is accurate and appropriate.
Using AI is an excellent choice in this situation because the teacher can verify the accuracy of the information. The quiz format is simple for the AI to generate, and the teacher saves time by not having to come up with questions and incorrect answers. What might have taken several hours now takes just a few minutes.
Despite generative AI being an energy-intensive form of computing, using it in the right situations can be quite beneficial. According to a study in Nature, using generative AI to create text or images can produce 130 to 2,900 times less CO2 than a human performing the same task. If that same teacher were to sit in an office for hours writing a quiz with the lights and air conditioner running the whole time, using AI could well be the more environmentally sound option.
The inverse is also possible. Let's say you need to write a thank-you note and you spend several minutes asking an AI model for dozens of different versions, none of which you end up using. In this case, a significant amount of energy was expended for little value, and it may have been more effective and environmentally friendly to simply sit down with a pen, think for a moment, and write something from the heart.
Just as there's no definitive rule for deciding whether to walk or drive somewhere, there's no clear-cut rule for when to use generative AI. The most important thing is to be aware of the energy these tools consume, including their potential impact on global temperatures and extreme weather events, and to develop a sense of when using them is both efficient and effective.
As AI technology advances, it also opens opportunities to contribute positively to the environment and improve energy efficiency.
By focusing AI on such applications, the technology not only reduces its net environmental impact but can also play a crucial role in fostering sustainable practices across various sectors. As AI becomes more efficient, its potential to contribute positively to environmental goals will continue to grow.
While generative AI offers remarkable capabilities and efficiencies, it also comes with a substantial environmental cost. The electricity and water required to train and operate these advanced models are significant, driving up carbon emissions and energy consumption. Data centers, the backbone of modern AI infrastructure, contribute to a growing carbon footprint that calls for more sustainable practices. As generative AI continues to be integrated into digital resources of all kinds, it is crucial to balance its benefits against its environmental impact.
However, the potential for generative AI to optimize tasks and reduce human labor cannot be overlooked. When used judiciously, it can be a powerful tool for saving time and resources. The key lies in discerning the appropriate contexts for its use, so that its deployment maximizes efficiency and minimizes unnecessary energy expenditure. Moving forward, it will be essential to develop best practices for using generative AI in ways that are both environmentally and operationally sustainable, and, if operators deliver on their planned emissions reductions, increasingly low-carbon.