A recent study warns that the artificial intelligence (AI) industry could consume as much energy as a country the size of the Netherlands by 2027.
This surge is attributed to the rapid integration of AI-powered services by major tech companies, particularly since the emergence of ChatGPT last year. Unlike conventional applications, these AI-driven services demand considerably more power, significantly heightening the energy intensity of online activities.
The study suggests, however, that AI's environmental impact might be less severe if its current growth rate were to slow. Many experts, including the report's author, caution that such predictions are speculative: tech firms disclose too little data to allow accurate forecasts.
AI undoubtedly demands more robust hardware than traditional computing tasks. The study, conducted by Alex De Vries, a PhD candidate at the VU Amsterdam School of Business and Economics, assumes that certain parameters remain constant: the rate at which AI advances, the availability of AI chips, and the continuous operation of servers at full capacity.
De Vries notes that Nvidia, a chip designer, is estimated to supply approximately 95% of the AI processing equipment the sector requires. By estimating how many of these computers are projected to be delivered by 2027, he arrives at an annual energy consumption range for AI of 85 to 134 terawatt-hours (TWh). At the higher end, this is roughly equivalent to the annual energy consumption of a small country such as the Netherlands.
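The shape of such an estimate can be sketched as simple arithmetic: number of servers, times power draw per server, times hours of continuous operation. The server count and wattage below are illustrative assumptions, not figures reported in the article, which gives only the resulting 85–134 TWh range.

```python
# Back-of-envelope estimate in the style of the study.
# Both inputs are hypothetical, chosen only to land near the reported range.
SERVERS = 1_500_000        # assumed AI servers delivered by 2027 (hypothetical)
POWER_KW = 6.5             # assumed power draw per server in kW (hypothetical)
HOURS_PER_YEAR = 24 * 365  # servers assumed to run continuously at full capacity

energy_twh = SERVERS * POWER_KW * HOURS_PER_YEAR / 1e9  # kWh -> TWh
print(f"{energy_twh:.1f} TWh per year")  # near the low end of the 85-134 TWh range
```

The calculation makes clear why the study's assumptions matter: halving server deliveries or utilization halves the result.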
De Vries stresses that his findings underscore the importance of using AI only in cases where it is genuinely necessary. His peer-reviewed study has been published in the journal Joule.
AI systems, such as the sophisticated language models underpinning popular chatbots like OpenAI's ChatGPT and Google's Bard, require specialized computer warehouses known as data centers.
This specialized equipment consumes more power than conventional setups and, like them, requires substantial water for cooling. The study did not account for the energy needed for cooling, an aspect major tech companies often omit from their disclosures.
Even so, demand for AI-powered computers is surging, and with it the energy required to keep these servers at safe operating temperatures.
Notably, companies are showing a growing interest in housing AI equipment within data centers. Danny Quinn, CEO of Scottish data center firm DataVita, highlights the significant disparity in energy consumption between racks containing standard servers and those housing AI processors.
In its recent sustainability report, Microsoft, a company heavily investing in AI development, revealed a 34% surge in water consumption between 2021 and 2022. This amounted to 6.4 million cubic meters, roughly equivalent to 2,500 Olympic swimming pools.
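The swimming-pool comparison is easy to verify, since an Olympic pool holds a standard 2,500 cubic meters (50 m × 25 m × 2 m):

```python
# Check the article's comparison of Microsoft's reported water use
# (6.4 million cubic meters) against Olympic swimming pools.
water_m3 = 6_400_000
pool_m3 = 50 * 25 * 2        # 2,500 cubic meters per Olympic pool
pools = water_m3 / pool_m3
print(f"{pools:.0f} pools")  # about 2,560, i.e. roughly the 2,500 reported
```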
Professor Kate Crawford, an authority on AI's environmental impact, underscores the monumental energy and water requirements of these high-powered AI systems. She emphasizes that these systems constitute a substantial extractive industry for the 21st century, with enormous implications for resource usage.
While AI's energy demands present a challenge, there are also hopes that AI can contribute to solving environmental problems. Google and American Airlines, for instance, have recently found that AI tools can reduce aircraft contrails, a contributor to global warming.
Additionally, the U.S. government is investing millions in advancing nuclear fusion research, where AI could accelerate progress in achieving a limitless, green power supply. This year, a university academic reported a breakthrough in harnessing immense power through AI-driven prediction in an experiment, offering promise for future sustainable energy solutions.