Producing AI infrastructure
It's easy to think of artificial intelligence as something exclusively digital — something intangible. And in some ways, it is. Most of our interactions with AI take place online, as we communicate with web-based, cloud-hosted solutions.
But AI still depends on physical infrastructure. It depends on microchips to provide processing power, and on data centres to support and train large language models (LLMs). And this physical infrastructure represents a challenge.
Producing computer hardware requires energy and resources — resources which are in finite supply. Microchips are produced using rare earth elements, and global demand for these elements is expected to more than double between 2023 and 2050. While this demand is not driven exclusively by artificial intelligence, the growth of AI is certainly a key driver.
Managing e-waste
Another significant challenge arises from what happens to this infrastructure once it is produced — or, more accurately, once it is no longer in use. AI technology is advancing rapidly, and computing devices will generally become obsolete after only two to five years. This leaves us with a mounting problem — electronic waste, or e-waste.
A study conducted by Israel's Reichman University suggested that generative AI could contribute 5 million metric tonnes of e-waste between 2024 and 2030. It's worth noting that more than 60 million metric tonnes of e-waste are currently produced every year, so 5 million tonnes over six years is a relatively small amount.
Even so, it is a significant amount, and one that will need to be managed carefully, particularly as AI adoption accelerates.
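To put those figures in rough proportion, here is a back-of-the-envelope sketch. It assumes the projected 5 million tonnes are spread evenly across the six-year window, which is an illustrative simplification rather than something the study states:

```python
# Rough share of global e-waste attributable to generative AI,
# using the two figures cited in the text.
GENAI_EWASTE_TONNES = 5_000_000      # projected total, 2024-2030
PERIOD_YEARS = 6                     # six-year window used above
GLOBAL_EWASTE_PER_YEAR = 60_000_000  # current annual global e-waste

genai_per_year = GENAI_EWASTE_TONNES / PERIOD_YEARS
share = genai_per_year / GLOBAL_EWASTE_PER_YEAR

print(f"GenAI e-waste per year: ~{genai_per_year:,.0f} tonnes")
print(f"Share of current annual e-waste: ~{share:.1%}")
```

On these assumptions, generative AI would account for roughly 1.4% of current annual e-waste — small relative to the total, but far from negligible.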
Running AI systems
Even while systems and hardware are still in use, sustainability can be a challenge. A study published by the International Energy Agency in 2024 showed that a ChatGPT request required 2.9 Wh of electricity to complete, compared with 0.3 Wh for a standard Google search — nearly ten times as much. This disparity will add substantially to global energy demand as generative AI requests become increasingly common.
So why does generative AI require so much power to run? Largely because of the huge amounts of data being processed. Generative AI composes a fresh response based on its analysis of an enormous dataset, whereas a Google search simply returns a list of existing results matching the user's query.
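The per-request figures above can be scaled up to show how quickly the gap compounds. The daily request volume below is a purely hypothetical assumption chosen for illustration, not a figure from the study:

```python
# Per-request energy figures from the IEA study cited above.
CHATGPT_WH = 2.9  # Wh per ChatGPT request
SEARCH_WH = 0.3   # Wh per standard Google search

ratio = CHATGPT_WH / SEARCH_WH
print(f"A ChatGPT request uses ~{ratio:.1f}x the energy of a search")

# Hypothetical scale-up: this daily volume is an illustrative
# assumption, not a figure from the text.
REQUESTS_PER_DAY = 100_000_000
extra_wh = (CHATGPT_WH - SEARCH_WH) * REQUESTS_PER_DAY
print(f"Extra energy if those were AI requests: ~{extra_wh / 1e6:,.0f} MWh/day")
```

Under these assumptions, shifting 100 million daily searches to generative AI would add on the order of 260 MWh of demand every day.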
There are other sustainability issues with running AI systems too. AI solutions require extensive data centres, and these data centres tend to get very hot. To stay operational, they must be cooled, which in turn requires large amounts of water. It is estimated that as much as 9 litres of water are consumed for every kWh of energy a data centre uses.
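Combining that water estimate with the per-request energy figure cited earlier gives a rough sense of water use per query. This is a sketch only: it treats all of a request's energy as data-centre energy and uses the upper-end 9 litres/kWh estimate, so it is closer to an upper bound than a measurement:

```python
# Combining two figures from the text: ~9 litres of cooling water
# per kWh of data-centre energy, and ~2.9 Wh per ChatGPT request.
LITRES_PER_KWH = 9
WH_PER_REQUEST = 2.9

litres_per_request = (WH_PER_REQUEST / 1000) * LITRES_PER_KWH
print(f"Water per request: ~{litres_per_request * 1000:.1f} mL")
print(f"Water per million requests: ~{litres_per_request * 1_000_000:,.0f} litres")
```

On these figures, each request accounts for roughly 26 mL of cooling water — trivial individually, but around 26,000 litres for every million requests.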