Artificial intelligence (AI), particularly in the form of generative AI large language models such as ChatGPT, Gemini and Copilot, is rarely out of the news.
These systems can produce fluent text, answer complex questions and generate stunning images. But the software also presents challenges for safe business use. Across three posts, we are exploring three of these challenges: errors and hallucinations, environmental impact, and copyright issues.
Traditionally, industry and retail have been the main business-related energy users, but information technology is now a major player, and few innovations have required more extra power than artificial intelligence. This has potential implications for companies and individuals wishing to make the most of AI’s abilities, but concerned about their carbon footprint. However, once we examine the specifics, the publicity given to the environmental impact of AI can feel exaggerated.
Estimates suggest that by 2022, between 1 and 1.3 per cent of total worldwide electricity consumption (240-340 TWh) was used by data centres. This is not just to power the technology, but also to handle the cooling required to deal with the heat generated. The problem is particularly clear in heavy-use countries like Ireland, where data centres account for nearly a fifth of all consumption. By comparison, in the US in 2023, data centres were responsible for about 4.4% of electricity use, but Lawrence Berkeley National Laboratory predicts that this will increase to as much as 12% of total US use in just three years’ time. And this is in the second-largest electricity-consuming country in the world.
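The figures quoted above can be sanity-checked against each other. As a back-of-envelope sketch (the world total below is derived from the article's own percentages and TWh range, not measured independently):

```python
# Sanity check: what world electricity total do the quoted figures imply?
# Data centres: 240-340 TWh, stated as 1-1.3% of worldwide consumption.
low_twh, high_twh = 240, 340
low_share, high_share = 0.01, 0.013

world_low = low_twh / low_share     # implied world total at the low end
world_high = high_twh / high_share  # implied world total at the high end

print(f"Implied world consumption: {world_low:,.0f} to {world_high:,.0f} TWh")
```

Both ends land in the mid-20,000s of TWh, so the percentage and absolute figures are at least mutually consistent.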
AI is certainly pushing up the energy demands of data centres, but estimating the impact is not easy.
According to Google’s 2024 Environmental Report, their greenhouse gas emissions grew by 13% in 2023, up 48% on 2019, though they can’t (or won’t) pin down how much of this came from AI. It has been estimated that a search dependent on a large language model could use about ten times as much energy as a traditional search engine query. The disparity can be even greater for some tasks.
A 2024 paper (Power Hungry Processing: Watts Driving the Cost of AI Deployment) suggested that, in some cases, generative AIs could have 33 times the energy demand of specialist software – particularly for image generation and summarisation tasks. The extra consumption arises, in part, because querying such a large language model uses a lot of energy just to parse the query and collect input from a wide range of sources, on top of the energy needed to train the model.
Many current AI developments also make use of the latest, most energy-demanding specialist hardware.
Chinese AI company DeepSeek claims that its system uses 50 to 75% less energy than servers incorporating the newest AI chips (though this has not been independently verified). Efficiency improvements may be at play here: DeepSeek claims its servers use around 1/8th of the number of chips required by a US large language model.
The environmental AI news has one potential positive, though. The Boston Consulting Group has suggested that AI ‘has the potential to help mitigate 5-10% of global greenhouse gas emissions.’ This would involve a combination of enhanced analysis to suggest sustainability improvements, better prediction to trigger interventions and optimising systems from smart thermostats to power grids. It should be pointed out, though, that the wording ‘has the potential to help’ does not guarantee any contribution whatsoever.
When we examine the numbers, media panic over AI electricity use appears overblown. Bear in mind that even a 100% increase in consumption by data centres would add only around an extra 1% to world consumption. We have far greater demands on electricity than data centres – and other uses will only grow in proportion as electric vehicles, and the electric replacement of industrial processes that currently make heavy use of fossil fuels, such as steel-making, become more widespread.
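The "100% increase is only an extra 1%" claim follows directly from the earlier figures. A minimal sketch, reusing the 240-340 TWh range and 1-1.3% share quoted above (the world totals are again derived from those figures):

```python
# If data centres (240-340 TWh, i.e. 1-1.3% of world use) doubled,
# the extra demand equals their current share: roughly another 1% of the total.
current_twh = (240, 340)
current_share = (0.01, 0.013)

for twh, share in zip(current_twh, current_share):
    world = twh / share       # implied world total
    extra = twh               # a 100% increase adds the same amount again
    print(f"+{extra} TWh on a {world:,.0f} TWh total = +{extra / world:.1%}")
```

In other words, doubling a 1-1.3% slice adds 1-1.3 percentage points to the whole, which is the article's point in miniature.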
When the majority of electricity production is from green sources, from wind and solar to nuclear, this will no longer be an issue. For the moment, though, it is certainly a concern to be noted, if not a deterrent from making sensible use of AI.