Earlier this month Amazon successfully lobbied against a bill in Oregon, US, that aimed to curb carbon emissions from data centres, arguing it failed to address the need for more clean energy infrastructure.
Critics say the company is failing to act on a pledge to decarbonise a key part of its operations. Data centres are extremely energy-intensive, accounting for nearly 1 per cent of energy-related greenhouse gas emissions, according to the International Energy Agency (IEA).
Big tech companies are a big part of that problem – and it could get a lot worse as they seek to build their own advanced AI.
So what? The race to build generative AI products like OpenAI’s ChatGPT is already consuming vast quantities of energy and water to power data centres. Scientists estimate that training the model behind OpenAI’s chatbot consumed 700,000 litres of water alone – enough to manufacture 370 BMW cars.
As AI booms, energy and water demand could grow exponentially. The computing power used to train AI models has increased by a factor of one hundred million in the past decade, and improvements in the energy efficiency of processing units have not kept pace.
But transparency about environmental impact remains an afterthought for the main companies involved.
Carbon. Training GPT-3, an AI programme that generates language, required new levels of computing power. According to a 2021 study, it used 1.287 gigawatt hours of electricity and produced as much carbon as 110 US cars emit in a year. GPT-4, OpenAI’s newest model, likely demanded even more.
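That car comparison can be sanity-checked with two conversion factors. The figures below are illustrative assumptions (typical US averages), not the study’s exact inputs, but they land in the same ballpark as its 110-car estimate:

```python
# Back-of-envelope check of the GPT-3 carbon comparison.
# Both constants below are assumed US-average values, not figures from the study.
TRAINING_ENERGY_MWH = 1287       # 1.287 GWh, the reported training energy
GRID_T_CO2_PER_MWH = 0.43        # assumed average US grid carbon intensity
CAR_T_CO2_PER_YEAR = 4.6         # assumed annual emissions of a US passenger car

emissions = TRAINING_ENERGY_MWH * GRID_T_CO2_PER_MWH   # ~553 tonnes CO2e
car_years = emissions / CAR_T_CO2_PER_YEAR             # ~120 cars for a year
print(f"~{emissions:.0f} tCO2e, roughly {car_years:.0f} US cars driven for a year")
```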
“It’s like the difference between a plane and a rocket,” says Vincent Thibault, co-founder of QScale. “They both fly, but the rocket expends 10 times more energy per second to go faster. That’s the difference between storing and computing data.”
High energy use doesn’t have to mean high emissions. But data centres used for large projects don’t tend to run on renewable energy: 60 per cent of all US data centres are located on 1,304 square miles of land in northern Virginia, where coal is the go-to source for generating electricity.
Google and Microsoft both plan to incorporate generative AI into their search engines, which will require yet more energy-intensive training. Neither has disclosed what sources of electricity it plans to use to achieve this. Researchers found that AI accounted for up to 15 per cent of Google’s total electricity consumption in 2021 – equivalent to powering every home in the US city of Atlanta for a year.
Both companies have made pledges for 2030: Google to run entirely on carbon-free energy, Microsoft to become carbon negative.
Water. “Right now the concern and metrics used by these companies is focused on the carbon footprint. They aren’t considering the water footprint yet,” says Shaolei Ren, professor of computer engineering at the University of California, Riverside.
Estimates from his research suggest that a conversation of 20 to 50 questions with an AI chatbot could “drink” a “500ml bottle of water”. There are already signs that humans will begin competing with machines for potable water.
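The per-question implication is simple division. The sketch below only restates Ren’s published range; it is not his team’s methodology:

```python
# Implied per-question water use from the 500ml-per-conversation estimate.
# The bottle size and question range come from the article; the division is ours.
bottle_ml = 500
for questions in (20, 50):
    print(f"{questions} questions -> ~{bottle_ml / questions:.0f} ml per question")
# -> roughly 10 to 25 ml of water per question
```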
The solution? Move the training of large-scale AI models to places where the air is cool and the energy is clean. “It makes a lot more sense to move data to where the power is, instead of moving power to where the data is,” says Thibault, who has pioneered one of the first purpose-built data centres for AI in Quebec. He argues that excess heat from AI processing could eventually be used to heat homes or grow vegetables.
The upshot. There is some hope that advanced AI, despite its environmental impact, could help in the fight against climate change: it has huge scope to optimise transport and energy systems and to further scientific discovery.
Still, pausing AI for a summer might not be such a bad idea, especially when keeping machines cool comes at such a cost.