
AI drive brings Microsoft’s ‘green moonshot’ down to earth in west London

If you want evidence of Microsoft’s progress towards its environmental “moonshot” goal, then look closer to earth: at a building site on a west London industrial estate.

The company’s Park Royal datacentre is part of its commitment to drive the expansion of artificial intelligence (AI), but that ambition is jarring with its target of being carbon negative by 2030.

Microsoft says the centre will be run fully on renewable energy. However, the construction of datacentres and the servers that fill them means that the company’s scope 3 emissions – such as CO2 related to the materials in its buildings and the electricity people consume when using products such as Xbox – are more than 30% above their 2020 level. As a result, the company is exceeding its overall emissions target by roughly the same margin.

This week, Microsoft’s co-founder, Bill Gates, claimed AI would help combat climate change because big tech is “seriously willing” to pay extra to use clean electricity sources in order “to say that they’re using green energy”.

In the short term, AI has been problematic for Microsoft’s green goals. Brad Smith, Microsoft’s outspoken president, once called its carbon ambitions a “moonshot”. In May, stretching that metaphor to breaking point, he admitted that because of its AI strategy, “the moon has moved”. It plans to spend £2.5bn over the next three years on growing its AI datacentre infrastructure in the UK and this year has announced new datacentre projects around the world including in the US, Japan, Spain and Germany.

Training and operating the AI models that underpin products such as OpenAI’s ChatGPT and Google’s Gemini uses a lot of electricity to power and cool the associated hardware, with additional carbon generated by making and transporting the related equipment.

“It is a technology that is driving up energy consumption,” says Alex de Vries, the founder of Digiconomist, a website monitoring the environmental impact of new technologies.

The International Energy Agency estimates that datacentres’ total electricity consumption could double from 2022 levels to 1,000 TWh (terawatt hours) in 2026, equivalent to the energy demand of Japan. AI will result in datacentres using 4.5% of global energy generation by 2030, according to calculations by research firm SemiAnalysis.

It means that amid the concerns about AI’s impact on jobs and humanity’s future, environmental worries are surfacing too. Last week, the International Monetary Fund said governments should consider carbon taxes to capture the environmental cost of AI – either a general carbon levy whose reach extends to emissions from servers, or a more targeted tax on the CO2 generated by that equipment.

All the big tech firms involved in AI – Meta, Google, Amazon, Microsoft – are seeking renewable energy resources to meet their climate targets. In January, Amazon, the world’s largest corporate purchaser of renewable energy, announced it had bought more than half the output of an offshore windfarm in Scotland, while Microsoft said in May it was backing $10bn (£7.9bn) in renewable energy projects. Google aims to run its datacentres entirely on carbon-free energy by 2030.

A spokesperson for Microsoft said: “We remain resolute in our commitment to meet our climate goals.”

Microsoft co-founder Bill Gates, who left in 2020 but retains a stake in the company via the Gates Foundation Trust, has argued that AI can directly help fight climate change. The extra electricity demand would be matched by new investments in green generation, he said on Thursday, which would more than compensate for the use.

A recent UK government-backed report agreed, stating that the “carbon intensity of the energy source is a key variable” in calculating AI-related emissions, although it adds that “a significant portion of AI training globally still relies on high-carbon sources such as coal or natural gas”. The water needed to cool servers is also an issue, with one study estimating that AI could account for up to 6.6bn cubic meters of water use by 2027 – nearly two-thirds of England’s annual consumption.

De Vries argues that the chase for sustainable computing power puts a strain on demand for renewable energy, which would result in fossil fuels picking up the slack in other sections of the global economy.

“More energy consumption means we don’t have enough renewables to feed that increase,” he says.

Server room in a datacentre

NexGen Cloud, a UK firm that provides sustainable cloud computing – the datacentre-reliant industry that delivers IT services such as data storage and computing power over the internet – says renewable energy is available for AI-related computing if datacentres avoid cities and are sited next to sources of hydro or geothermal power.

“The industry norm has been to build around economic hubs rather than sources of renewable energy,” says Youlian Tzanev, NexGen Cloud’s co-founder.

This makes it more difficult for any AI-focused tech company to hit carbon goals. Amazon, the world’s biggest cloud computing provider, aims to be net zero – removing as much carbon as it emits – by 2040 and to match its global electricity use with 100% renewable energy by 2025. Google and Meta are pursuing the same net zero goal by 2030. OpenAI, the developer of ChatGPT, uses Microsoft datacentres to train and operate its products.

There are two key ways in which large language models – the technology that underpins chatbots such as ChatGPT or Gemini – consume energy. The first is the training phase, where a model is fed reams of data culled from the internet and beyond, and builds a statistical understanding of language itself, which ultimately enables it to churn out convincing-looking answers to queries.

The upfront energy cost of training AI is astronomical. That keeps smaller companies (and even smaller governments) from competing in the sector, if they do not have a spare $100m to throw at a training run. But it is dwarfed by the cost of actually running the resulting models, a process known as “inference”. According to analyst Brent Thill, at the investment firm Jefferies, 90% of the energy cost of AI sits in that inference phase: the electricity used when people ask an AI system to respond to factual queries, summarise a chunk of text or write an academic essay.
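The 90/10 split reported by Jefferies can be made concrete with back-of-envelope arithmetic; the total figure used below is purely illustrative, not from the article:

```python
# Back-of-envelope split of AI energy use between training and inference,
# using the Jefferies estimate that ~90% of the energy cost sits in inference.
# total_energy_twh is a hypothetical number chosen for illustration only.

total_energy_twh = 100.0   # hypothetical annual AI energy use, TWh (assumption)
inference_share = 0.90     # Jefferies: ~90% of energy goes to inference

inference_twh = total_energy_twh * inference_share
training_twh = total_energy_twh - inference_twh

print(f"inference: {inference_twh:.0f} TWh, training: {training_twh:.0f} TWh")
```

Whatever the true total, the estimate implies that the day-to-day use of chatbots consumes about nine times as much electricity as the headline-grabbing training runs.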

The electricity used for training and inference is funnelled through an enormous and growing digital infrastructure. The datacentres are filled with servers, which are built from the ground up for the specific part of the AI workload they sit in. A single training server may have a central processing unit (CPU) barely more powerful than the one in your own computer, paired with tens of specialised graphics processing units (GPUs) or tensor processing units (TPUs) – microchips designed to rapidly plough through the vast quantities of simple calculations that AI models are made of.

If you use a chatbot, as you watch it spit out answers word by word, a powerful GPU is using about a quarter of the power required to boil a kettle. All of this is being hosted by a datacentre, whether owned by the AI provider itself or a third party – in which case it might be called “the cloud”, a fancy name for someone else’s computer.
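A quick sanity check of the kettle comparison, assuming a typical UK kettle draws about 3 kW (both wattages below are ballpark assumptions, not figures from the article):

```python
# Rough sanity check: a quarter of a ~3 kW kettle is ~750 W, which is in the
# range of a single high-end datacentre GPU running flat out.

kettle_watts = 3000            # typical UK kettle draw (assumption)
gpu_watts = kettle_watts / 4   # "about a quarter of the power to boil a kettle"

# Energy for a hypothetical one-minute chatbot answer on that GPU, in watt-hours
answer_wh = gpu_watts * (1 / 60)
print(f"GPU draw: {gpu_watts:.0f} W, one-minute answer: {answer_wh:.1f} Wh")
```

On these assumptions a single minute-long answer uses roughly a dozen watt-hours on the GPU alone, before counting the cooling and networking overhead of the datacentre around it.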

SemiAnalysis estimates that if generative AI were integrated into every Google search, it could translate into annual energy consumption of 29.2 TWh – comparable with what Ireland consumes in a year – although the financial cost to the tech company would be prohibitive. That has led to speculation that the search company may start charging for some AI tools.

But some argue that looking at the energy overhead for AI is the wrong lens. Instead, consider the energy the new tools can save. A provocative paper in Nature’s peer-reviewed Scientific Reports journal earlier this year argued that the carbon emissions of writing and illustrating are lower for AI than for humans.

AI systems emit “between 130 and 1,500 times” less carbon dioxide per page of text generated than human writers, the researchers from the University of California, Irvine, estimated, and up to 2,900 times less per image.

Left unsaid, of course, is what those human writers and illustrators are doing instead. Redirecting and retraining their labour in another field – such as green jobs – could be another moonshot.

Source: theguardian.com