As Google has rushed to incorporate artificial intelligence into its core products - with sometimes less-than-stellar results - a problem has been brewing behind the scenes: the systems needed to power its AI tools have vastly increased the company's greenhouse gas emissions.
AI systems require enormous amounts of computing power. The data centers that run them, essentially warehouses packed with powerful computing equipment, consume vast amounts of energy to process data and to manage the heat all of those machines produce.
The result: Google's greenhouse gas emissions have soared 48 percent since 2019, according to the tech giant's annual environmental report. The company blamed that growth mainly on "increased data center energy consumption and supply chain emissions".
Now, Google is calling its goal of reaching net-zero emissions by 2030 "extremely ambitious", and says the pledge is likely to be affected by "the uncertainty around the future environmental impact of AI, which is complex and difficult to predict". In other words: the sustainability push by the company - which once included the slogan "don't be evil" in its code of conduct - has become more complicated thanks to AI.
Google, like its Big Tech rivals, has gone all-in on AI, which is widely seen as the next major tech revolution, poised to change how we live, work and consume information. The company has integrated its Gemini generative AI technology into some of its core products, including Search and Google Assistant, and CEO Sundar Pichai has called Google an "AI-first company".
But AI comes with a major downside: power-hungry data centers, which Google and other Big Tech companies are currently spending tens of billions of dollars each quarter to expand in order to fuel their AI ambitions.
Illustrating just how much more demanding AI models are than traditional computing tasks, the International Energy Agency estimates that a typical Google search query requires 0.3 watt-hours of electricity, while a ChatGPT request consumes about 2.9 watt-hours - nearly ten times as much. An October study by Dutch researcher Alex de Vries estimated that, in a worst-case scenario of full-scale AI adoption on current hardware and software, Google's AI systems could eventually consume as much electricity each year as the country of Ireland.
"As we further integrate AI into our products, reducing emissions may be challenging due to increasing energy demands from the greater intensity of AI compute, and the emissions associated with the expected increases in our technical infrastructure investment," Google said in its report, published Monday. It added that data center electricity consumption is currently growing faster than it can bring carbon-free electricity sources online.
Google said it expects its total greenhouse gas emissions to continue to rise before falling, as the company seeks to invest in clean energy sources, such as wind and geothermal energy, to power its data centers.
The large amounts of water needed to cool data centers and keep them from overheating also present a sustainability challenge. Google says it aims to replenish 120 percent of the freshwater it consumes in its offices and data centers by 2030; last year it replenished just 18 percent of that water, though that was up sharply from six percent the year before.
Google is also among the companies experimenting with ways to use AI to fight climate change. A 2019 Google DeepMind project, for example, trained an AI model on weather forecasts and historical wind turbine data to predict the availability of wind power, helping to increase the value of the renewable energy source for wind farm operators. The company has also used AI to suggest more fuel-efficient routes to drivers using Google Maps.
"We know that scaling AI and using it to accelerate climate action is just as crucial as addressing the environmental impact associated with it," Google said in the report.
CNN