Artificial intelligence (AI) is becoming increasingly important in our daily lives, but its rapid expansion is raising concerns about how much energy it consumes. Google has become the first tech company to publish a report on the energy consumption, emissions, and water use of its AI software, Gemini.
Google estimates that the median Gemini text prompt uses 0.24 watt-hours (Wh) of energy, emits 0.03 grams of CO2 equivalent, and consumes 0.26 milliliters (or about five drops) of water. “The per-prompt energy impact is equivalent to watching TV for less than nine seconds,” according to a press release from the company.
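Google's TV comparison can be sanity-checked with simple arithmetic. The sketch below assumes a TV drawing roughly 100 W, a typical figure for a mid-size LED set; that wattage is an assumption, not a number from Google's report.

```python
# Check the comparison: 0.24 Wh per prompt vs. "less than nine seconds" of TV.
PROMPT_WH = 0.24   # median Gemini text prompt, per Google's report
TV_WATTS = 100     # assumed TV power draw (not from the report)

# Convert watt-hours to seconds of operation at TV_WATTS:
# energy (Wh) / power (W) gives hours; multiply by 3600 for seconds.
tv_seconds = PROMPT_WH / TV_WATTS * 3600
print(f"{tv_seconds:.2f} s of TV")  # ~8.64 s, consistent with "less than nine seconds"
```

At 100 W the figure works out to about 8.6 seconds, so the company's "less than nine seconds" claim is internally consistent.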
Under a narrower, non-comprehensive methodology that counts only the consumption of active TPU and GPU chips, the median Gemini text prompt uses 0.10 Wh of energy, emits 0.02 gCO2e, and consumes 0.12 mL of water.
Google's comprehensive methodology, by contrast, also accounts for the energy and water used by the software itself, the operation of IT equipment in data centers, the energy drawn by chips while idle, and the water used to cool that equipment.
It should be noted, however, that energy consumption depends on multiple factors, including prompt length, the number of users, and the model’s efficiency.
Google’s AI is becoming increasingly efficient thanks to innovations
Google claims that its consumption of energy and water for AI is “substantially lower than many public estimates.” It also stresses that its AI systems are becoming more efficient through research innovations and software and hardware efficiency improvements.
Over a recent 12-month period, the energy and total carbon footprint of the median Gemini Apps text prompt fell by factors of 33 and 44, respectively, while delivering higher-quality responses, the company claims.
Google has announced that it will continue investing in technologies that reduce per-prompt energy and water use, as well as emissions associated with AI systems. By 2030, the company aims to achieve net-zero emissions and to replenish 120% of the freshwater consumed in its data centers and offices.
However, despite these efforts, Google’s overall emissions have soared 51% compared to 2019, driven by the expansion of the data center capacity needed to train and run AI models.
By 2030, data centers could be consuming 4.5% of global electricity generation
Data centers are essential for the operation of AI systems, and the International Energy Agency (IEA) estimates that their total electricity consumption could double by 2026, reaching 1,000 TWh per year, equivalent to Japan’s entire annual electricity use.
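The IEA projection also implies a baseline: if data-center electricity use could double by 2026 to reach 1,000 TWh per year, the starting point is around 500 TWh per year. This is simple arithmetic on the figures quoted above, not an additional IEA number.

```python
# Derive the baseline implied by the IEA projection quoted in the article.
PROJECTED_2026_TWH = 1000  # total data-center use the IEA says could be reached
GROWTH_FACTOR = 2          # "could double by 2026"

baseline_twh = PROJECTED_2026_TWH / GROWTH_FACTOR
print(f"implied current consumption: {baseline_twh:.0f} TWh/year")  # 500 TWh/year
```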
According to research firm SemiAnalysis, the expansion of AI could lead to data centers using 4.5% of total global electricity generation by 2030.