AI uses half a liter of water to write a 100-word email


One study estimated that generating a 100-word email with artificial intelligence requires a data center to use 519 ml of water, nearly a small bottle’s worth, and 0.14 kWh of electricity, enough to keep 14 LED bulbs lit for an hour.

The calculations were carried out by researchers at the University of California, Riverside, in partnership with The Washington Post, and were based on a chatbot running OpenAI’s GPT-4 model.

This isn’t the first time a study has highlighted how many resources AI models use. Researchers have previously pointed out that generating 1,000 images can consume up to 11.49 kWh, enough to charge 950 smartphones. Companies are also struggling with rising carbon emissions.

The amount of water and energy a data center consumes varies with a number of factors. Cooling systems work harder in hotter climates, for example, and where little water is available, air conditioning takes its place, which drives up electricity consumption.

Google data center; the company has pledged to reach net-zero emissions by 2030 (Image: Handout/Google)

AI uses enough water to supply an entire state

To give an idea of the impact this can have, the publication scaled the figures up, using the same task: a 100-word email.

Considering one email per week, over the course of a year the cost is:

  • 27 liters of water, more than a full 20-liter water-cooler jug.
  • 7.5 kWh of electricity, roughly what 9.3 homes in Washington, D.C., the capital of the United States, consume in one hour.

When you multiply this weekly email by 16 million people (about 10% of working Americans), the cost is (see the arithmetic sketched after this list):

  • 435 million liters, enough to supply all the homes in the state of Rhode Island, the smallest in the USA, for a day and a half.
  • 121 thousand MWh (121 million kWh) of electricity, the equivalent of what all the homes in the US capital consume in 20 days.
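
The scale-up follows directly from the per-email figures. Below is a minimal sketch, assuming 52 emails per person per year and exactly 16 million people (both round assumptions of mine, not the study’s); it lands close to the article’s totals, with small differences down to rounding and the exact population the Post used.

```python
# Sketch of the scale-up arithmetic behind the figures above.
# Assumptions (mine, not the study's): 52 emails per person per year
# and exactly 16 million people.

WATER_PER_EMAIL_L = 0.519     # 519 ml of water per 100-word email
ENERGY_PER_EMAIL_KWH = 0.14   # 0.14 kWh per 100-word email
EMAILS_PER_YEAR = 52
PEOPLE = 16_000_000

# One person, one email per week, for a year
water_per_person = WATER_PER_EMAIL_L * EMAILS_PER_YEAR      # ~27 L
energy_per_person = ENERGY_PER_EMAIL_KWH * EMAILS_PER_YEAR  # ~7.3 kWh

# Scaled to 16 million people
water_total = water_per_person * PEOPLE    # ~432 million liters
energy_total = energy_per_person * PEOPLE  # ~116 million kWh (~116,000 MWh)

print(f"Per person, per year: {water_per_person:.0f} L and {energy_per_person:.1f} kWh")
print(f"For 16 million people: {water_total / 1e6:.0f} million L "
      f"and {energy_total / 1e6:.0f} million kWh")
```

Running it gives roughly 27 L and 7.3 kWh per person, and about 432 million liters and 116 million kWh across the 16 million people, in line with the rounded figures above.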

Scientists have also estimated the water used to train AI models (a quick sanity check of these comparisons appears after the list).

  • Microsoft’s data centers reportedly consumed 700,000 liters of water to train GPT-3, the equivalent of the water used to produce 45 kg of beef, roughly twice what an American eats in a year.
  • Meta’s data centers reportedly consumed 22 million liters of water to train Llama 3, the equivalent of the water used to produce two tons of rice, roughly what 164 Americans eat in a year.
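
As a quick sanity check, the per-person consumption implied by these comparisons can be backed out directly. The sketch below only rearranges the article’s own numbers and is not independent consumption data.

```python
# Sanity check of the training-water comparisons above. The beef and rice
# masses come from the article; the per-person figures are simply what those
# comparisons imply.

BEEF_KG = 45       # water to train GPT-3 ~ water to produce 45 kg of beef
RICE_KG = 2_000    # water to train Llama 3 ~ water to produce two tons of rice

beef_per_american = BEEF_KG / 2      # "roughly twice what an American eats in a year"
rice_per_american = RICE_KG / 164    # "roughly what 164 Americans eat in a year"

print(f"Implied beef consumption: ~{beef_per_american:.1f} kg per person per year")  # ~22.5 kg
print(f"Implied rice consumption: ~{rice_per_american:.1f} kg per person per year")  # ~12.2 kg
```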

What companies say

The Washington Post reached out to major artificial intelligence companies. Google said it was sticking to its ambitious goals, which include achieving net-zero carbon emissions by 2030.

OpenAI acknowledges that AI can be energy-intensive and says it is constantly working to improve efficiency. Meta, for its part, says it operates its data centers sustainably and efficiently while ensuring access to its services.

Microsoft also reinforced its commitment to reducing resource use. The company says it is working on cooling methods designed to eliminate water consumption entirely.

Source: The Washington Post.
