
We still don’t know how much energy AI consumes.

Companies must give us the chance to understand the environmental impact of the tech we use.

With every query, image generation and chatbot conversation, the energy
consumed by artificial intelligence models rises. Already, emissions from
the data centres needed to train and deliver AI services are estimated at
around 3 per cent of the global total, close to those of the aviation
industry.

But not all AI models use the same amount of
energy. Task-specific AI models like Intel’s TinyBERT and Hugging
Face’s DistilBERT, which simply retrieve answers from text, consume
minuscule amounts of energy — about 0.06 watt-hours per 1,000 queries.
This is equivalent to running an LED bulb for 20 seconds.
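
As a rough sanity check on that comparison (assuming a typical 10 W LED bulb, a wattage the article does not specify):

    0.06 Wh × 3,600 s/h ≈ 216 J
    10 W × 20 s = 200 J

so 1,000 such queries and 20 seconds of LED light do sit in the same energy ballpark.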

At the other extreme, large language models such as OpenAI’s GPT-4, Anthropic’s
Claude, Meta’s Llama, DeepSeek, or Alibaba’s Qwen use thousands of
times more energy for the same query. The result is like turning on stadium
floodlights to look for your keys. Why is there such an enormous difference
in energy consumption? Because LLMs don’t just find answers; they
generate them from scratch by recombining patterns from massive data sets.
This requires more time, compute and energy than an internet search.
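
Taking the article’s own figures at face value, a rough back-of-envelope comparison (the “thousands of times” multiplier is the article’s characterisation, not a precise measurement) looks like this:

    0.06 Wh ÷ 1,000 queries = 0.00006 Wh per retrieval-style query
    0.00006 Wh × 1,000 to 10,000 ≈ 0.06 to 0.6 Wh per LLM query

In other words, a single LLM response can consume roughly as much energy as a thousand or more answers from a small task-specific model.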

FT, 20 May 2025, https://www.ft.com/content/ea513c7b-9808-47c3-8396-1a542bfc6d4f
