nuclear-news

The News That Matters about the Nuclear Industry Fukushima Chernobyl Mayak Three Mile Island Atomic Testing Radiation Isotope

AI’s insatiable energy demand is going nuclear

Yahoo Finance, Rachelle Akuffo (host), Mon, Aug 26, 2024

On the surface, the deal indicates Amazon’s ambitious expansion plans. But dig deeper, and the company’s purchase of a nuclear power facility speaks to a broader issue that Amazon and other tech giants are grappling with: the insatiable demand for energy from artificial intelligence.

In Amazon’s case, AWS purchased Talen Energy’s Pennsylvania nuclear-powered data center to co-locate its rapidly expanding AI data center next to a power source, keeping up with the energy demands that artificial intelligence has created.

The strategy is a symptom of an energy reckoning that has been building as AI has been creeping into consumers’ daily lives — powering everything from internet searches to smart devices and cars.

Companies like Google (GOOG, GOOGL), Apple (AAPL), and Tesla (TSLA) continue to enhance AI capabilities with new products and services. Each AI task requires vast computational power, which translates into substantial electricity consumption through energy-hungry data centers.

Estimates suggest that by 2027, global AI-related electricity consumption could rise by 64%, reaching up to 134 terawatt hours annually — or the equivalent of the electricity usage of countries like the Netherlands or Sweden.

This raises a critical question: How are Big Tech companies addressing the energy demands that their future AI innovations will require?

The rising energy consumption of AI

According to Pew Research, more than half of Americans interact with AI at least once a day.

Prominent researcher and data scientist Sasha Luccioni, who serves as the AI and climate lead at Hugging Face, a company that builds tools for AI applications, often discusses AI’s energy consumption.

Luccioni explained that while training AI models is energy-intensive — training the GPT-3 model, for example, used about 1,300 megawatt-hours of electricity — it typically only happens once. However, the inference phase, where models generate responses, can require even more energy due to the sheer volume of queries.

For example, when a user asks an AI model like ChatGPT a question, the request is sent to a data center, where powerful processors generate a response. This process, though quick, uses approximately 10 times more energy than a typical Google search.

“The models get used so many times, and it really adds up quickly,” Luccioni said. She noted that depending on the size of the model, 50 million to 200 million queries can consume as much energy as training the model itself.

“ChatGPT gets 10 million users a day,” Luccioni said. “So within 20 days, you have reached that ‘ginormous’ … amount of energy used for training via deploying the model.”
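Luccioni’s figures can be checked with a quick back-of-envelope calculation. A minimal sketch, using only the numbers quoted in the article (the per-query energy is derived from them, not independently measured, and it assumes one query per daily user):

```python
# Back-of-envelope check of the figures quoted above.
TRAINING_MWH = 1_300        # reported GPT-3 training energy
BREAKEVEN_QUERIES = 200e6   # upper end of Luccioni's 50-200 million range
USERS_PER_DAY = 10e6        # reported daily ChatGPT users (assume one query each)

# Implied energy per query if 200M queries match the training run (MWh -> Wh)
wh_per_query = TRAINING_MWH * 1e6 / BREAKEVEN_QUERIES

# Days of deployment needed to equal the training energy
days_to_match_training = BREAKEVEN_QUERIES / USERS_PER_DAY

print(f"implied energy per query: ~{wh_per_query:.1f} Wh")
print(f"days to match training energy: ~{days_to_match_training:.0f}")
```

At the upper end of the range this works out to roughly 6.5 Wh per query and about 20 days of use, which is consistent with Luccioni’s “within 20 days” remark.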

The largest consumers of this energy are Big Tech companies, known as hyperscalers, that have the capacity to scale AI efforts rapidly with their cloud services. Microsoft (MSFT), Alphabet, Meta (META), and Amazon alone are projected to spend $189 billion on AI in 2024.

As AI-driven energy consumption grows, it puts additional strain on the already overburdened energy grids…
https://finance.yahoo.com/news/ais-insatiable-energy-demand-is-going-nuclear-143234914.html


August 26, 2024 - Posted by | ENERGY
