AI’s energy demand is projected to grow by an average of 70% annually through 2027, driven primarily by the expansion of data centers.
Artificial intelligence is permeating ever more areas of our lives, from intelligent assistants to automated driving systems, and the technology is advancing rapidly. But this progress comes at a cost: energy. The computing power required to sustain AI’s rise is doubling every 100 days.
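As a back-of-the-envelope check on the scale of that claim, the sketch below converts a doubling period of 100 days into an implied annual growth multiple. The 100-day figure comes from the text; the conversion itself assumes steady exponential growth.

```python
# Implied annual growth when compute demand doubles every 100 days,
# assuming steady exponential growth: 2 ** (365 / 100).
DOUBLING_PERIOD_DAYS = 100
annual_multiplier = 2 ** (365 / DOUBLING_PERIOD_DAYS)
print(f"Compute demand grows roughly {annual_multiplier:.1f}x per year")
# -> roughly 12.6x per year
```

In other words, a 100-day doubling period implies demand multiplying by more than an order of magnitude each year, which is why the energy question is urgent.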
Managing AI’s energy demand is critical for a sustainable future. Capping energy use during the training and inference phases of AI models could cut consumption by 12% to 15%. This is only one of the steps needed for AI to develop sustainably.
The role of data centers
Data centers that support AI workloads are consuming ever more energy. Training large language models (LLMs), for example, can take months and cost millions of dollars. An AI data center consumes four times more electricity than a data center hosting conventional cloud applications.
To meet this surging demand, energy providers, especially regulated utilities, will likely focus on developing renewable energy and storage projects.
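To see what 70% annual growth compounds to, here is a minimal sketch. The 70% rate and the 2027 horizon come from the text; the three-year window (e.g. 2024 to 2027) is an assumption.

```python
# Cumulative effect of 70% annual growth over a three-year window
# (three years is an assumed horizon, e.g. 2024 -> 2027).
ANNUAL_GROWTH = 0.70
YEARS = 3
cumulative = (1 + ANNUAL_GROWTH) ** YEARS
print(f"Demand multiplies by about {cumulative:.1f}x over {YEARS} years")
# -> about 4.9x over 3 years
```

Under those assumptions, AI's energy demand would roughly quintuple over the period, which helps explain the focus on new renewable and storage capacity.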
Source: https://www.cioupdate.com.tr/teknoloji/yapay-zeka/yapay-zeka-20/