© 2024 AIDIGITALX. All Rights Reserved.

The Growing Energy Demands of AI and Data Centers

Data centers that power AI systems already account for an estimated 1-1.5% of global electricity use, and continued growth in AI adoption could push demand substantially higher. One estimate suggests that the 1.5 million AI servers NVIDIA is projected to ship by 2027 could consume over 85 terawatt-hours annually, more than many small countries use in a year.
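As a rough back-of-the-envelope check on that estimate, the 1.5 million servers and 85 TWh figures imply an average power draw per server (assuming, for simplicity, continuous operation year-round):

```python
# Back-of-the-envelope: implied average power draw per AI server,
# assuming continuous operation (8,760 hours per year).
SERVERS = 1_500_000        # projected AI server shipments by 2027 (per the estimate above)
ANNUAL_TWH = 85            # estimated annual consumption, terawatt-hours
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

annual_wh = ANNUAL_TWH * 1e12                         # TWh -> Wh
per_server_watts = annual_wh / SERVERS / HOURS_PER_YEAR
print(f"~{per_server_watts / 1000:.1f} kW per server")  # -> ~6.5 kW per server
```

A sustained draw of roughly 6.5 kW is plausible for a multi-GPU AI server, which is what makes the aggregate figure comparable to a small country's consumption.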
Illustration: AI energy consumption

Artificial intelligence systems require enormous computing power, and therefore electricity, to run. Data centers currently use an estimated 1-1.5% of global electricity, and AI could substantially increase that demand. One analysis found that if Google Search were converted into a chatbot like ChatGPT, it would consume as much electricity as all of Ireland.


Both the training phase, where models learn from data, and the inference phase, where they generate outputs, consume substantial energy. Larger neural networks with more parameters tend to be more energy-intensive.

Cooling data centers can add another 50% or more to electricity costs. The type of hardware used, the complexity of the task, and the location of the servers also affect energy use.
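That cooling overhead is usually expressed as a multiplier on the IT (compute) load, in the spirit of the industry's power usage effectiveness (PUE) metric. A minimal sketch, where the 50% overhead comes from the figure above and the 10 MW IT load is a hypothetical example:

```python
# Hypothetical example: total facility electricity once cooling is included.
# A 50% cooling overhead corresponds roughly to a PUE of 1.5.
IT_LOAD_MW = 10.0        # hypothetical compute load of a data center, megawatts
COOLING_OVERHEAD = 0.5   # cooling adds 50%+ on top of IT electricity (per the article)

total_mw = IT_LOAD_MW * (1 + COOLING_OVERHEAD)
print(f"IT load: {IT_LOAD_MW} MW, total with cooling: {total_mw} MW")
# -> IT load: 10.0 MW, total with cooling: 15.0 MW
```

The same multiplier applies to any facility size, which is why siting servers in cooler climates or using more efficient cooling can meaningfully cut total consumption.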

If AI adoption grows rapidly over the next few years, its electricity use could increase as much as tenfold globally, though more realistic projections show steady but slower growth at first. Greater transparency and better data are needed from tech companies on the energy use of their AI systems.

All phases of the AI lifecycle are energy-intensive. Factors such as model size, data volume, hardware efficiency, and cooling needs all influence consumption, and more complex AI tasks require more processing and therefore more electricity.


A tenfold explosion in AI energy use is unlikely outside a worst-case scenario, but significant growth seems probable. Efficiency gains may help, yet they typically spur more demand rather than reducing net energy use, a rebound effect known as Jevons paradox.

More transparency around AI energy use is needed. Developers should be required to disclose energy consumption, since data availability is currently very limited. Such disclosure could help inform efforts to make AI more energy-efficient and sustainable.

AI’s potential environmental impact merits more attention alongside concerns about its accuracy, biases, and transparency. Considering sustainability issues now allows time to responsibly shape AI’s future energy footprint.


Jessica Wong

Jessica Wong is a data scientist and author with a flair for demystifying AI concepts, known for making complex topics accessible and for working to bridge the AI knowledge gap.