Artificial intelligence systems require substantial computing power and electricity to run. Data centers are currently estimated to account for 1-1.5% of global electricity use, and AI could substantially increase that demand. One analysis found that if every Google Search were handled by a ChatGPT-style chatbot, the service could consume as much electricity annually as all of Ireland.
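That comparison is easy to sanity-check with back-of-envelope arithmetic. The sketch below is illustrative only: the search volume, per-query energy, and Ireland figures are assumptions chosen to show the scale, not disclosed data.

```python
# Back-of-envelope check of the "Ireland" comparison.
# All inputs are illustrative assumptions, not measured values.

searches_per_day = 9e9   # assumed global Google search volume per day
wh_per_ai_query = 9.0    # assumed Wh per chatbot-style query (~10x a plain search)

annual_twh = searches_per_day * wh_per_ai_query * 365 / 1e12  # Wh -> TWh
ireland_twh = 29         # Ireland's annual electricity use, roughly 29 TWh

print(f"AI-assisted search: ~{annual_twh:.1f} TWh/year")
print(f"Ireland:            ~{ireland_twh} TWh/year")
# -> roughly 29.6 vs 29 TWh: the same order of magnitude
```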
Both the training phase, in which models learn from data, and the inference phase, in which they generate outputs for users, consume significant energy. Larger neural networks with more parameters tend to be more energy intensive in both phases.
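The training side can be estimated from first principles: a common rule of thumb puts training compute at about 6 floating-point operations per parameter per training token, and dividing by sustained hardware throughput yields runtime and, from there, energy. The sketch below applies the rule to GPT-3-scale numbers; the throughput, utilization, power, and overhead values are all assumptions.

```python
# Rough training-energy estimate from the ~6 * params * tokens FLOPs rule.
# Hardware figures are assumptions for illustration (V100-class GPUs).

params = 175e9        # GPT-3-scale parameter count
tokens = 300e9        # training tokens
flops = 6 * params * tokens                 # ~3.15e23 FLOPs total

peak_flops = 125e12   # assumed peak per-GPU throughput (FLOP/s)
utilization = 0.20    # assumed sustained fraction of peak
gpu_watts = 300       # assumed per-GPU power draw
pue = 1.1             # assumed data-center overhead factor

gpu_seconds = flops / (peak_flops * utilization)
energy_mwh = gpu_seconds * gpu_watts * pue / 3.6e9   # joules -> MWh

print(f"Estimated training energy: ~{energy_mwh:.0f} MWh")
# -> ~1,150 MWh, the same ballpark as published estimates of roughly 1,300 MWh
```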
Cooling can add another 50% or more to a data center's electricity bill. The type of hardware used, the complexity of the task, and where servers are located also affect energy use.
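The industry captures this overhead with the power usage effectiveness (PUE) metric: total facility energy divided by the energy used by the IT equipment alone, so a PUE of 1.5 means 50% overhead on top of the computing load. A minimal sketch:

```python
# PUE = total facility energy / IT equipment energy.
# A PUE of 1.5 means cooling and other overhead add 50% to the IT load.

def total_energy_kwh(it_load_kwh: float, pue: float) -> float:
    """Facility energy implied by an IT load and a PUE value."""
    return it_load_kwh * pue

it_load = 1000.0                    # kWh consumed by the servers themselves
for pue in (1.1, 1.5, 2.0):         # modern, average, inefficient facilities
    overhead = total_energy_kwh(it_load, pue) - it_load
    print(f"PUE {pue}: {overhead:.0f} kWh overhead per {it_load:.0f} kWh of compute")
```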
If AI adoption grows rapidly over the next few years, AI's electricity use could increase as much as tenfold globally, though more realistic projections point to steady but slower growth at first. Either way, more transparency and data are needed from tech companies on the energy use of their AI systems.
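The arithmetic gap between those trajectories is worth spelling out: a tenfold increase over five years implies roughly 58% compound annual growth. In the sketch below, the baseline, horizon, and "steady" growth rate are assumed for illustration only.

```python
# Compare a worst-case 10x-in-5-years path with a steadier growth path.
# Baseline and rates are assumed for illustration, not forecasts.

baseline_twh = 10.0                      # assumed current AI electricity use
years = 5

worst_case_rate = 10 ** (1 / years) - 1  # rate that yields exactly 10x in 5 years
steady_rate = 0.25                       # assumed "steady growth" rate

for label, rate in (("worst case", worst_case_rate), ("steady", steady_rate)):
    final = baseline_twh * (1 + rate) ** years
    print(f"{label}: {rate:.0%}/year -> {final:.0f} TWh after {years} years")
# worst case: 58%/year -> 100 TWh; steady: 25%/year -> ~31 TWh
```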
Both the training and inference phases of AI are energy intensive. Model size, data volume, hardware efficiency, and cooling needs all shape energy consumption, and more complex AI tasks require more processing and therefore more electricity.
A tenfold explosion in AI energy use is a worst-case scenario and unlikely, but significant growth seems probable. Efficiency gains may help, yet they typically spur additional demand rather than reducing net energy use, a pattern known as the rebound effect.
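The rebound arithmetic is simple: if efficiency halves the energy per query but cheaper queries triple the demand, net consumption still rises. A toy illustration with assumed numbers:

```python
# Rebound effect in miniature: efficiency halves per-query energy,
# but lower cost triples demand. All numbers are illustrative.

queries_before, wh_before = 1e9, 3.0   # assumed daily queries and Wh per query
queries_after,  wh_after  = 3e9, 1.5   # 2x efficiency, 3x demand

net_before = queries_before * wh_before / 1e6   # MWh per day
net_after  = queries_after  * wh_after  / 1e6

print(f"before: {net_before:.0f} MWh/day, after: {net_after:.0f} MWh/day")
# -> 3000 vs 4500 MWh/day: efficiency doubled, yet net use still grew 50%
```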
More transparency around AI energy use is needed. Because data availability is currently very limited, developers should be required to disclose the energy consumption of their systems; such disclosures could inform efforts to make AI more energy-efficient and sustainable.
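Until disclosure is mandated, developers can at least measure their own footprint. One option is the open-source CodeCarbon package; this sketch assumes its EmissionsTracker start/stop interface, with a placeholder standing in for a real training loop.

```python
# Measuring a run's footprint with CodeCarbon (pip install codecarbon).
# The tracker samples hardware power draw and estimates energy and emissions.
import time

from codecarbon import EmissionsTracker

def train_model():
    time.sleep(1)  # placeholder: stands in for an actual training loop

tracker = EmissionsTracker(project_name="demo-training-run")  # hypothetical name
tracker.start()
try:
    train_model()
finally:
    emissions_kg = tracker.stop()  # returns estimated kg of CO2-equivalent
    print(f"Estimated emissions: {emissions_kg:.6f} kg CO2e")
```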
AI’s potential environmental impact merits more attention alongside concerns about its accuracy, biases, and transparency. Considering sustainability issues now allows time to responsibly shape AI’s future energy footprint.