As tech giants including Meta, Amazon, and Google accelerate their generative AI deployments, a new energy analysis finds that artificial intelligence is poised to become a leading driver of electricity demand growth across North America.
The massive data centers powering these AI systems—housing thousands of computers that handle everything from training complex models to processing user requests for tools like ChatGPT and Sora—are consuming staggering resources. Beyond substantial electricity demands measured in megawatts, these facilities also require millions of gallons of water and occupy thousands of acres of land.
According to DNV's energy transition report, global data center energy use is projected to quintuple by 2040, reaching 5% of worldwide electricity consumption. AI-focused data centers will account for more than half of that growth, and North American facilities alone are expected to consume 12% of the region's electricity by 2040.
The report indicates that despite rapid AI expansion, the global energy transition continues to progress too slowly to meet Paris Agreement climate targets for achieving net-zero emissions and preventing dangerous warming this century.
Recent policy developments have further complicated the landscape. The Trump administration’s America’s AI Action Plan emphasized expedited data center construction, noting that “America’s environmental permitting system and other regulations make it almost impossible to build this infrastructure in the United States with the speed that is required.”
Despite regulatory changes in the US, DNV forecasts global emissions will decline by 63% by 2060, though US-specific emissions reductions may be delayed by approximately five years. The analysis suggests that even as US policy shifts ripple through global energy markets, "massive scale decarbonization of the Chinese economy continues, coupled with low-cost electro-technology exports from China to other regions," helping propel the worldwide clean energy transition.
Looking ahead, the report expects the surge in AI's power demand to taper: "We find that AI's initial exponential growth in power demand will give way to a more linear pattern over time." Even by 2040, AI's electricity requirements will remain smaller than those for electric vehicle charging or space cooling globally.