How AI’s energy hunger upends IT’s procurement strategy

A new whitepaper released last week by the Electric Power Research Institute (EPRI) quantifies the exponential growth potential of AI power requirements. The 35-page report, titled “Powering Intelligence: Analyzing Artificial Intelligence and Data Center Energy Consumption,” projects that electricity consumption by U.S. data centers alone could grow by as much as 166% by 2030, more than doubling from 2023 levels.

According to EPRI, the demand is being driven largely by generative AI, which can require an order of magnitude more power per query than traditional search, and that’s before accounting for images and other rich content: “At 2.9 watt-hours per ChatGPT request, AI queries are estimated to require 10x the electricity of traditional Google queries, which use about 0.3 watt-hours each; and emerging, computation-intensive capabilities such as image, audio, and video generation have no precedent.”

EPRI: Energy Use per Model. Image Credit: EPRI Report

The report compares energy use across several use cases, including traditional Google search, ChatGPT, BLOOM and AI-powered Google search. Among these, ChatGPT was the least energy-intensive of the AI-based queries. However, the researchers anticipated the integration of Google’s AI capabilities into Google Search, noting the energy per query could be roughly two to three times higher than ChatGPT’s: “If Google integrated similar AI into its searches, the electricity per search could increase to between 6.9–8.9 Wh.”
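To make the per-query comparison concrete, here is a minimal back-of-the-envelope sketch in Python using the watt-hour figures cited above. The one-billion-queries-per-day volume is a purely illustrative assumption, not a figure from the EPRI report.

```python
# Per-query energy figures cited in the EPRI report (watt-hours per query).
WH_PER_QUERY = {
    "traditional Google search": 0.3,
    "ChatGPT request": 2.9,
    "AI-powered Google search (low estimate)": 6.9,
    "AI-powered Google search (high estimate)": 8.9,
}

BASELINE = WH_PER_QUERY["traditional Google search"]
QUERIES_PER_DAY = 1_000_000_000  # hypothetical volume, for illustration only

for name, wh in WH_PER_QUERY.items():
    ratio = wh / BASELINE                     # multiple of a traditional search
    mwh_per_day = wh * QUERIES_PER_DAY / 1e6  # Wh -> MWh
    print(f"{name}: {wh} Wh/query ({ratio:.1f}x baseline), "
          f"~{mwh_per_day:,.0f} MWh/day at {QUERIES_PER_DAY:,} queries/day")
```

At that scale, the roughly 10x per-query gap works out to a few thousand additional megawatt-hours per day, which is why per-query efficiency matters so much at the data center level.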


An emerging supply constraint

EPRI developed four distinct forecasts for potential electricity usage in U.S. data centers between 2023 and 2030, based on various annual growth scenarios: low (3.7%), moderate (5%), high (10%), and higher (15%). Under the higher growth scenario, data center electricity usage could rise to 403.9 TWh/year by 2030, a 166% increase from 2023 levels. Even the low growth scenario projects a 29% increase to 196.3 TWh/year.

EPRI U.S. Data Center Energy Consumption Projections, 2023-2030. Image Credit: EPRI Report
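The scenario arithmetic itself is straightforward compound growth. The sketch below reproduces it under one stated assumption: the 2023 baseline is back-calculated from the report’s low-scenario figure (196.3 TWh in 2030 at 3.7% per year), since the exact starting value isn’t quoted above.

```python
# EPRI's four annual growth scenarios for U.S. data center electricity use.
SCENARIOS = {"low": 0.037, "moderate": 0.05, "high": 0.10, "higher": 0.15}
YEARS = 2030 - 2023  # seven years of compounding

# Back-calculate the 2023 baseline from the low-scenario 2030 figure (~152 TWh).
BASELINE_2023_TWH = 196.3 / (1 + SCENARIOS["low"]) ** YEARS

for name, cagr in SCENARIOS.items():
    projected_twh = BASELINE_2023_TWH * (1 + cagr) ** YEARS
    increase_pct = (projected_twh / BASELINE_2023_TWH - 1) * 100
    print(f"{name:>8} ({cagr:.1%}/yr): ~{projected_twh:.0f} TWh in 2030 "
          f"(+{increase_pct:.0f}% vs. 2023)")
```

Running this reproduces the report’s endpoints to within rounding: roughly 196 TWh (+29%) in the low case and just over 400 TWh (+166%) in the higher case.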

The uneven geographic distribution of this growth creates localized challenges. Fifteen states accounted for 80% of the national data center load in 2023, with Virginia alone comprising 25%. Projections show data centers could account for as much as 46% of Virginia’s total electricity consumption by 2030 under the higher growth scenario. In other states, including Oregon, Iowa, Nebraska, North Dakota and Nevada, data centers are also projected to account for 20% or more of total electricity demand.

Different types of data centers are contributing to this growth. Enterprise data centers, owned and operated by individual companies for their own use, account for 20-30% of the total load. Co-location centers, where businesses rent shared space and infrastructure, and hyperscale centers built by cloud giants like Amazon, Google and Microsoft, together account for 60-70% of load. Hyperscale centers in particular are at the forefront of energy innovations given their immense scale, with new centers being built with capacities from 100 to 1000 megawatts, equivalent to the load of 80,000 to 800,000 homes.
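The homes comparison implies a simple rule of thumb of roughly 1.25 kW of average demand per home (100 MW divided by 80,000 homes). The short sketch below treats that implied figure as an assumption rather than an EPRI number.

```python
# Average household demand implied by the article's ratio (100 MW ~ 80,000 homes).
AVG_HOME_KW = 100_000 / 80_000  # = 1.25 kW per home, an inferred rule of thumb

def homes_equivalent(capacity_mw: float) -> int:
    """Number of average homes whose combined load matches a given data center capacity."""
    return round(capacity_mw * 1_000 / AVG_HOME_KW)

for mw in (100, 500, 1_000):
    print(f"{mw:>5} MW data center ~ {homes_equivalent(mw):,} homes")
```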

Flipping the script on data center procurements

As the demand for AI-powered applications soars, enterprises are scrambling to secure the latest GPU-equipped servers from vendors like Nvidia. However, getting your hands on these cutting-edge machines is only half the battle. Even if you manage to procure the hardware on time, the power requirements for these energy-hungry systems are becoming an increasingly pressing concern. The race to adopt AI isn’t just about acquiring the right hardware, data, and models; it’s also about securing the data center capacity to run them. In that respect, we’re back to 1999 and the dotcom boom.

In this environment, enterprises will need to start thinking more like their hyperscale competitors. Companies like Amazon, Google and Microsoft have long understood the importance of securing long-term data center capacity to support their ambitious growth plans. They often negotiate multi-year contracts with power providers, facilities operators, and contract manufacturers to lock in the resources they need to scale.

For enterprises, this may require a fundamental shift in how they approach data center procurement. Historically, many companies have relied on a “three bids and a buy” model, issuing RFPs and selecting the lowest-cost provider for each project. But in a world where data center capacity is increasingly constrained, and infrastructure equipment is a hot commodity, this approach may no longer be viable.

Instead, enterprises may need to start forging longer-term partnerships with data center and equipment providers, committing to a certain level of capacity over an extended period in exchange for guaranteed supply. This kind of supply chain agreement is already becoming more common in the industry, with some data center suppliers reportedly moving away from the traditional RFP process altogether.

“The data center equipment suppliers, many of them, aren’t even answering RFPs as much,” one industry executive told us on the condition of anonymity. “They’re moving towards a model where they deliver a certain capacity each month or quarter, and the company contracts that supply. Ten years ago 100% of our revenue was three bids and a buy. Today it’s 25%.”

For many enterprise IT leaders, this shift toward capacity contracting will demand a new level of strategic thinking and long-term planning. It won’t be easy: it will require close collaboration between IT, facilities, and finance teams, as well as a willingness to make significant upfront investments in infrastructure that may not pay off for years to come. But for enterprises that are serious about competing in an AI-driven future, it may be the only way to ensure they have the resources they need to stay ahead of the curve.





Author: James Thomason
Source: Venturebeat