Key Takeaways
- The rapid innovation and adoption of AI are fueling a sharp increase in energy use at data centers being built or retrofitted to run power-hungry AI workloads.
- The pressure to control spiraling power consumption and related costs isn't hitting only the big cloud providers; it's also reaching enterprise data centers as businesses quickly embrace the technology.
- Data center administrators can take a range of steps, from reducing the power fed to AI system components to rethinking cooling options and AI model sizes.
- AI will also play a role in making data centers and the systems that run the AI workloads more energy-efficient.
Global power consumption by data centers was relatively stable for a number of years, with vendors rolling out increasingly energy-efficient technologies, from servers and storage systems to CPUs, GPUs, networking, and other components, to handle growing numbers of more complex workloads.
Then came OpenAI’s ChatGPT in late 2022, igniting a push for highly dense data centers that will need significantly more electricity to run these workloads. Goldman Sachs Research expects AI power consumption to drive a 50% jump in global data center energy demand between 2023 and 2027, with a 165% increase by the end of the decade.
Goldman Sachs researchers say data centers worldwide currently consume about 55 gigawatts of electricity, a figure likely to grow to 84 gigawatts by 2027 as AI workloads become a larger part of the mix.
Big numbers, big challenges
Such numbers are staggering. According to the International Energy Agency (IEA), by 2030 processing data in the United States, primarily for AI workloads, will require more electricity than power-intensive industries like steel, cement, and chemical manufacturing combined. Today, AI data centers account for 1% to 2% of worldwide energy demand; as both corporate and consumer demand increases, that share could hit 21% by 2030.
There are other culprits, such as cryptocurrency mining and electric vehicles, but AI is the key driver of these numbers.
Major cloud providers like Amazon Web Services, Microsoft, and Google are pledging to spend billions of dollars to build massive data centers to run AI workloads, which consume significantly more power than traditional applications. The average ChatGPT query, for example, eats up almost 10 times as much electricity as a Google search. AI workloads also run on servers powered by GPUs from Nvidia and others, which consume more electricity than typical data center systems because they must churn through massive amounts of data and models with billions of parameters. These servers simply do far more work, so they need far more power.
Money pouring in to fix the problem
The big cloud players are planning for the future. Amazon is promising $100 billion over 10 years to build data centers capable of meeting the increasing power requirements of AI workloads, while Google is looking at $75 billion and Meta up to $65 billion. Microsoft will spend $80 billion on AI data centers this year and is considering reopening the notorious Three Mile Island nuclear plant to power some of its data centers.
That said, you don’t have to be a hyperscaler to be concerned about the spike in data center power consumption. AI is the driver behind most of it, and enterprises are rapidly adopting the still-emerging technology with no end in sight. In a survey, McKinsey & Company found that 78% of respondents said their organizations use AI in at least one business function, and that number promises to keep climbing.
If you’re managing a corporate data center now, you may not be shopping for a mothballed nuclear power plant, but concerns about electricity use are likely on your mind. While early AI workloads may have run in the cloud, issues like data security and sovereignty, customization, and more predictable costs will keep more of them in-house.
Tapping the brakes
If that’s the case, what steps can corporations take to slow rising AI power consumption and the associated costs as enterprise use of AI workloads expands?
Here are a few ideas to consider:
- Cap power use: Instead of letting CPUs and GPUs draw 100% of their rated power, cap them at something like 60% or 80% of that limit so less power is fed to them.
- Update your equipment: Newer data center servers and other systems are more power-efficient, which cuts both energy use and cost.
- Rethink how you use AI models: AI models vary widely in parameter counts. Google's PaLM large language model (LLM) comes with 540 billion parameters, while Mistral 7B is a small language model (SLM) with a far more efficient 7 billion.
- Rethink cooling: Many facilities rely on huge AC units to keep equipment cool. However, system vendors like Dell, HPE, Lenovo, and Nvidia offer servers that use liquid coolants to absorb and dissipate heat from components, and liquid cooling can be as much as 3,000 times more efficient than air cooling.
- Optimize algorithms: As with adopting SLMs, models can be trained on fewer data points, and software can be tuned to reduce electricity consumption.
- Renewable energy: Where possible, add solar, wind, or hydroelectric power into the mix to reduce the amount of grid electricity your data center needs.
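To put the power-capping step above in concrete terms, here is a minimal back-of-the-envelope sketch in Python. The GPU count, wattage, electricity rate, and function name are hypothetical placeholders, and it simplifies by assuming the hardware would otherwise run at full rated power around the clock (real utilization varies, and capping can lengthen job runtimes):

```python
def annual_power_savings(num_gpus, tdp_watts, cap_fraction, price_per_kwh, hours_per_year=8760):
    """Estimate annual energy (kWh) and cost savings from capping GPU power.

    Simplification: assumes GPUs would otherwise run at full rated power
    continuously, and that a cap reduces draw proportionally.
    """
    full_kwh = num_gpus * tdp_watts * hours_per_year / 1000  # watts -> kWh
    capped_kwh = full_kwh * cap_fraction
    saved_kwh = full_kwh - capped_kwh
    return saved_kwh, saved_kwh * price_per_kwh

# Example: 100 GPUs rated at 700 W, capped to 80%, at a hypothetical $0.10/kWh.
kwh, dollars = annual_power_savings(100, 700, 0.80, 0.10)
print(f"Energy saved: {kwh:,.0f} kWh/year, cost saved: ${dollars:,.0f}/year")
```

Even under these rough assumptions, the arithmetic shows why a 20% cap across a modest GPU fleet adds up to six-figure kilowatt-hour savings per year.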
AI to the rescue?
Even as you contemplate these steps, there is another tool that can address the coming AI-driven data center energy crunch: AI. The technology, with all of its optimization, predictive, and problem-solving capabilities, can be focused on the challenge of data center power consumption.
According to the IEA report, energy and other sectors are already using AI to make their operations more efficient, and AI can help revamp electricity grids to more easily accept renewable energy and to balance distribution between power generated from fossil fuels and power from wind, solar, and hydroelectric sources.
Inside a data center, AI software can help mitigate AI power consumption by monitoring, analyzing, and optimizing electricity use in real time and by identifying and reducing inefficiencies. New data centers can be designed with AI's help to be highly power-efficient, and the technology can also improve the efficiency of existing facilities.
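As a toy illustration of that real-time monitoring idea, the sketch below flags power readings that spike well above a trailing moving average. The window size, threshold, and function name are arbitrary choices for the example; production AI-driven power-management tools use far richer models than this:

```python
from collections import deque

def flag_power_anomalies(readings_watts, window=5, threshold=1.3):
    """Return indices of readings exceeding `threshold` x the trailing average.

    A deliberately simple stand-in for the kind of real-time anomaly
    detection an AI-driven power-management tool performs.
    """
    history = deque(maxlen=window)  # rolling window of recent readings
    anomalies = []
    for i, watts in enumerate(readings_watts):
        if len(history) == window and watts > threshold * (sum(history) / window):
            anomalies.append(i)
        history.append(watts)
    return anomalies

# Example: a steady ~400 W load with one spike.
readings = [400, 410, 405, 395, 400, 402, 700, 405]
print(flag_power_anomalies(readings))  # -> [6], the index of the spike
```

A detector like this only surfaces problems; the optimization value comes when the flagged readings feed back into scheduling or cooling decisions.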
In a report from almost 10 years ago, Google said it used its own DeepMind machine learning capabilities to cut the energy used to cool its data centers by 40%, and today's AI, with generative models, reasoning models, and AI agents, is even more powerful.
AI can also help design data center systems and components, such as chips, to make them more performant and power-efficient. Nvidia noted in a report last year that it was using LLMs to help design semiconductors.
Sustainability at Aquent
At Aquent, power efficiency and sustainability are at the core of what we do. We’re proud to be carbon negative and are committed to erasing our historic carbon footprint. This includes reducing emissions by enabling remote work, cutting back on business travel, and moving the hosting of our recruiting software from a data center in Virginia to one in Oregon that is powered by hydroelectric energy, saving about two tons of carbon emissions annually.
Aquent also invests in clean energy projects; in 2024 and into this year we brought our two largest solar projects online, which combined can produce 8,724 MWh of electricity each year and eliminate 3,602 tons of carbon emissions, roughly a 5x increase in our carbon offsets.
We’ve also created a digital Solar Twin that helps us visualize our emissions and carbon-reducing efforts. Overall, we’ve reduced our carbon emissions by 56% since 2020. Let us know how we can help you do the same.