According to a May 2024 outlook published by Goldman Sachs, AI implementations are expected to drive up to a 160% increase in data center power demand,1 underscoring the urgency of managing this growth as the race for resources heats up. The IEA estimates that data centers consumed 460 TWh of electricity globally in 2022, about 2% of all generated power, and that figure is expected to double by 2026.2 The reason is clear: AI workloads require far more compute than other forms of processing, as power-hungry GPUs labor to meet growing demand.
In 2024, the need for more efficient strategies became clear. In 2025, we will see those strategies put into practice. Already there are some big moves and bold plans on the table, changes in data center builds that will power cloud compute to the next level.
The AI drivers—big compute goes small
The spread of AI’s applications into every facet of personal and professional life has been breathtaking. I could only compare it to the earliest days of the World Wide Web, our first introduction to the global internet in the late 1990s. At first a curiosity, alternately hyped and dismissed, the internet became integral to modern life in record time. It’s said that the telephone took 50 years to become a common household fixture; the internet took about 20. AI looks poised to do the same in a fraction of that time as it quickly finds new applications in the enterprise space, the vast majority of which will be supported by data centers.
The number of inventive enterprise uses for AI is going parabolic; we have barely scratched the surface of AI’s impact on commerce, science and society itself. Ironically, the biggest innovation in decades is making its influence felt in ever-smaller ways throughout the enterprise space.
Building is booming
The biggest names in tech are building like never before, bending their 10-year CapEx averages ever higher as the gold rush-like race to AI compute gains steam. It’s not only the technology of AI that’s evolving, but also the delivery model; AI as a service is paving a smooth road for enterprise adoption of AI capabilities, particularly generative AI that can fill multiple roles from customer service to long-term financial planning. Indeed, data centers themselves are increasingly making use of GenAI to address the persistent lack of skilled IT employees by using AI to monitor, manage and support lean IT teams so they can be more productive. With an intuitive way to ask questions and receive recommendations, a less-advanced IT team can punch above its weight and relieve some of the labor stresses data centers face.
With these buildouts, reliable access to sufficient power remains a challenge. Data centers draw a growing percentage of generated power worldwide, a trend that will continue for the foreseeable future; they could account for as much as 44% of increased electrical demand through 2028, according to Bain & Company analysis cited in a recent Utility Dive report.3 The scarcity of excess energy supply in most areas is pushing new data center builds to new and sometimes unexpected locations, either to secure proximity to affordable power generation sources or to lease dedicated grid power to ensure supply. And we’ve all seen the stories of data centers’ recent embrace of dedicated nuclear power generation to support their growth.4 We expect to see even more of this in 2025 and beyond.
The choice of nuclear is a logical one; the source is stable, scalable and relatively sustainable compared to fossil fuel-driven sources. At the same time, data centers are doing what they can to reduce energy consumption—both as a matter of economics and environmental responsibility—by deploying water cooling systems in place of less-efficient forced air cooling. As the scale of GPU-powered AI compute rises, these efficiencies will become more apparent, as will the benefits of increased network uptime, as excessive heat is a prime culprit in outages and premature component failure.
Shrinking the profile of infrastructure
Related to both power and cooling needs, the data center’s fiber infrastructure continues to grow denser in AI compute facilities. GPUs in AI arrays must be fully networked—every GPU must be able to talk to every other GPU—which increases complexity by an order of magnitude and complicates cooling. To manage the sheer bulk of the required fiber infrastructure, data centers will turn to highly dense fiber systems to make those countless connections, packing more fibers and connectors into the existing footprint to power their AI networks.
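As a back-of-the-envelope illustration (not from the article), the reason fiber counts balloon is simple combinatorics: the number of distinct pairwise connections grows roughly with the square of the GPU count. Real AI fabrics use switched topologies rather than literal point-to-point meshes, but the pairwise count shows why the cabling problem scales so aggressively:

```python
def full_mesh_links(n: int) -> int:
    """Distinct pairwise links needed for every GPU to reach
    every other GPU directly: n choose 2 = n * (n - 1) / 2."""
    return n * (n - 1) // 2

# Link counts explode as clusters grow from a single server
# (8 GPUs) toward hyperscale AI training pods.
for gpus in (8, 64, 512, 1024):
    print(f"{gpus:>5} GPUs -> {full_mesh_links(gpus):>7} pairwise links")
```

Eight GPUs need only 28 pairwise links, but 1,024 GPUs would need over half a million, which is why switched fabrics and ultra-dense fiber systems are essential at scale.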
By consolidating more compute resources into fewer racks, data centers can reduce energy use and simplify cooling as well. Plus, as hyperscale data centers migrate from 2x400G (aggregate 800G) to native 800G, this advanced fiber infrastructure will provide much-needed pathway capacity to accommodate the demand yet to come.
Multitenant data centers—standardization and flexibility
I’ve spent a lot of time looking at the largest hyperscale data centers and their licensed AI-as-a-service model as it relates to the enterprise, but there’s another important side of the business to consider in 2025: how multitenant data centers (MTDCs) will forge a way forward for their enterprise customers. Whatever their vertical, enterprise needs are changing fast, and MTDCs must remain flexible enough to accommodate them.
Here, too, a standardized approach to denser fiber infrastructure is key because it reduces IT staff demands and simplifies configuration changes. Several top manufacturers of fiber infrastructure are launching or improving simpler, more plug-and-play technologies to help all data centers, particularly MTDCs, flatten the skill curve required to be as agile and responsive as possible, maintaining SLAs even with leaner IT teams.
2025 will be 2024—only more so
The fundamental changes coming to data centers in this dawn of the AI age will be truly remarkable. From location to scale, hyperscale facilities and MTDCs alike will need to scale up their fiber capabilities while scaling down their fiber’s physical profile, adopt new cooling technologies, and take a fresh look at how they buy and use electrical power. Unfortunately, there is no end in sight to the ongoing shortage of top-skilled IT expertise, but AI itself is already demonstrating how it can help operators fill those gaps with GenAI-powered monitoring and management.
As AI continues to make inroads in the enterprise space, data centers will be called upon to supply the massive compute required to turn promise into practical business benefits. Like AI, data centers will innovate and adapt to meet changing needs and deliver the optimal solutions that this fast-growing industry needs.
This article was first published in Data Center Knowledge.
1. Goldman Sachs Insights (May 2024). AI is poised to drive 160% increase in data center power demand. https://www.goldmansachs.com/insights/articles/AI-poised-to-drive-160-increase-in-power-demand
2. International Energy Agency (January 2024). Electricity 2024: Analysis and forecast to 2026. https://iea.blob.core.windows.net/assets/6b2fd954-2017-408e-bf08-952fdd62118a/Electricity2024-Analysisandforecastto2026.pdf
3. Robert Walton (October 2024). AI, data center load could drive ‘extraordinary’ rise in US electricity bills: Bain analyst. Utility Dive. https://www.utilitydive.com/news/data-center-load-growth-us-electricity-bills-bain/730691/
4. David Chernicoff (April 2023). Virginia Data Center Project Plans to Transition to Small Modular Reactors. Data Center Frontier. https://www.datacenterfrontier.com/data-center-site-selection/article/33003477/virginia-data-center-project-plans-to-transition-to-small-modular-reactors
© 2024 CommScope, LLC. All rights reserved. CommScope and the CommScope logo are registered trademarks of CommScope and/or its affiliates in the U.S. and other countries. For additional trademark information see https://www.commscope.com/trademarks. All product names, trademarks and registered trademarks are property of their respective owners.