If we'd had this conversation at this time last year, just as OpenAI released ChatGPT into the wild, I might have said the collective fervor around generative AI was nearly as overhyped as the metaverse was in 2021. A year later, it's become clear that AI technology is revolutionary and that it will have a dramatic effect on data center construction and deployments, and on network architecture design in general.
The reality is that our ever-expanding consumption of data and appetite for cloud-based services, accelerated by the glitzy promises of generative AI, mean that robust and secure data storage, faster data transfer, higher compute intensity, and greater data center efficiency have never been more critical. I've said it since enterprises first started to move to the cloud: if there's one thing we know for certain, it's that our dependence on data centers is only going to increase.
Here are four key trends I believe will continue to impact and influence data center operations as we head into 2024.
1. No surprise, generative AI will drive aggressive and expanded data center build-out
According to Synergy Research Group, hyperscale data center footprints are expected to grow threefold over the next six years to accommodate the demands of generative AI. One need only consider the billions of dollars committed recently by several tech giants toward the cultivation of more advanced AI capabilities to understand how seriously the industry views AI's potential.
Speed to market—how quickly hyperscalers, enterprises, and operators can get data centers up and running—will continue to be equal parts challenge and competitive advantage. Faced with enormous growth in data and the significantly higher compute demands of generative AI applications and workloads, organizations must broadly rethink how they plan, design, and construct new facilities (or refurbish existing ones), not only to meet today's increased demand but to anticipate what that demand will look like in the months and years ahead.
But here’s the catch...
2. Availability of labor and power will hinder data center construction
Going into 2023, a major concern for data center operations was supply chain delays, which led to shortages of chips and other foundational products and raw materials. While those challenges have largely abated, they've now given way to shortages of labor and power—with seemingly no light at the end of the fiber-optic cable, if you will.
Macroeconomic conditions have driven central banks to raise interest rates globally, fundamentally changing the economics of data center construction and further spiking the cost of construction labor, which was already in short supply. Operational labor—the skilled jobs needed to manage and maintain data centers—also remains a challenge, but not nearly at the same urgency as the hundreds or even thousands of workers across dozens of trades required to take a 100,000-square-foot facility from land and raw materials to ribbon cutting.
New AI deployments consume significantly more power per rack. That makes it harder for existing data centers to accommodate AI clusters and harder to identify new locations that can support the increased power draw. The new power plants—and the infrastructure to deliver and manage that power for these higher-consuming data centers—further complicate site selection and regulatory approval.
While increased productivity and efficiency are hallmarks of generative AI, it will likely have the opposite effect on the construction, labor, and power fronts. As generative AI use and density increase, the labor and power needed to keep pace with the physical installation and maintenance of essential components—new chips, servers, and cabling—will escalate considerably.
3. Sustainability will save or sink data center deployments
With heightened boardroom awareness of companies' climate impacts, enterprises must recognize and implement highly efficient and sustainable practices across all three stages of the data center lifecycle—site selection, construction, and operation—while avoiding material cost increases to day-to-day operations. Anything that draws even a milliwatt of power must be optimized, from chips, servers, switches, and cooling systems to the vending machine in the break room. It's a challenge increasingly aggravated by compute-heavy generative AI workloads.
Indeed, you may have come across recent stories of metro areas showing trepidation at the prospect of new data center proposals and all the electricity, land, and water impacts that can accompany them. The current environment is one that values ecological and social responsibility as much as economic prosperity. While plenty of companies have invested resources to address their direct carbon footprint, we'll continue to see increased scrutiny in 2024 on indirect footprints as well, from supply chain impacts all the way down to the carbon declarations of individual products and materials purchased for data center operations.
That said, current macroeconomic headwinds continue to put enormous pressure on enterprise leaders to perform a delicate dance: implement sustainable practices and processes without increasing costs or, most importantly, impacting profitability. According to PricewaterhouseCoopers' 2021 Global Investor ESG Survey, 81% of investors would accept no more than one percentage point less in investment returns in pursuit of ESG goals, with nearly half (49%) unwilling to accept any reduction in returns.
4. The global regulatory tug of war on data centers will increase
We’ll continue to witness a tug of war between lawmakers and corporations over two key data center sticking points—sustainability (the land and energy needed to operate) and data sovereignty (where and how data is stored).
Some jurisdictions have already enacted outright moratoriums or temporary pauses on new data center construction, while others—including London, Amsterdam, and Singapore—have imposed restrictions, and protests regularly make the news.
On one hand, latency demands and data sovereignty laws will push data centers into more localized areas. On the other, despite their necessary role in keeping data from being shared and processed abroad, data centers are likely to face stiff resistance from elected officials and distrust from local communities. Of course, an aggravating factor in this debate is and will continue to be AI, particularly as more enterprises endeavor to develop and deploy the technology within their operational processes while maintaining heightened awareness about the integrity of the data used to train their models.
What will undoubtedly challenge this space further are the inevitable AI regulations that continue to build momentum—U.S. President Biden's recent executive order, the EU's AI Act, and efforts across other local, regional, national, and transnational jurisdictions. Right now, enterprises and data center operators remain firmly in a holding pattern as to how these proposed guardrails will shake out and what their impact will be on AI exploration, development, and deployment.
The silver bullet: efficiency is the only way forward
In the broader technology space, it's rare that we stumble across a silver bullet—a perfect, bespoke solution—but for the myriad hurdles data centers face in 2024, all pathways appear to lead to a single, intrinsic answer: efficiency.
On the labor front, more efficient data centers—from a design, power usage, and power density perspective—mean fewer facilities are needed, resulting in less construction and maintenance burden. More efficiently designed "plug-and-play" infrastructure products and architectures are easier to install, which translates to less time spent—and less reliance on highly skilled labor.
From a sustainability standpoint, more efficient data centers that leverage a combination of renewable energy sources—wind, solar, geothermal, hydro/tidal, and even safe, reliable nuclear power—mean less reliance on potentially scarce local power sources. AI and machine learning further this efficiency by pinpointing and addressing power-intensive compute hotspots. Additionally, we'll see a shift from traditional cradle-to-grave to cradle-to-cradle product lifecycles to limit waste, in which regularly upgraded data center components, such as servers and switches, filter into a robust reuse/recycle market rather than going directly to landfill.
Finally, within the vast regulatory ecosystem, data center efficiency—characterized by minimized physical, carbon, and energy footprints—plus transparent data sovereignty compliance can help reduce social and political resistance to data center deployments. Physical efficiency, such as leveraging existing multi-tenant or colocation brick-and-mortar infrastructure, can reduce overhead and ease anxiety among local officials and communities who may oppose watching heavy machinery break ground on vast new standalone complexes.
As competing technology innovations, labor, sustainability, macroeconomic conditions, and regulatory pressures put the squeeze on hyperscalers, enterprises, and operators, an unrelenting global appetite for data must be balanced by an unwavering commitment to efficiency—throughout the location/discovery, construction/design, and operations/management phases—to power data center growth and expansion in 2024.
This article was first published in Data Center Knowledge.