Koen ter Linde
What a difference a year makes. Last year we noted that the exponential growth in demand for AI computing in data centers would force more efficient processes, faster builds, and more creative problem-solving to cope with the persistent shortage of top IT talent. That has certainly proven true – in fact, truer than anyone expected.
According to a May 2024 outlook published by Goldman Sachs, AI deployments are expected to drive up to a 160% increase in data center energy demand, underscoring the urgency of managing this growth as the race for resources accelerates.
The IEA estimated that, in 2022, data centers consumed 460 TWh of electricity worldwide, which is about 2% of all power generated, and this figure is expected to double by 2026. The reasons are clear: AI implementations require much more computing power than other forms of processing, as power-hungry GPUs work to meet the growing demand.
In 2024, the need for more efficient strategies became apparent. In 2025, we will see how those strategies are put into practice. There are already big moves and ambitious plans on the table, changes in data center construction that will take cloud computing to the next level.
The drivers of AI: Big computing gets small
The spread of AI applications into every facet of personal and professional life has been impressive. I can only compare it to the early days of the World Wide Web, our first introduction to the global Internet in the 1990s. At first it was a curiosity, greeted with great fanfare, but in record time it became an integral part of modern life. The telephone reportedly took 50 years after its invention to become a common household fixture; the Internet took about 20 years; AI seems poised to do the same in a fraction of that time as it quickly finds new applications in the enterprise space – and the vast majority of them will be supported by data centers.
The number of innovative business uses of AI is skyrocketing – we've barely scratched the surface of AI's impact on commerce, science, and society itself. Ironically, the biggest innovation in decades is making its influence felt in smaller and smaller ways across the business space; for example, in the construction sector.
Construction is booming
Big names in tech are building like never before, pushing capital expenditures well above their 10-year averages as the gold-rush-like race for AI computing gains momentum. It's not just AI technology that's evolving, but also the delivery model: AI-as-a-Service is paving the way for businesses to embrace AI capabilities, particularly generative AI that can perform multiple functions, from customer service to long-term financial planning.
In fact, data centers themselves are increasingly turning to GenAI to address the persistent shortage of skilled IT employees, using AI to monitor, manage, and support lean IT teams so they can be more productive. With an intuitive way to ask questions and receive recommendations, a less experienced IT team can punch above its weight and relieve some of the staffing pressure data centers face.
Amid all this construction, secure access to sufficient energy remains a challenge. Data centers consume a growing percentage of the energy generated worldwide, and the trend will continue for the foreseeable future, with data centers accounting for up to 44% of the increase in electricity demand through 2028, according to Bain & Company in a recent Utility Dive report.
The lack of spare power capacity in most areas is driving the construction of new data centers in new and sometimes unexpected locations, to ensure proximity to affordable generation sources, or the leasing of dedicated grid power to guarantee supply. And we've all seen the stories of data centers that have recently adopted dedicated nuclear generation to support their growth. Expect to see even more of this in 2025 and beyond.
The choice of nuclear energy is logical: it is stable, scalable, and relatively sustainable compared to fossil fuel-based sources. At the same time, data centers are doing everything they can to reduce energy consumption – as a matter of both economic and environmental responsibility – by installing liquid cooling systems in place of less efficient forced air. As GPU-powered AI computing scales up, these efficiencies will become more apparent, as will the benefit of increased network uptime, since excessive heat is a major culprit in outages and premature component failure.
Reducing infrastructure profile
Hand in hand with power and cooling needs, the fiber infrastructure in AI computing facilities continues to grow denser. The GPUs in AI arrays must be fully networked – each GPU must be able to communicate with all the others – which increases complexity by an order of magnitude and complicates cooling. To handle the sheer volume of connections required, data centers will turn to high-density fiber systems, packing more fibers and connectors into the existing space to power their AI networks.
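The scaling pressure behind that density can be seen in a quick back-of-the-envelope sketch (a hypothetical illustration, not from the article; real AI fabrics typically reach any-to-any connectivity through switched leaf-spine topologies rather than literal point-to-point meshes): the number of pairwise GPU connections grows quadratically with cluster size.

```python
# Hypothetical illustration: in a full mesh, every GPU pairs with every
# other GPU, so the link count grows quadratically -- n * (n - 1) / 2.

def full_mesh_links(n_gpus: int) -> int:
    """Point-to-point links needed if every GPU connects to every other."""
    return n_gpus * (n_gpus - 1) // 2

for n in (8, 64, 512):
    print(f"{n:>4} GPUs -> {full_mesh_links(n):,} links")
# 8 GPUs need 28 links; 64 need 2,016; 512 need 130,816
```

Even with switched fabrics absorbing much of that growth, per-rack port and fiber counts rise steeply as clusters scale, which is precisely what drives the move to high-density fiber systems.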
By forcing more compute resources into fewer racks, data centers can reduce power consumption and simplify cooling needs as well. In addition, as hyperscale data centers migrate from 2x400G (800G aggregated) to native 800G, this advanced fiber infrastructure will provide much-needed transit capacity to accommodate the demand that is yet to come.
Multi-tenant data centers: standardization and flexibility
I've spent a lot of time on the largest hyperscale data centers and their AI-as-a-Service model as it pertains to enterprises, but there's another important part of the business to consider in 2025: how multi-tenant data centers (MTDCs) will forge a path forward for their enterprise customers. Whatever their vertical, enterprises' needs are changing rapidly, and MTDCs must remain flexible enough to adapt.
A standardized approach to denser fiber infrastructure is key here as well, as it reduces demands on IT staff and simplifies configuration changes. Several major fiber infrastructure manufacturers are launching or improving simpler, more plug-and-play technologies to help all data centers – but especially MTDCs – flatten the skills curve needed to stay agile and responsive while maintaining SLAs, even with smaller IT teams.
2025 will be 2024, only even better
The fundamental changes that are coming for data centers at the dawn of the AI era will be truly remarkable. From location to scale, both hyperscale and MTDC data centers will need to increase their fiber capacities while reducing their physical profile, adopt new cooling technologies, and rethink how they purchase and use electrical power. Unfortunately, there's no end in sight to the current shortage of highly skilled IT experts, but AI itself is already demonstrating how it can help operators fill those gaps with GenAI-powered monitoring and management.
As AI continues to make its way into the enterprise space, data centers will be called upon to supply the massive computing needed to turn promises into practical business benefits. Like AI, data centers will innovate and adapt to meet changing needs and deliver the optimal solutions needed by this rapidly growing sector.
Koen ter Linde is SVP & president, Connectivity and Cable Solutions at CommScope.