AI shifts data centre focus from scale to connectivity
The IOWN Global Forum and DE-CIX have outlined distinct but complementary views on how artificial intelligence is changing the role and design of data centres. Both point to a shift away from a narrow focus on hyperscale build-outs.
The comments come as operators face rising demand for AI workloads and growing scrutiny over energy use and land consumption. Industry groups and infrastructure providers are exploring ways to distribute computing across smaller sites, connect them to high-speed networks, and tie them to renewable power.
Dr Masahisa Kawashima, IOWN Technology Director at NTT and chair of the Technology Working Group at the IOWN Global Forum, said the prevailing growth model for data centres is reaching structural limits. The Forum brings together more than 180 organisations, including global cloud, chip and consulting firms, to develop photonics-based network infrastructure and next-generation network designs.
He said the current build-out around very large campuses has delivered rapid gains in compute for AI model training, but warned that the same approach cannot scale indefinitely amid power constraints, planning pressures and geopolitical risk.
"With the emergence of AI, the industry has increasingly embraced a "bigger is better" philosophy, racing to build ever-larger data centres in pursuit of scale. While this size-driven approach has delivered rapid gains in computational capacity, it is unlikely to be sustainable indefinitely. Physical constraints, power availability, environmental impact, and geopolitical considerations will inevitably impose limits on how far this model can scale," said Dr Kawashima.
As an alternative, Kawashima proposed a mesh of smaller facilities matched to local energy profiles and aggregated through high-bandwidth, low-latency links. In that model, the key asset is the network architecture rather than the size of any one site.
He argued that the sector must move towards distributed designs that treat many sites as a single logical pool of compute. Operators would link these facilities to renewable generation in each region and orchestrate workloads across them through optical and other advanced network technologies.
That shift could reshape investment decisions for both cloud providers and enterprises running large AI systems. The emphasis would move from adding racks inside a single mega-campus to stitching together multiple mid-sized data centres.
"Looking ahead, the next phase of data centre evolution will be driven not by sheer scale, but by connectivity. The industry can no longer rely solely on a small number of hyperscale facilities to meet growing AI demands. Instead, we should design a distributed architecture composed of many data centres sized to operate efficiently on renewable energy. By interconnecting these sites through high-bandwidth, low-latency networks, we can aggregate their computing resources into a unified computing fabric," said Kawashima.
"This shift from scale supremacy to collective capability will be essential for building scalable, resilient, and energy-efficient AI infrastructure for the long term," he said.
His comments reflect growing concern among city authorities and grid operators about the cumulative impact of very large campuses on land use and high-voltage power grids. Several markets in North America and Europe have introduced moratoriums or tougher approval processes for new hyperscale projects in dense urban areas.
At the same time, telecoms operators and internet exchange providers are reporting new traffic patterns and latency requirements driven by AI inference. Those workloads often sit closer to end users or devices than the centralised training clusters in the largest data centres.
DE-CIX chief executive Ivo Ivanov said the AI wave is changing not just the scale of infrastructure but also its geography. He drew a sharp distinction between training, which favours concentrated compute, and inference, which often sits at the network edge.
Emerging AI applications in sectors such as transport, finance and industrial automation are shaping where data centre investment flows, he said. He argued that proximity and connectivity now matter as much as raw compute power when services must be delivered in near real time.
"AI was a processing challenge, now it's a geography challenge - and it's redrawing the data center map," said Ivanov.
"The AI boom used to be about building bigger, better data centers. That works for model training, where raw compute power is prioritised over connectivity and latency, but it doesn't work for inference, which is where the real value of AI is realised at edge deployments. AI inference demands near real-time responsiveness and, as such, it can't tolerate the latency associated with long round-trip delays to remote data center hubs."
"From driverless vehicles to real-time fraud detection, the deployment of AI in edge locations demands a far more distributed infrastructure environment than what we're currently used to, because while we may be able to tolerate a delay while streaming on Netflix, we can't afford the same delay when we're relying on in-vehicle traffic monitoring to keep us safe on the roads," said Ivanov.
He said the main pressure points are now in the network fabric, not only in server farms. AI systems rely on data moving constantly between locations, and performance degrades quickly when that movement slows.
"This is changing where the real pressure points sit when it comes to networking. AI depends on data moving constantly between locations, and if that movement slows down, performance drops off quickly, no matter how much compute you add into the equation. This is now the main driver of the data center boom we're seeing in the US. It's no longer about building bigger or better data centers, but about where they're deployed and how they're connected."
"In other words, geography is becoming just as important as power and compute. In the coming years, that's going to stretch the definition of data centers - from hyperconnected edge deployments and emerging AI-focused data center hubs, to more experimental concepts that sit above the clouds in the Earth's orbit. The data center map is being redrawn in real time," said Ivanov.
Taken together, the two views point to a more fragmented but tightly networked infrastructure layer for AI. For operators and investors, the focus is shifting from headline megawatt counts at single sites to the topology, latency and energy mix of interconnected clusters across regions.