Australia and New Zealand’s data centre market is entering a more operational phase of AI deployment, as organisations shift their focus from experimentation to production-scale use, according to Equinix Australia managing director Guy Danskine. He said Australia is seeing “an unprecedented level of interest in AI”, driven by both enterprise demand and government policy, but that the infrastructure conversation is becoming more pragmatic.
The Australian government’s National AI Plan, he said, has reinforced the role of data centres “as the backbone of AI infrastructure”, particularly as performance, scalability and security requirements intensify.
While much of the early attention has been on model training, Danskine argued that the next phase will be defined by inference. “Much of the AI conversation to date has focused on the enormous computational demands of training huge models,” he said. “That focus is now shifting.” As organisations look to extract tangible value from their AI investments, inference – running trained models to deliver results in real time – is becoming the priority.
This shift places different demands on infrastructure. Danskine said inference workloads require “low latency, proximity to users and secure access to data”, which in turn is influencing where capacity is built and how facilities are interconnected. He pointed to recent moves to deploy inference platforms closer to customers as an early indication of this trend, adding that “inference-optimised infrastructure will be central to scaling the capability of AI” by 2026.
Sovereignty turns strategic
Data sovereignty is another area where Danskine expects more strategic decision-making. He said sovereignty has “moved beyond compliance to become a strategic priority” as data increasingly underpins business value. Organisations are under pressure to maintain control over where data is stored and processed, even as they continue to operate across borders.
“The emerging challenge is balancing local data control with global insight,” Danskine said. He expects hybrid architectures to become the dominant approach, allowing sensitive data to remain onshore while still enabling collaboration and learning internationally. Done well, he said, this model can “turn sovereignty from a constraint into a competitive advantage”.
Agentic AI rises
Looking further ahead, Danskine highlighted agentic AI as a potential inflection point for networks and interconnection. Unlike current AI deployments, agentic systems are designed to act autonomously and coordinate with other AI agents. “These multi-agent systems will place unprecedented strain on networks,” he said, due to the need for continuous, real-time communication.
To support this, organisations will need “highly interconnected, distributed environments capable of supporting constant data exchange”. In this context, Danskine argued that network resilience will become “as important as compute power itself”, particularly as agentic systems move from research into commercial use.
Cooling conundrum
Thermal management is also emerging as a defining constraint. As AI workloads drive higher rack densities, Danskine said the industry is approaching the limits of air cooling. “The high thermal output of modern GPUs and processors is pushing air-cooling approaches to their practical limits,” he said, predicting that “by 2026, liquid cooling will become the de facto standard for new high-density AI deployments”.
Beyond performance, he said liquid cooling offers efficiency and sustainability benefits, enabling greater compute density while reducing energy consumption – factors that are becoming increasingly important as operators scale AI infrastructure.
One size no longer fits all
Finally, Danskine expects the dominance of general-purpose large language models to erode in favour of more specialised systems. “The era of one-size-fits-all large language models is giving way to more specialised approaches,” he said, with organisations increasingly deploying “verticalised” models tailored to specific industries or use cases.
These models, he added, require flexible infrastructure that can support both training and inference, often close to end users and proprietary data. As specialised AI becomes mainstream, Danskine said infrastructure choices will play “a decisive role in determining how effectively organisations can turn models into measurable business outcomes”.