Research shared by the UK data centre operator Pulsant shows that organisations running artificial intelligence systems are facing mounting pressure over where those systems are hosted. Banks, hospitals and manufacturers all report higher costs and heavier infrastructure demands as AI use grows.
Analysts quoted by Pulsant forecast that global data centre demand could reach between 171 and 219 gigawatts by 2030, compared with about 60 gigawatts today. AI-ready facilities could account for around 70% of that demand, according to the same analyst estimates.
AI systems use far more electricity than everyday business software. Training models requires dedicated graphics processors running continuously for weeks. Large data sets move constantly across networks, raising energy use and cooling needs.
Even after training ends, live AI systems rely on consistent processing speed. Delays can affect fraud checks, medical analysis or production forecasts. These pressures have made infrastructure decisions a board-level issue rather than a technical detail.
Is The Public Cloud Losing Its Appeal For Long Term AI Use?
Public cloud platforms remain popular at the start of AI projects. Teams can access powerful processors quickly and test different model designs without buying equipment. This suits trials and short term work.
Problems begin once AI systems have to run day and night. Renting advanced processors for long periods drives spending sharply higher. Data leaving cloud platforms also triggers egress charges, which inflate monthly bills. Pulsant notes that finance teams often react once these costs come into view.
Capacity has also become an issue because demand for high-end processors has at times exceeded supply on shared cloud platforms. This has left organisations waiting for resources during busy periods.
Banks, insurers and healthcare groups handle sensitive records. Cloud platforms meet strong digital security standards, but data location and audit control sit outside the customer’s direct control, which unsettles compliance teams.
What Makes Colocation More Attractive For AI Systems?
Colocation allows organisations to install their own hardware in specialist data centres built for heavy power draw and advanced cooling. These facilities keep dense processor clusters running around the clock.
Pulsant cited survey findings that explain this move: IT leaders ranked high-density power and cooling as the top requirement for hosting AI workloads (54%), followed by direct links to public cloud platforms (51%) and support for high-performance computing infrastructure (49%).
This setup allows demanding training work to run on stable equipment while retaining cloud access when extra capacity is needed. Connectivity between private systems and public platforms plays a big role in that model.
Cost control also differs: cloud hosting avoids upfront purchases, but long-running AI workloads create unpredictable monthly charges. Colocation requires early investment, but operating costs stay steady, which finance teams can map more easily.
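The trade-off between variable cloud rental and fixed colocation costs can be illustrated with a simple break-even calculation. All figures below are hypothetical assumptions for illustration only, not Pulsant data:

```python
# Hypothetical break-even sketch: ongoing cloud rental vs. colocation ownership.
# All figures are illustrative assumptions, not survey or pricing data.

def months_to_break_even(cloud_monthly: float,
                         colo_upfront: float,
                         colo_monthly: float) -> float:
    """Months after which cumulative colocation cost falls below cloud cost."""
    if cloud_monthly <= colo_monthly:
        return float("inf")  # cloud never becomes the more expensive option
    return colo_upfront / (cloud_monthly - colo_monthly)

# Example: renting GPU capacity at 30k/month vs. buying hardware for 250k
# upfront and hosting it at 5k/month (power, cooling, rack space).
print(months_to_break_even(30_000, 250_000, 5_000))  # 10.0 months
```

The point the arithmetic makes is the one in the paragraph above: the cloud figure fluctuates with usage and egress, while the colocation figure is a steady line that finance teams can plan around.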
Stephen Spittal, Technology Director at Pulsant, says: “AI puts far more strain on infrastructure than traditional IT. Once you move past the pilot stage, the demand for power, cooling, and connectivity is constant.

“Colocation gives organisations the capacity to run those workloads without interruption in sites specifically designed to be efficient – and the confidence that performance will hold up as projects grow.

“AI is moving to the Edge – inference needs to move closer to consumers. Our latest research indicates 87% of UK businesses plan to migrate partially or fully from public cloud in the next two years.”