SEOUL, Jan. 14 (Korea Bizwire) — As artificial intelligence drives an unprecedented surge in computing demand, global data center operators are racing to redesign the infrastructure that powers the digital economy, with electricity, cooling and automation emerging as the defining challenges of the next phase.
In its annual “Vertiv Frontiers” report released this week, data center infrastructure provider Vertiv said the rapid expansion of AI and high-performance computing is pushing data centers into the gigawatt era, fundamentally reshaping how facilities are designed and operated.
Rising power density from AI workloads is straining conventional power systems that rely on a mix of alternating-current and direct-current distribution. That strain is prompting growing interest in high-voltage direct current (HVDC) architectures, which reduce energy loss and allow for more compact conductors.
The report also points to the accelerating shift toward distributed AI. While early investment focused on large, centralized facilities for general-purpose AI services, AI inference is increasingly expected to be tailored to individual enterprises.
That trend is driving demand for on-premises and hybrid AI environments, particularly in heavily regulated industries, and elevating the importance of flexible, high-density power systems and liquid-cooling technologies.
As AI models and graphics processing units grow more complex, speed of deployment has also become critical. Vertiv highlighted the growing role of digital twin–based design, which allows operators to simulate and optimize data center layouts before construction, significantly shortening deployment timelines.
A similar outlook was offered by Hewlett Packard Enterprise, which said data centers are on the cusp of becoming largely self-managing systems. As AI is embedded across operations, HPE predicts facilities will evolve into closed-loop environments capable of predicting failures and automatically optimizing performance.
HPE also expects the traditional model of centralized cloud facilities to give way to networks of micro data centers resembling small hyperscale campuses.
These sites would combine AI inference accelerators with autonomous management tools, bringing computing closer to where data is generated. In parallel, cybersecurity is set to become a built-in function rather than an add-on, with distributed AI engines continuously monitoring and responding to threats.
Together, the forecasts suggest that the data center of the AI era will be less a static warehouse of servers and more a dynamic, power-intensive, and highly intelligent system—one designed to scale quickly as demand for AI services continues to climb.
Kevin Lee (kevinlee@koreabizwire.com)