Schneider Electric's Steven Carlini on AI Workloads and the Future of Data Centers
Artificial intelligence is changing the data center industry faster than anyone anticipated. Every new wave of AI hardware pushes power, density, and cooling requirements to levels once thought impossible — and operators are scrambling to keep pace. In this episode of the Data Center Frontier Show, Schneider Electric’s Steven Carlini joins us to unpack what it really means to build infrastructure for the AI era.
Carlini explains how the conversation around density has shifted in just a year: “Last year, everyone was talking about the one-megawatt rack. Now densities are approaching 1.5 megawatts. It’s moving that fast, and the infrastructure has to keep up.” These rapid leaps in scale aren’t just about racks and GPUs. They represent a fundamental change in how data centers are designed, cooled, and powered.
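To put those densities in perspective, here is a rough back-of-envelope sketch of the coolant flow a single 1.5 MW rack would demand under direct liquid cooling. This is not from the episode; the heat-capture ratio and supply/return temperature delta below are illustrative assumptions:

```python
# Back-of-envelope: coolant flow needed to remove rack heat via direct liquid cooling.
# Illustrative only -- the heat-capture ratio and delta-T are assumptions,
# not figures from the episode.

RACK_POWER_W = 1_500_000   # ~1.5 MW rack density mentioned in the conversation
LIQUID_CAPTURE = 0.80      # assumed fraction of heat removed by liquid (rest to air)
CP_WATER = 4186            # J/(kg*K), specific heat of water
DELTA_T = 10               # K, assumed supply/return temperature difference
WATER_DENSITY = 1.0        # kg/L

heat_to_liquid_w = RACK_POWER_W * LIQUID_CAPTURE
mass_flow_kg_s = heat_to_liquid_w / (CP_WATER * DELTA_T)   # Q = m_dot * c_p * dT
volume_flow_l_min = mass_flow_kg_s / WATER_DENSITY * 60

print(f"Heat to liquid: {heat_to_liquid_w / 1e6:.2f} MW")
print(f"Required coolant flow: {mass_flow_kg_s:.1f} kg/s (~{volume_flow_l_min:,.0f} L/min)")
```

Even under these generous assumptions, a single rack needs coolant flow on the order of thousands of liters per minute, which is why the power and cooling plant has to evolve in lockstep with the hardware.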
The discussion dives into the new imperatives for AI-ready facilities:
Power planning that anticipates explosive growth in compute demand.
Liquid and hybrid cooling systems capable of handling extreme densities.
Modularity and prefabrication to shorten build times and adapt to shifting hardware generations.
Sustainability and responsible design that balance innovation with environmental impact.
Carlini emphasizes that operators can’t treat these as optional upgrades. Flexibility, efficiency, and sustainability are now prerequisites for competitiveness in the AI era.
Looking beyond hardware, Carlini highlights the diversity of AI workloads — from generative models to autonomous agents — that will drive future requirements. Each class of workload comes with different power and latency demands, and data center operators will need to build adaptable platforms to accommodate them.
At the Data Center Frontier Trends Summit last week, Carlini expanded on these themes, offering insights into how the industry can harness AI “for good” — designing infrastructure that supports innovation while aligning with global sustainability goals. His message was clear: the choices operators make now will shape not just business outcomes, but the broader environmental and social impact of the AI revolution.
This episode offers listeners a rare inside look at the technical, operational, and strategic forces shaping tomorrow’s data centers. Whether it’s retrofitting legacy facilities, deploying modular edge sites, or planning greenfield campuses, the challenge is the same: prepare for a future where compute density and power requirements continue to skyrocket.
If you want to understand how the world’s digital infrastructure is evolving to meet the demands of AI, this conversation with Steven Carlini is essential listening.