Blog: The AC/DC debate is back – and AI is the reason
By Dr Vincent Zeng
For more than a century, alternating current has been the undisputed foundation of our electricity system. The so-called “war of the currents” was settled decisively in AC’s favour, and with good reason. It scaled. It travelled long distances efficiently. It made modern power grids possible.
For most of that time, the debate felt closed.
Today, it is very much open again — and AI is the catalyst forcing us to reopen it.
To understand why, it is worth revisiting why AC won in the first place. Thomas Edison’s original vision for electricity was a low-voltage DC system, with generation close to the load. His Pearl Street Power Station supplied nearby buildings in Manhattan directly, powering lighting and motors through a localised network. In many ways, it was a distributed model long before the term existed.
What ultimately decided the outcome was not ideology but economics and engineering maturity. The emergence of reliable, low-cost transformers allowed AC voltage to be stepped up for long-distance transmission and stepped down safely at the point of use. This dramatically reduced losses and made continent-scale power networks viable. DC simply could not compete at the time, because efficient voltage transformation was not technically or commercially practical. AC became the logical, scalable choice, and the grid we rely on today was built around that assumption.
The key word there is assumption.
AI has changed the operating conditions those assumptions were built on.
Modern compute hardware runs natively on DC. GPUs, accelerators and memory systems all require DC internally, yet most data centres still distribute AC throughout the facility. That means repeated conversion stages — AC to DC, DC back to AC, then AC back to DC again — each adding loss, heat, complexity and cost. At modest power levels, this inefficiency was tolerable. At AI scale, it is not.
We are now seeing single racks pushing towards 200 kilowatts, and facilities trending towards gigawatt-class demand. At these densities, AC architectures begin to struggle. Transformers grow bulky. Cabling becomes heavy and space-hungry. Conversion losses compound. Heat generation increases, and cooling systems are forced to work harder just to remove energy that never needed to be lost in the first place.
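The way these losses compound is easy to illustrate. Because per-stage efficiencies multiply, even a handful of conversions erodes the end-to-end figure quickly. The sketch below uses assumed, round-number per-stage efficiencies purely for illustration — they are not measurements from any specific facility — and applies them to a 200 kW rack, the figure mentioned above:

```python
# Illustrative comparison of end-to-end delivery efficiency for a
# conventional AC distribution chain versus a simplified DC chain.
# All per-stage efficiencies are assumed round numbers for illustration.

def chain_efficiency(stages):
    """Multiply per-stage efficiencies to get end-to-end efficiency."""
    eff = 1.0
    for e in stages:
        eff *= e
    return eff

# Assumed AC path: UPS rectifier, UPS inverter, server PSU, on-board DC-DC.
ac_stages = [0.97, 0.97, 0.94, 0.97]
# Assumed DC path: one facility-level rectifier, then on-board DC-DC.
dc_stages = [0.97, 0.97]

rack_kw = 200  # IT load per rack, matching the figure in the text

for name, stages in [("AC", ac_stages), ("DC", dc_stages)]:
    eff = chain_efficiency(stages)
    loss_kw = rack_kw * (1 / eff - 1)  # input power lost as heat per rack
    print(f"{name}: end-to-end efficiency {eff:.1%}, "
          f"~{loss_kw:.1f} kW of conversion loss per {rack_kw} kW rack")
```

Under these assumptions the AC chain lands in the mid-80s per cent and the DC chain in the mid-90s — tens of kilowatts per rack that must then be removed again by the cooling plant, which is exactly the compounding the text describes.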
This is the point at which AC stops scaling gracefully.
In parallel, the technological landscape has shifted in ways Edison could not have imagined, but might have appreciated. Advances in power electronics mean that DC voltage transformation is no longer difficult or inefficient. Solid-state converters can now perform many of the functions once exclusive to traditional transformers, with high efficiency and fine-grained control.
Generation has changed too. Renewables, batteries and energy storage systems are inherently DC technologies. Yet we continue to route them through DC–AC–DC conversion chains simply to interface with an AC grid. On the load side, almost everything — from LED lighting to battery chargers — already converts power locally. Data centres simply represent this trend at unprecedented scale.
When you aggregate enormous DC loads in one place, the economics shift. A DC distribution architecture simplifies power interfaces to IT equipment, removes unnecessary conversion stages, and enables higher power density with fewer and smaller passive components. It handles fast power swings more naturally — a critical capability for AI workloads — and allows tighter integration between compute, cooling and energy management.
These are not theoretical benefits. We are seeing growing alignment across AI platform providers, infrastructure designers and hyperscale operators that DC distribution is becoming unavoidable at the highest power densities. The conversation has moved from “if” to “when, where and how”.
There is also a persistent misconception worth addressing. Moving towards DC in data centres does not mean dismantling the AC grid or replaying a historical rivalry. This is not a zero-sum replacement. AC will remain essential for long-distance transmission and general distribution. DC networks will complement it, appearing where they make technical and economic sense.
In many ways, this represents a return to first principles rather than a radical departure. Distributed generation. Local optimisation. Power architectures designed around the load, not inherited constraints.
The AI era is not just changing how we compute. It is forcing us to rethink how we deliver energy to compute — efficiently, reliably and at scale. That is why a debate that once seemed settled is back on the table, and why DC is no longer a historical footnote, but a serious architectural contender for the future of AI infrastructure.