ICT Today Jan/Feb/Mar 2026

Reimagining Power at the Edge: Integrating Low-Voltage and Fault-Managed DC Power Architectures in AI-Driven Data Centers
By Bolis Ibrahim, Zenon Radewych, Anjanaa Santhanam

THE UNDERDOG STORY
DC power has been the perennial underdog in data center power distribution. On physics alone it should have been the current of choice: fewer conversions, less heat, and higher efficiency. Yet for decades, AC power has remained the entrenched champion. Why? Because a change in power architecture would require rewriting the equipment ecosystem and rewiring standards, and efficiency alone has been insufficient to overcome that institutional inertia. Now AI workloads are rewriting the rules of power density, and edge computing is scattering infrastructure across thousands of micro-sites. Suddenly, the traditional AC playbook is beginning to look like a rotary phone in a 5G world.

FIGURE 1: Electricity meter showing usage in kWh.

Edge data centers are smaller, decentralized facilities strategically located closer to end users, and they are proliferating to support low-latency applications from autonomous vehicles to real-time analytics. 1 The global edge data center market is projected to surge from $12.36 billion in 2024 to $109.91 billion by 2033, growing at 28.9 percent annually. 2 More than 50 percent of data is now generated outside traditional centralized data centers. 3 These edge sites range from micro data centers with just a few racks in urban office buildings and cell towers to regional facilities serving wider geographic areas. Here is the twist: edge facilities and 60 V solar and battery systems speak DC natively, while every AC conversion step wastes energy, with 10 to 15 percent of it consumed before power reaches the silicon. The industry is deploying distributed infrastructure at breakneck speed while simultaneously accepting massive efficiency penalties, because the alternative requires rethinking power distribution. A quiet revolution is happening now, not through marketing campaigns or vendor hype, but through the maturation of Class 2 low-voltage power, Class 4 fault-managed power, and other emerging 380 V and 800 V DC standards that solve the safety and compatibility barriers that stalled DC in data centers for years.
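
To see roughly where that 10 to 15 percent goes, the short sketch below multiplies per-stage efficiencies for a conventional AC distribution chain. The stage list and the numbers are illustrative assumptions chosen to land in the cited range, not figures from this article.

```python
# Cumulative efficiency of a conventional AC power chain, from the
# utility feed to the server's internal DC rails.
# The per-stage efficiencies below are illustrative assumptions.
stages = {
    "UPS (double conversion)": 0.95,
    "PDU / step-down transformer": 0.98,
    "server power supply (AC to DC)": 0.94,
}

efficiency = 1.0
for name, eta in stages.items():
    efficiency *= eta
    print(f"after {name:<31}: {efficiency:.1%}")

print(f"lost before reaching the silicon: {1 - efficiency:.1%}")
```

With these assumed values, roughly 12.5 percent of the energy is gone before it ever reaches a processor, which is consistent with the 10 to 15 percent conversion penalty described above.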

FIGURE 2: Meter reading for DC electrical current.

THE PHYSICS PROBLEM
Traditional data center racks drew 5 to 10 kW, but then AI changed everything. Modern AI racks gulp 50 to 100 kW, with some pushing past 200 kW. 4 At 48 V DC, a standard that has been in place for decades, a 100 kW rack would demand more than 2,000 amps of current, and copper bus bars at that scale become physically impractical. Hence the move toward 380 V and 800 V DC for backbone AI rack power.
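
To make the bus-bar arithmetic concrete, the minimal sketch below applies I = P / V to a 100 kW rack at the three DC voltages mentioned above. It is a simplification that ignores conversion losses, derating, and redundancy.

```python
# Current demanded by a single 100 kW rack at different DC bus voltages.
# Illustrative only: ignores conversion losses, derating, and redundancy.
RACK_POWER_W = 100_000  # 100 kW AI rack

for bus_voltage_v in (48, 380, 800):
    current_a = RACK_POWER_W / bus_voltage_v  # I = P / V
    print(f"{bus_voltage_v:>3} V DC -> {current_a:,.0f} A")
```

At 48 V the answer is roughly 2,083 A; moving to 380 V or 800 V cuts that to about 263 A or 125 A, which is why higher-voltage DC backbones make the conductors manageable.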

Meanwhile, edge facilities are popping up in office closets, cell tower base stations, and repurposed retail spaces. They need safe, scalable power without the complexity of conduit-heavy AC installations. And what about behind-the-meter solar and battery systems? They are DC-native, with solar arrays operating at up to 1,500 V. Yet in traditional data centers, this DC gets converted to AC for distribution, then converted back to DC at every server power supply. It is an efficiency nightmare hiding in plain sight. For facilities investing millions in behind-the-meter generation to escape three-to-seven-year grid upgrade timelines, accepting this waste feels increasingly illogical. Organizations now connect with 30 percent more business partners in twice as many locations, according to Equinix's Global Interconnection Index. 5 Each of these edge sites needs power infrastructure that can deploy quickly, operate safely in space-constrained environments, and scale incrementally. Traditional AC distribution, with its transformers, UPS systems, and circuit breakers, consumes precious space and demands expertise that may not exist in remote locations.
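
As a rough way to visualize that round trip, the sketch below compares an AC-coupled path (solar DC to inverter, AC distribution, then back to DC at the server power supply) with a shorter DC-coupled path. The per-stage efficiencies are assumed values chosen only to illustrate the stage-count argument, not measured data.

```python
# Back-of-envelope comparison of two behind-the-meter solar power paths.
# Per-stage efficiencies are assumptions for illustration only.
from math import prod

paths = {
    # solar DC -> inverter -> AC distribution/UPS -> server PSU (AC to DC)
    "AC-coupled": [0.97, 0.95, 0.94],
    # solar DC -> DC-DC converter -> DC bus -> rack-level DC-DC stage
    "DC-coupled": [0.98, 0.97],
}

for name, stage_efficiencies in paths.items():
    delivered = prod(stage_efficiencies)
    print(f"{name}: {delivered:.1%} of generated energy reaches the load")
```

Under these assumptions the DC-coupled path delivers roughly 95 percent of the generated energy versus about 87 percent for the AC-coupled path, simply because it removes conversion stages.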

The physics problem is not limited to just AI racks consuming more power. It is about infrastructure fragmenting across thousands of edge sites while power generation simultaneously shifts to DC-native behind-the-meter sources. The old AC architecture was optimized for a world of centralized facilities with predictable loads. That world is now disappearing faster than the industry can adapt.

WHY DC HAS NOT WON (YET)
The paradox is simple: DC is more efficient, yet adoption has stalled because of interconnected barriers that no single player could overcome alone.
Equipment compatibility: In a proverbial chicken-and-egg problem, operators will not deploy DC without DC-ready servers, and vendors will not build DC servers without customer demand. Every data center that deployed traditional AC reinforced the AC ecosystem, making the transition a significant challenge.
Safety perceptions and training: Safety perceptions have traditionally been a barrier. Many electrical engineers and contractors were initially wary of high-voltage DC fault behavior. Unlike AC, which naturally extinguishes arcs by crossing zero 120 times per second (at 60 Hz), DC maintains
