AI data centers are burning through the grid. Photonics stops it.
Gartner projects that by 2030, global data centers will consume 980 terawatt-hours of electricity annually, more than Japan's total electricity consumption. AI-optimized servers alone are projected to account for 432 TWh, a 4.6× increase over 2025. In the United States, data centers could draw 6.7% to 12% of total national electricity consumption.
This is not a forecasting exercise. Nuclear plants are being commissioned, and in some cases restarted, explicitly to power data centers. Tech companies are signing power purchase agreements at unprecedented scale. Google, Microsoft, and Amazon each plan to consume more power than many small countries. The bottleneck is no longer compute. The bottleneck is watts.
The electron is the problem
Every electron that moves through a wire sheds waste heat through resistive dissipation. Every transistor that switches dissipates the energy spent charging and discharging its capacitance. At the 5nm node, power density can exceed 100 watts per square centimeter. An NVIDIA H100 GPU draws up to 700W, so a single eight-GPU server draws 5.6kW, several times the average American household's continuous consumption.
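A back-of-envelope check makes the comparison concrete; the household figure below is an assumption in the ballpark of EIA's reported US average of roughly 10,500 kWh/year:

```python
# Back-of-envelope: one eight-GPU server vs. average household power draw.
GPU_TDP_W = 700                   # NVIDIA's published H100 SXM TDP
GPUS_PER_SERVER = 8
HOUSEHOLD_KWH_PER_YEAR = 10_500   # assumption: EIA-ballpark US average

server_w = GPU_TDP_W * GPUS_PER_SERVER                        # 5,600 W
household_avg_w = HOUSEHOLD_KWH_PER_YEAR * 1000 / (365 * 24)  # ~1,200 W

print(f"8-GPU server:            {server_w:,} W")
print(f"Average household draw:  {household_avg_w:,.0f} W")
print(f"One server ~= {server_w / household_avg_w:.1f} households")
```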
Cooling can account for as much as 40% of a data center's total power budget. As AI models grow (GPT-5, GPT-6, multimodal reasoning engines), power requirements grow super-linearly. The industry is building the world's most expensive hairdryers.
Photons don't generate heat
This is not a metaphor. It is physics. Photons traveling through a waveguide generate no resistive heat and move no charge. Interference, a native property of wave propagation, performs the computation directly, matrix-vector multiplication and Fourier transforms in particular. The linear transformation itself dissipates essentially no energy; power is spent only at the electro-optic boundaries, in the lasers, modulators, and detectors.
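To see how interference does the multiplying, here is a minimal numpy sketch of a Mach-Zehnder interferometer (MZI), the standard photonic building block: two 50/50 couplers plus programmable phase shifters apply a 2×2 unitary to the optical amplitudes, and meshes of MZIs compose into larger matrices (the Reck/Clements constructions). The parameter values are illustrative, not any vendor's design.

```python
import numpy as np

# 50/50 directional coupler acting on two waveguide modes.
BS = (1 / np.sqrt(2)) * np.array([[1, 1j],
                                  [1j, 1]])

def phase_shift(phi):
    """Thermo- or electro-optic phase shifter on the top waveguide."""
    return np.array([[np.exp(1j * phi), 0],
                     [0, 1]])

def mzi(theta, phi):
    """Mach-Zehnder interferometer: coupler, phase, coupler, phase."""
    return phase_shift(phi) @ BS @ phase_shift(theta) @ BS

x = np.array([0.8, 0.6], dtype=complex)   # input optical amplitudes
U = mzi(theta=0.7, phi=1.3)               # programmed by the phase settings
y = U @ x                                 # interference performs the multiply

# Lossless: the transfer matrix is unitary, so the linear transformation
# itself dissipates no optical power.
assert np.allclose(U.conj().T @ U, np.eye(2))
print(np.abs(y) ** 2)                     # intensities a detector would read
```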
Photonic systems consume 50–80% less power than equivalent electronic systems. Co-packaged optics cut system power by 25–30% versus pluggable transceivers. Direct chip-to-chip optical links eliminate the SerDes (serializer/deserializer) circuits that consume up to 50% of total ASIC power. And photonic interconnects can deliver up to 100× the bandwidth density of copper.
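A quick sketch puts the interconnect arithmetic in perspective. The energies per bit below are assumed round numbers, not measurements: electrical SerDes links are commonly cited in the single-digit pJ/bit range, while optical I/O targets roughly 1 pJ/bit; the 51.2 Tbps figure is typical of a current-generation switch ASIC.

```python
# Illustrative I/O power arithmetic. The pJ/bit values are ASSUMED.
ELECTRICAL_PJ_PER_BIT = 5.0   # assumption: electrical SerDes link
OPTICAL_PJ_PER_BIT = 1.0      # assumption: co-packaged optical link

AGGREGATE_TBPS = 51.2         # assumption: current switch-ASIC bandwidth
bits_per_second = AGGREGATE_TBPS * 1e12

for name, pj_per_bit in [("electrical SerDes", ELECTRICAL_PJ_PER_BIT),
                         ("co-packaged optics", OPTICAL_PJ_PER_BIT)]:
    watts = bits_per_second * pj_per_bit * 1e-12
    print(f"{name:>18}: {watts:,.0f} W of I/O power")
```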
ODR makes photonic AI production-grade
The reason photonic AI accelerators haven't reached production isn't power — it's accuracy. As photonic neural networks scale, phase errors accumulate and destroy signal fidelity. Published research documents up to 84% accuracy loss in deep photonic networks. ODR solves this by continuously restoring phase coherence between processing layers, making deep photonic networks viable for the first time.
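QLT hasn't published ODR's internals here, so the sketch below is only a toy model of the failure mode it targets: small random phase errors applied after every layer of an otherwise lossless network compound with depth, while an idealized per-layer calibration (a stand-in for inter-layer phase restoration, not ODR itself) keeps the state coherent.

```python
import numpy as np

rng = np.random.default_rng(7)

def random_unitary(n):
    """Random lossless layer: a stand-in for a programmed interferometer mesh."""
    z = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))  # fix QR phase ambiguity

n, depth, sigma = 16, 50, 0.15   # modes, layers, per-mode phase error (radians)
x = rng.normal(size=n) + 0j
x /= np.linalg.norm(x)

ideal, drifted, restored = x.copy(), x.copy(), x.copy()
for _ in range(depth):
    U = random_unitary(n)
    err = np.exp(1j * rng.normal(0.0, sigma, size=n))  # thermal/fabrication drift
    ideal = U @ ideal
    drifted = err * (U @ drifted)        # errors compound layer over layer
    # Idealized restoration: measure and unwind the per-mode phase offset
    # before the next layer (a toy calibration, NOT QLT's actual ODR).
    restored = np.conj(err) * (err * (U @ restored))

def fidelity(a, b):
    return abs(np.vdot(a, b)) ** 2

print(f"fidelity after {depth} layers, uncorrected: {fidelity(ideal, drifted):.3f}")
print(f"fidelity after {depth} layers, restored:    {fidelity(ideal, restored):.3f}")
```

Uncorrected fidelity decays with depth; the restored path stays at 1.0 by construction, which is the qualitative point: deep photonic networks need phase coherence maintained between layers, not just set once at calibration time.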
The math is simple: if photonic computing cuts data center energy consumption by 50%, and data centers consume 980 TWh in 2030, that's 490 TWh saved annually, worth $50–100B per year at industrial electricity rates of roughly $0.10–0.20/kWh. QLT's ODR processor isn't just a compute improvement. It's a sustainability technology.
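The arithmetic, with the implied rates spelled out as assumptions:

```python
# The headline savings arithmetic. Electricity rates are assumptions.
TOTAL_TWH_2030 = 980        # Gartner projection cited above
SAVINGS_FRACTION = 0.50     # the 50% reduction scenario

saved_twh = TOTAL_TWH_2030 * SAVINGS_FRACTION   # 490 TWh/year
saved_kwh = saved_twh * 1e9                     # 1 TWh = 1e9 kWh

for rate_usd_per_kwh in (0.10, 0.20):           # assumed industrial rates
    billions = saved_kwh * rate_usd_per_kwh / 1e9
    print(f"${rate_usd_per_kwh:.2f}/kWh -> ${billions:.0f}B/year")
# -> $49B and $98B per year: the $50-100B range.
```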
The world can't build nuclear power plants fast enough to feed AI's appetite. But it can replace the electron with a photon.