NATHAN TRACY OPTICAL STANDARDS
800G (with 1.6Tb in process) in transceivers enabling the direct integration of coherent optics into switches and routers. These interfaces help AI providers interconnect distributed compute clusters and storage resources with predictable performance.

CEI - HIGH-SPEED ELECTRICAL INTERFACES FOR AI SYSTEMS

Electrical interfaces remain foundational inside AI systems — connecting switching ASICs, accelerators, retimers, and optical modules across boards, packages, and short-reach interconnects. As AI workloads push aggregate bandwidth into the multi-terabit range, the performance of these electrical links becomes a determining factor for system scalability, power efficiency, and reliability. The Common Electrical I/O (CEI) specifications developed by OIF define standardised, high-speed electrical signalling parameters that enable interoperable communication between components, independent of vendor implementation. CEI IAs have been the cornerstone of high-speed networking, enabling the evolution of data rates from 6Gbps to 224Gbps, and now laying the groundwork for 448Gbps standards. To enable 448G electrical specifications, IAs will consider new, higher-performance twinax media and interconnects, such as cabled host co-packaged copper (CPC) interconnects mating directly onto switch and accelerator packages.

For AI infrastructure, where switches and accelerators must exchange massive volumes of data at higher density and extremely low latency, CEI plays a crucial role. High-speed SerDes links built to CEI specifications allow designers to scale bandwidth predictably while maintaining acceptable bit error rates and thermal envelopes. CEI ensures that the electrical foundation of AI systems can keep pace with the rapid expansion of compute and networking demands.

EEI - ENERGY EFFICIENT INTERFACES FOR SUSTAINABLE AI GROWTH

As AI infrastructure scales, power consumption has emerged as one of the most significant constraints on system growth.
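The scale of both pressures — bandwidth and power — comes down to simple multiplication across lane counts. A back-of-envelope sketch makes the point; the lane count, rate, and pJ/bit figures below are illustrative assumptions for a hypothetical switch, not values drawn from any OIF specification.

```python
# Back-of-envelope sketch of electrical-interface budgets at CEI-class rates.
# All figures are illustrative assumptions, not OIF-specified values.

def aggregate_bandwidth_tbps(lanes: int, gbps_per_lane: float) -> float:
    """Total raw electrical bandwidth across all SerDes lanes, in Tb/s."""
    return lanes * gbps_per_lane / 1000.0

def interface_power_watts(lanes: int, gbps_per_lane: float, pj_per_bit: float) -> float:
    """Power drawn by the electrical interfaces alone:
    (bits per second) x (energy per bit), with 1 pJ = 1e-12 J."""
    bits_per_second = lanes * gbps_per_lane * 1e9
    return bits_per_second * pj_per_bit * 1e-12

# Hypothetical switch: 512 lanes of 224Gbps CEI-style signalling.
print(aggregate_bandwidth_tbps(512, 224.0))      # roughly 114.7 Tb/s raw
# At an assumed 5 pJ/bit, the SerDes alone draw on the order of 570 W:
print(interface_power_watts(512, 224.0, 5.0))
```

At this scale, shaving even 1 pJ/bit removes over 100W per switch — which is exactly why energy efficiency has become a first-class interface design objective.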
Training large models and operating hyperscale AI clusters require enormous energy resources, making efficiency at every layer of the infrastructure critical. Recognising this challenge, OIF has focused on Energy Efficient Interfaces (EEI) to address the growing need for reduced power per bit without sacrificing performance. EEI refers to interface approaches and specifications that prioritise energy efficiency as a first-class design objective,
particularly for short-reach and high-density interconnects used in AI systems, rather than focusing solely on raw bandwidth. In AI environments, where thousands of links operate simultaneously, even small efficiency improvements at the interface level can translate into substantial reductions in total power draw and cooling requirements.

OIF’s work around EEI helps the industry converge on common expectations for how energy efficiency is measured, achieved, and balanced against performance. This shared framework enables vendors to innovate while still maintaining interoperability and predictable system behaviour. Key OIF EEI project milestones include establishing a framework to define user-driven, energy-efficient requirements for AI. The 112G Retimed Transmitter Linear Receiver (RTLR) IA balances the power benefits of Linear Pluggable Optics (LPO) with better signal integrity. By addressing energy efficiency at the interface level, EEI supports the long-term viability of AI infrastructure, ensuring that performance gains remain economically and environmentally sustainable.

CO-PACKAGING - BRINGING OPTICS CLOSE TO SILICON

Co-packaged optics is emerging as a key innovation to overcome the bandwidth and power limitations of traditional modular pluggable optics. Instead of optical modules connected via cables or pluggable cages on switch boards, optical transceivers are tightly integrated with the switching ASIC’s die or package. This drastically shortens electrical paths, reducing power, improving latency, and enabling higher aggregate bandwidth. OIF has been at the forefront of defining test criteria, architectural options, and interface points for co-packaged optics. This work helps vendors avoid reinventing basic interconnect layers and enables industry alignment on how such systems should communicate.
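The power rationale for shortening the electrical path can be sketched with rough arithmetic. The pJ/bit figures below are assumed ballpark values for a hypothetical switch, chosen only to illustrate the comparison — they are not drawn from any OIF specification or product datasheet.

```python
# Illustrative comparison: pluggable optics reached over long host traces
# versus optics co-packaged next to the switch die. Assumed figures only.

def link_power_watts(total_tbps: float, pj_per_bit: float) -> float:
    """Tb/s x pJ/bit -> watts (1 Tb/s = 1e12 b/s, 1 pJ = 1e-12 J)."""
    return total_tbps * 1e12 * pj_per_bit * 1e-12

switch_capacity_tbps = 51.2   # hypothetical 51.2T switching ASIC

# Assumed: long board trace plus retiming in the module.
pluggable_w = link_power_watts(switch_capacity_tbps, 15.0)
# Assumed: short in-package electrical reach, no full retimer.
cpo_w = link_power_watts(switch_capacity_tbps, 5.0)

print(pluggable_w, cpo_w, pluggable_w - cpo_w)  # hundreds of watts saved per switch
```

Under these assumptions the saving is several hundred watts per switch — multiplied across an AI cluster, that is the scale of opportunity driving co-packaging work.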
OIF’s initial Co-Packaging framework established the need for interoperability, defined key application spaces, and identified a need for External Laser Sources (ELS) to address thermal issues. Subsequent work resulted in IAs for 3.2Tbps CPO and CPC optical and electrical co-packaged modules and the External Laser Small Form Factor Pluggable (ELSFP). By creating a shared foundation for co-packaged optics, OIF accelerates ecosystem readiness and broader adoption in AI data centres.

CMIS - MANAGING COMPLEX, HIGH-SPEED OPTICAL MODULES

As optical modules become more sophisticated — supporting advanced diagnostics, multi-mode operation, and tunable parameters — there is a growing
need for a standardised management interface that systems can use to configure and monitor devices. The Common Management Interface Specification (CMIS) defines a common language for managing complex, high-speed optical modules, enabling greater interoperability, diagnostics, and firmware updates. OIF has evolved CMIS into a family of specifications for diverse applications, including Coherent-CMIS (C-CMIS). Without CMIS, each vendor might expose its own management interface, leading to complexity in management software, higher integration costs, and operational risk. CMIS, together with physical interface standards, ensures AI infrastructure maintains high uptime with predictable performance.

STANDARDS ARE THE BACKBONE OF AI INFRASTRUCTURE

As AI reshapes industries and society, the infrastructure that supports it must be reliable, scalable, and interoperable. Industry standards are pivotal to achieving this vision. From coherent optics that deliver high-capacity links, to CEI and EEI interfaces that enable high-speed signalling, to co-packaged optics that blur the lines between compute and networking, and CMIS that brings consistent management across devices, standards ensure that the AI ecosystem functions as a cohesive whole. Without these shared specifications, the AI revolution could devolve into a patchwork of proprietary approaches, stifling innovation and elevating cost. Instead, by leaning into industry collaboration, open specifications, and interoperability-focused engineering, the AI infrastructure stack can continue its rapid growth, driven not by silos, but by shared standards that benefit the entire ecosystem.
Nathan Tracy, Technologist, System Architecture Team at TE Connectivity; OIF President
www.opticalconnectionsnews.com
ISSUE 43 | Q1 2026