The connectivity demands presented by AI are increasing and evolving. Up to now, much AI traffic has come from hyperscalers training large language models (LLMs) in data centers. For this white paper, we worked with Heavy Reading to field a global survey of communication service providers (CSPs) to better understand how AI-associated traffic will impact the metro and long-haul networks they operate. The survey results clarify the challenges CSPs face as they innovate to meet demand and the opportunities presented by this transformative technology. CSPs expect AI inferencing to add incremental demand growth at data centers and net-new growth closer to the edge. They believe high-bandwidth wavelength services will become the dominant mode for AI connectivity as the market evolves from hyperscaler model training to model adoption by CSPs’ enterprise customers. Accordingly, CSPs must build business cases and prepare their networks for the upcoming boom in AI traffic.
