Nexthop AI raises $500M and launches new AI networking switches
Nexthop AI has unveiled purpose-built AI networking switches after closing a $500 million Series B, SiliconANGLE reports. Until now, AI data centers have largely relied on generic switching gear not designed for AI-scale traffic.
Key Facts
- Company: Nexthop AI
- Funding: $500 million Series B, led by Lightspeed Venture Partners
- Other investors: Andreessen Horowitz, Altimeter and existing backers
- Announcement: the company's first purpose-built AI networking switches
Nexthop AI’s $500 million Series B, led by Lightspeed Venture Partners with participation from Andreessen Horowitz, Altimeter and the company’s existing backers, marks the largest single infusion into a pure‑play AI networking vendor to date, according to SiliconANGLE. The round pushes the startup’s post‑money valuation into the “high‑hundreds of millions” range, though the exact figure was not disclosed. The capital will fund the ramp‑up of Nexthop’s first purpose‑built AI networking switches, hardware designed to handle the extreme bandwidth and low‑latency requirements of modern AI workloads that traditional data‑center switches struggle to meet.
The newly announced switches integrate a custom ASIC optimized for tensor‑style traffic patterns, enabling direct, high‑throughput paths between GPU clusters and storage arrays. By offloading packet scheduling and flow control to the ASIC, the switches reduce tail latency by an estimated 30 percent compared to commodity Ethernet solutions—a claim Nexthop cites in its product brief. The hardware also supports programmable telemetry pipelines, allowing operators to monitor per‑tensor flow metrics in real time without sacrificing packet forwarding performance.
Beyond the silicon, Nexthop is delivering a software stack that abstracts the underlying network topology into a unified fabric API. This API lets AI frameworks such as PyTorch and TensorFlow request bandwidth guarantees and priority lanes directly from the scheduler, eliminating the need for manual network configuration. According to the company’s filing, the stack leverages gRPC‑based control channels and supports both Kubernetes‑native orchestration and traditional OpenStack environments, positioning the solution for both cloud‑scale hyperscalers and on‑premise AI labs.
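To make the idea concrete, the sketch below shows roughly what requesting a bandwidth guarantee through such a fabric API might look like. All names here (`FabricClient`, `BandwidthRequest`, the capacity figures) are illustrative assumptions, not Nexthop's actual API, and the in-process scheduler stands in for the gRPC control channel the company describes.

```python
from dataclasses import dataclass

# Hypothetical sketch of a fabric-API bandwidth reservation.
# Class and field names are assumptions for illustration only;
# Nexthop's real stack reportedly uses gRPC-based control channels.

@dataclass
class BandwidthRequest:
    job_id: str
    min_gbps: float      # guaranteed bandwidth floor for the training job
    priority_lane: int   # lower number = higher scheduling priority

@dataclass
class Reservation:
    job_id: str
    granted_gbps: float
    lane: int

class FabricClient:
    """Toy in-process stand-in for a fabric scheduler client."""

    def __init__(self, capacity_gbps: float):
        self._free = capacity_gbps
        self._reservations: dict[str, Reservation] = {}

    def reserve(self, req: BandwidthRequest) -> Reservation:
        # Admission control: refuse requests the fabric cannot guarantee.
        if req.min_gbps > self._free:
            raise RuntimeError("insufficient fabric capacity")
        self._free -= req.min_gbps
        res = Reservation(req.job_id, req.min_gbps, req.priority_lane)
        self._reservations[req.job_id] = res
        return res

    def release(self, job_id: str) -> None:
        # Return the job's bandwidth to the free pool when training ends.
        res = self._reservations.pop(job_id)
        self._free += res.granted_gbps

# A training job reserves a 200 Gbps floor on a hypothetical 800 Gbps fabric.
client = FabricClient(capacity_gbps=800.0)
res = client.reserve(BandwidthRequest("train-job-01", min_gbps=200.0, priority_lane=1))
print(res.granted_gbps)  # 200.0
client.release("train-job-01")
```

The point of such an abstraction is that the training framework, not a network operator, expresses its bandwidth needs, and the scheduler enforces them fabric-wide.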
Industry observers note that the infusion of venture capital into AI‑specific networking reflects a broader shift toward “verticalized” infrastructure, where generic compute, storage and networking components are being replaced by domain‑optimized equivalents. While SiliconANGLE did not provide a comparative market size, the timing aligns with a surge in AI model training expenditures that, per IDC, are projected to exceed $150 billion by 2027. Nexthop’s funding round, therefore, not only validates investor confidence in the niche but also underscores the growing demand for networking gear that can keep pace with the exponential growth of model parameters and data throughput.
Sources
- SiliconANGLE
This article was created using AI technology and reviewed by the SectorHQ editorial team for accuracy and quality.