Nvidia quietly constructs multibillion‑dollar venture to rival its own chip empire
Photo by BoliviaInteligente (unsplash.com/@boliviainteligente) on Unsplash
Nvidia’s networking arm, born from the $7 billion acquisition of Mellanox in 2020, has exploded into an $11 billion quarterly revenue stream—up 267% year‑over‑year—according to the company’s most recent earnings release cited by TechCrunch. The division generated more than $31 billion over the full year, making it Nvidia’s second‑largest source of revenue behind its core compute business; the company’s original gaming segment, by comparison, is now roughly a third of the networking unit’s size. The rapid growth reflects a broader shift in data‑center architecture: AI workloads demand far more bandwidth and tighter coupling between GPUs, a need Nvidia addresses with its NVLink interconnect, its Spectrum‑X Ethernet platform, and the InfiniBand switches that power “in‑network computing” across racks.
Kevin Cook, senior equity strategist at Zacks Investment Research, emphasized the scale of the networking segment in a TechCrunch interview, noting that “the $11 billion for the quarter is greater than Cisco’s networking business, almost as big as the full‑year estimates.” That comparison underscores how Nvidia’s in‑house networking stack is not merely an add‑on but a full‑fledged platform that rivals the market leaders in a space traditionally dominated by Cisco and Arista. The company’s strategy, as described by senior vice president of networking Kevin Deierling, hinges on treating the data center as a single unit of compute rather than a collection of isolated nodes. “Jensen said the data center is the new unit of computing. Networking is a lot more than just moving the smaller amounts of data between a compute node, it’s actually a foundation,” Deierling told TechCrunch.
The business’s product portfolio is tightly integrated with Nvidia’s AI hardware roadmap. NVLink enables high‑speed GPU‑to‑GPU communication within a rack, while the co‑packaged optics switches reduce latency and power consumption compared with traditional top‑of‑rack designs. Spectrum‑X provides the Ethernet backbone that can sustain the terabit‑per‑second throughput required for training large language models, and the InfiniBand switches support the “AI factory” concept—a data‑center built from the ground up to accelerate model training and inference. By controlling both the compute and the connective tissue, Nvidia can offer customers a vertically integrated solution that promises lower total‑cost‑of‑ownership and tighter performance guarantees, a claim echoed in the company’s earnings narrative.
Despite its financial heft, the networking division receives far less public attention than Nvidia’s GPU business, a disparity Deierling attributes in part to the team’s low‑key marketing approach. “People think of networking as just, ‘I got a printer, and I need to connect to it,’” he said, suggesting that the lack of fanfare feeds on itself. Yet analysts see the segment as a strategic moat. By embedding its own high‑speed interconnects into the data‑center fabric, Nvidia reduces customers’ reliance on third‑party networking vendors and creates a sticky ecosystem that can drive future GPU sales. The synergy is evident in the company’s recent product announcements, where new AI accelerators are paired with upgraded InfiniBand and Spectrum‑X hardware, reinforcing the narrative that the data center is the new unit of computing.
Looking ahead, Nvidia appears poised to double down on the networking play. The company’s roadmap hints at further integration of AI‑specific switches and expanded co‑packaged optics, while the broader market signals continued demand for bandwidth‑rich infrastructure as AI models grow in size and complexity. If the current growth trajectory holds, the networking division could eclipse the traditional networking giants in annual revenue within a few years, effectively creating a multibillion‑dollar empire that rivals Nvidia’s own chip business—just as Jensen Huang envisioned when he first pushed the firm into AI‑specific silicon a decade ago.
Sources
Reporting based on verified sources and public filings. Sector HQ editorial standards require multi-source attribution.