Alibaba admits its 470,000 AI chips are inferior, sparking industry scrutiny
While Alibaba touted a home‑grown AI push, reality fell short: the Chinese giant disclosed on its Q3 2026 earnings call that its T‑Head unit has shipped 470,000 chips, which it now admits are inferior to rivals, according to The Register.
Alibaba’s cloud division is already a cash cow, but the company’s ambition to turn its in‑house silicon into a competitive advantage has hit a reality check. During the Q3 2026 earnings call, T‑Head chief Yongming Wu disclosed that 470,000 AI chips have left the factory since January, yet he openly admitted the silicon “still lags behind foreign counterparts” (The Register). The admission is a rare moment of candor from a Chinese tech titan that has long touted self‑reliance as a shield against U.S. export bans on advanced accelerators.
The 470,000 figure is modest next to the six million Blackwell GPUs Nvidia shipped last year, and even that comparison flatters Alibaba: the company’s flagship chip, the Pingtouge Zhenwu 810E, is a throttled version of a 2023‑vintage Hopper design and cannot match the raw performance of Nvidia’s or AMD’s latest accelerators (The Register). Wu emphasized that the answer isn’t raw speed but a “mutually optimized stack” that couples the 810E with Alibaba Cloud’s infrastructure and the company’s own Qwen large‑language model. In theory, tighter co‑design could squeeze out “cost‑effectiveness” and lower inference prices, a differentiator Wu believes will let Alibaba compete on value rather than raw horsepower.
That strategy hinges on a supply‑security narrative that resonates in Beijing’s current tech climate. Wu pointed to “the unique circumstances currently facing the AI industry in China” as the driver for building a guaranteed in‑house compute pipeline (The Register). By owning the silicon, Alibaba hopes to sidestep the risk of future export restrictions that could cripple its cloud AI services. The gamble is reflected in Alibaba Cloud’s recent pricing moves: the division hiked prices by up to 34 percent this quarter, citing rising hardware costs and surging AI demand (The Register). The price bump underscores how tightly the cloud’s profitability is tied to the cost structure of its home‑grown chips.
Financially, the chip push has yet to translate into headline‑grabbing growth. Alibaba reported quarterly revenue of $40.7 billion, a modest 2 percent rise, and noted that without recent divestitures the growth would have been closer to 9 percent (The Register). Meanwhile, Alibaba Cloud’s revenue surged 36 percent year‑over‑year to $6.2 billion, and the company projects “$100 billion of annual cloud and AI revenue within five years” (The Register). The lofty target rests on the assumption that the 810E‑plus‑Qwen stack can deliver “superior value for money” and capture a larger slice of the AI services market, even as competitors race ahead with more powerful GPUs.
Analysts are watching whether Alibaba will eventually spin off T‑Head to unlock valuation upside. Wu brushed off speculation about an imminent IPO, saying there is no “definitive timeline” for a float (The Register). For now, the chip business remains a strategic arm rather than a profit centre, a status that may change if the co‑design approach can demonstrably shrink inference costs. Until then, the 470,000 chips serve as both a milestone and a reminder that building a home‑grown AI accelerator is a marathon, not a sprint—especially when the finish line is already occupied by Nvidia, AMD, and a growing cadre of open‑source challengers.
Sources
The Register