Amazon launches in‑house AI ASIC, fuels $110B OpenAI funding round, challenges Nvidia
Photo by Thibault Penin (unsplash.com/@thibaultpenin) on Unsplash
Amazon unveiled its own AI ASIC on Thursday, a move that helped drive a $110 billion funding round for OpenAI in which Amazon, Nvidia and SoftBank all invested, Reuters reports.
Quick Summary
- Amazon unveiled its own AI ASIC on Thursday, a move that helped drive a $110 billion funding round for OpenAI in which Amazon, Nvidia and SoftBank all invested, Reuters reports.
- Key company: Amazon
- Also mentioned: OpenAI, Nvidia
Amazon said its new custom AI ASIC will cut training costs for large‑scale models by up to 30%, according to a Reuters briefing. The chip, built in Amazon’s Austin labs, pairs Amazon‑designed matrix cores with a proprietary interconnect that the company claims delivers higher throughput than comparable Nvidia GPUs. Executives told Reuters the design targets both Amazon’s internal AI workloads and external customers via the AWS cloud, positioning the silicon as a direct alternative to Nvidia’s Hopper‑generation H100 and its successors.
The ASIC launch coincided with a $110 billion funding round for OpenAI, in which Amazon committed $50 billion, Nvidia contributed $30 billion, and SoftBank added $30 billion, Reuters reported. The capital infusion will fund OpenAI’s next generation of models and expand its compute infrastructure, with Amazon expected to integrate the new chips into its AWS AI services. The round is the largest funding round for an AI startup to date, dwarfing previous mega‑fundraises such as OpenAI’s $6.6 billion round in 2024.
Amazon’s chip strategy reflects a broader push to internalize AI hardware, which Reuters framed as part of a trend of cloud providers reinventing their own silicon. By designing its own processors, Amazon aims to reduce reliance on Nvidia’s pricing and supply constraints while offering customers a lower‑cost compute tier. The company said the ASIC will be available on AWS by Q4 2024, giving developers access to “cheaper, faster” training cycles without leaving the cloud ecosystem.
Nvidia, which remains the dominant supplier of AI GPUs, acknowledged its server partnership with Amazon but stopped short of commenting on competitive implications, Reuters noted. The collaboration suggests Nvidia’s technology will still underpin parts of Amazon’s offering, even as Amazon seeks to carve out its own hardware niche. Industry observers see the dual‑track approach as a hedge: Amazon can leverage Nvidia’s proven performance while differentiating with its own silicon for cost‑sensitive workloads.
Analysts cited by Reuters Breakingviews warned that the $110 billion OpenAI round could reshape the AI market’s power dynamics. With Amazon now a major shareholder and hardware provider, the company could lock in a pipeline of OpenAI services on AWS, potentially sidelining rivals that lack comparable cloud‑chip integration. The funding also gives OpenAI the resources to accelerate research, which may intensify competition for talent and compute across the sector.
The rollout underscores Amazon’s ambition to become a full‑stack AI player—from custom chips to cloud services and foundational models. If the ASIC delivers the promised cost savings, it could pressure Nvidia’s pricing and accelerate the shift toward diversified AI hardware ecosystems, Reuters said. The next few quarters will reveal whether Amazon’s silicon gamble can translate into market share gains against the entrenched GPU leader.
This article was created using AI technology and reviewed by the SectorHQ editorial team for accuracy and quality.