
Broadcom Launches Nvidia-Rivaling Networking Chip as $10 Billion Customer Remains a Mystery

By LiDan, Oct 15, 2025, 2:19 a.m. ET

Broadcom's CEO said at an earnings call last month that the company had secured $10 billion in orders for its custom AI chips from a fourth customer. Given the major order, Broadcom expects its outlook for fiscal year 2026 to improve "significantly" from what it had indicated in the previous quarter.

AsianFin -- Broadcom Inc. is further challenging Nvidia Corporation in the artificial intelligence (AI) arms race.

AI Generated Image

Broadcom launched a new Nvidia-rivaling networking chip called Thor Ultra, which enables computing infrastructure operators to deploy far more chips than they otherwise could. Networking chips help data center operators move information around inside a facility.

Thor Ultra is the industry’s first 800G AI Ethernet Network Interface Card (NIC), capable of interconnecting hundreds of thousands of XPUs to drive trillion-parameter AI workloads. By adopting the open Ultra Ethernet Consortium (UEC) specification, Thor Ultra gives customers the ability to scale AI workloads with unparalleled performance and efficiency in an open ecosystem, Broadcom said. XPUs are custom-designed high-performance processing units for AI data centers.

“Thor Ultra delivers on the vision of Ultra Ethernet Consortium for modernizing RDMA for large AI clusters,” said Ram Velaga, senior vice president and general manager of the Core Switching Group at Broadcom. “Designed from the ground up, Thor Ultra is the industry’s first 800G Ethernet NIC and is fully feature compliant with UEC specification.”

Thor Ultra supports data speeds of 800 gigabits per second, allowing computing infrastructure operators to build and run the large models that power AI applications such as ChatGPT.
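For a rough sense of what 800 gigabits per second means at trillion-parameter scale, consider the back-of-envelope sketch below. The model size and 2-bytes-per-parameter precision are illustrative assumptions, not figures from Broadcom.

```python
# Back-of-envelope: time to move the weights of a hypothetical
# 1-trillion-parameter model over a single 800 Gb/s link.
# Parameter count and precision are assumptions for illustration only.

link_gbps = 800                         # Thor Ultra line rate, gigabits per second
params = 1e12                           # hypothetical trillion-parameter model
bytes_per_param = 2                     # e.g. 16-bit weights

model_bytes = params * bytes_per_param  # ~2 TB of weights
link_bytes_per_s = link_gbps / 8 * 1e9  # 800 Gb/s = 100 GB/s

print(f"Model size: {model_bytes / 1e12:.1f} TB")
print(f"Transfer time over one link: {model_bytes / link_bytes_per_s:.0f} s")
```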

Thor Ultra outperforms traditional RDMA, or remote direct memory access, by introducing a suite of UEC-compliant RDMA innovations, including packet-level multipathing for efficient load balancing, out-of-order packet delivery directly to XPU memory to maximize fabric utilization, selective retransmission for efficient data transfer, and programmable receiver-based and sender-based congestion control algorithms.
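To illustrate the general idea behind these UEC-style mechanisms, the toy Python sketch below sprays packets across multiple paths, accepts them out of order at the receiver, and retransmits only what was lost. It is a conceptual illustration under assumed parameters, not Broadcom's Thor Ultra implementation.

```python
import random

NUM_PACKETS = 20      # hypothetical message split into 20 packets
NUM_PATHS = 4         # packets are load-balanced across 4 paths
LOSS_RATE = 0.1       # each packet is dropped with 10% probability

def send_over_fabric(packets):
    """Spray packets across paths; return the set that actually arrived."""
    delivered = set()
    for seq in packets:
        path = seq % NUM_PATHS           # packet-level multipathing
        if random.random() > LOSS_RATE:  # packet survives its path
            delivered.add(seq)           # receiver accepts it out of order,
                                         # keyed by sequence number
    return delivered

received = set()
outstanding = set(range(NUM_PACKETS))
rounds = 0
while outstanding:                       # selective retransmission loop:
    rounds += 1                          # resend only the packets still missing
    received |= send_over_fabric(sorted(outstanding))
    outstanding = set(range(NUM_PACKETS)) - received

print(f"All {NUM_PACKETS} packets delivered after {rounds} round(s)")
```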

The new chip came a day after Broadcom renewed market speculation about its mystery customer.

OpenAI is not the mystery $10 billion customer that Broadcom announced during its earnings call last month, Charlie Kawwas, president of the semiconductor solutions group at Broadcom, said in a CNBC interview on Monday.

Broadcom and OpenAI on Monday announced a collaboration for 10 gigawatts (GW) of custom AI chips and computing systems over the next four years. OpenAI will design its own chips, which will allow it to embed what it has learned from developing AI models directly into the hardware that underpins future systems. Broadcom will help develop and deploy the systems, which are built on its Ethernet and other connectivity solutions.

In the interview, Kawwas discussed the two companies’ new deal with OpenAI’s president, Greg Brockman, and revealed that OpenAI is not the customer behind Broadcom’s massive $10 billion order. “I would love to take a $10 billion [purchase order] from my good friend Greg,” Kawwas said. “He has not given me that PO yet.”

Kawwas didn’t specify who is scooping up $10 billion worth of Broadcom’s products. The chipmaker didn’t comment on Kawwas’ remarks.

Broadcom had secured $10 billion in orders for its custom AI chips, which it calls XPUs, from a fourth customer, CEO Hock Tan said at an earnings conference on September 4. “One of these prospects released production orders to Broadcom, and we have accordingly characterized them as a qualified customer for XPUs,” Tan said. He added that the orders will ship next year.

Given the major order from the new customer, who has “immediate and pretty substantial demand,” Broadcom upgraded its forecast for AI revenue next year, according to Tan. He told investors the chip designer’s outlook for fiscal year 2026 will improve “significantly” from what it had indicated last quarter, and that the growth rate will accelerate in a “fairly material” way. Tan had previously said AI revenue in 2026 would grow at the same rate as in the current year, namely between 50% and 60%.

While Tan didn’t name the new customer, the Financial Times reported following Broadcom’s financial results that OpenAI was the latest customer. Bloomberg echoed that the client was OpenAI, citing people with knowledge of the matter. According to the sources, Broadcom’s custom chip program is intended to ease supply constraints on Nvidia Corporation’s AI chips and to allow OpenAI to develop processors tailored for its models.
