NEWS  /  Analysis

Qualcomm Stock Soars Over 20% Midday after Unveiling First Rack-Level AI Chips for Datacenters

By LiDan  Oct 28, 2025, 4:44 a.m. ET

Qualcomm's AI200 and AI250 chip-based accelerator cards and racks are expected to be commercially available in 2026 and 2027, respectively. Saudi Arabian AI company Humain will be the first customer to deploy these offerings, with a goal of 200 megawatts' worth of systems.

AsianFin -- Qualcomm Technologies Inc. shares soared as much as nearly 22%, their biggest intraday gain since 2019, and closed 11.1% higher on Monday. The stock rallied to its highest close since July 23, 2024, after the titan known for mobile-device chips unveiled products that will feed into the datacenter build-out and compete directly with the artificial intelligence (AI) accelerator leaders Nvidia Corporation and Advanced Micro Devices Inc. (AMD).


Credit: Qualcomm

Qualcomm announced the launch of the AI200 and AI250 chip-based accelerator cards and racks, which are expected to be commercially available in 2026 and 2027, respectively. These data center chips are built on the AI components of Qualcomm's smartphone chips, the Hexagon neural processing units (NPUs). The AI200 and AI250 solutions deliver rack-scale performance and superior memory capacity for fast data center generative AI inference at industry-leading total cost of ownership (TCO), said the chipmaker.

Qualcomm’s AI200 introduces a purpose-built rack-level AI inference solution designed to deliver low TCO and optimized performance for large language model (LLM) and multimodal model (LMM) inference and other AI workloads. It supports 768 GB of LPDDR per card for higher memory capacity and lower cost, enabling exceptional scale and flexibility for AI inference.

The AI250 will roll out with “an innovative memory architecture based on near-memory computing”, Qualcomm said. The solution can significantly boost efficiency and performance for AI inference workloads by delivering greater than 10 times higher effective memory bandwidth and much lower power consumption. This enables disaggregated AI inferencing for efficient utilization of hardware while meeting customer performance and cost requirements.

The rack solutions for both AI200 and AI250 include direct liquid cooling; the Peripheral Component Interconnect Express, or PCIe, standard for high-speed interface connection for scaling up; Ethernet for scaling out; and "confidential computing" to support running AI models securely, Qualcomm said.

Qualcomm said its new chips focus on inference, or running AI models, rather than training. The company will also sell its AI chips and other parts separately, especially for clients such as hyperscalers that prefer to design their own racks, said Durga Malladi, Qualcomm’s general manager for Technology Planning, Edge Solutions & Data Center. He said other AI chip companies could even become clients for some of Qualcomm’s data center parts, such as its central processing unit (CPU).

Qualcomm also said Saudi Arabian AI company Humain will be the first customer to deploy its AI200 and AI250 rack solutions starting next year, with a goal of 200 megawatts (MW) worth of systems.

While Qualcomm is not new to selling AI accelerators, having announced its first generation in 2019, the launches are “the first to bring a rack-scale architecture to the company's offering,” Bernstein Research’s senior analyst Stacy Rasgon said of the AI200 and AI250 in a note on Monday. "For whatever reason the company's earlier efforts drew little to no attention but today's announcement appears to be garnering more," Rasgon wrote, pointing to the stock's move, which he said is "significantly outperforming even on a strong market day."

With rack-level power consumption of 160 kilowatts, Humain's plan for 200 MWs of Qualcomm's systems translates to about 1,250 racks, Rasgon estimated, noting that it's not clear how many AI200 cards will be in each rack, or how much each rack costs.
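Rasgon's rack estimate follows directly from the two figures the article cites, as this back-of-envelope check sketches (per-rack card counts and prices were not disclosed, so only power is modeled):

```python
# Back-of-envelope check: how many 160 kW racks fit in a 200 MW deployment?
# Figures are from the article; everything else is omitted as undisclosed.
deployment_mw = 200      # Humain's planned deployment, in megawatts
rack_kw = 160            # stated rack-level power consumption, in kilowatts

racks = deployment_mw * 1_000 / rack_kw   # convert MW to kW, then divide
print(f"{racks:.0f} racks")               # -> 1250 racks
```

This matches Rasgon's figure of about 1,250 racks.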

Qualcomm’s 200-MW deal with Humain "is dwarfed by many of the other deals", such as those between OpenAI and Nvidia or AMD, "but for now we'll take it", according to Rasgon. He also expected Qualcomm to contact OpenAI CEO Sam Altman “if the opportunity were to arise.”
