It’s safe to say that Qualcomm (QCOM) just threw down a massive AI gauntlet.
On Oct. 27, the 40-year-old chipmaker showed off its brand-new AI200 (2026) and AI250 (2027) accelerator cards, complemented by full liquid-cooled rack systems tailor-made for AI inference at data-center scale.
Each rack draws 160kW of power, and each of Qualcomm’s new cards supports an impressive 768GB of memory, backed by direct liquid cooling.
These chips are engineered to efficiently handle large language and multimodal workloads, with Qualcomm already securing its first major customer, Humain from Saudi Arabia (a 200MW deployment starting in 2026).
Investors were ecstatic over Qualcomm’s AI foray, hardly waiting for the ink to dry.
The company’s stock jumped 11.1% to $187.68, after climbing as much as 20% intraday to a 15-month high. The rally adds to a 22% year-to-date gain for the tech behemoth.
However, amid the euphoria, one top analyst’s verdict lands like a quiet shock, suggesting the real story runs far deeper than the way investors are framing it at this point.
Top analyst reminds Wall Street: This isn’t Qualcomm’s first AI rodeo
Bernstein’s Stacy Rasgon feels the “newcomer” hype around Qualcomm’s latest AI breakthrough isn’t warranted.
On CNBC Oct. 28, the veteran tech analyst, who has an outperform rating and a $185 price target on the stock, pointed out that Qualcomm has been selling AI accelerators for years.
“These look like next-generation parts,” Rasgon said of Qualcomm’s new AI inference systems. “People forget they’ve actually sold AI accelerators for quite a while.”
More Nvidia:
- IonQ CEO just threw a curveball at Nvidia
- Why Nvidia’s Vera Rubin may unleash another AI wave
- Nvidia just scored a massive AI win, but CEO Huang has regrets
Clearly, this isn’t Qualcomm’s first AI chip.
It first launched the Cloud AI 100 inference accelerator back in 2019-2020 (and later the Cloud AI 100 Ultra). It has also shipped AI NPUs for years inside its popular Snapdragon chips for phones and PCs.
However, this time the scale and targets are different.
Qualcomm is evolving its offerings from edge-level inference to rack-scale, data-center AI, targeting a much larger, enterprise-grade opportunity.
Related: AMD quietly cracks open quantum opportunity
That evolution matters immensely: Rasgon believes inference, not training, is exactly where the real volume lies, and where apps can potentially scale to millions of users.
Also, the total addressable market is likely to be a lot “more fragmented” than training, creating more room for players like Qualcomm to grab a bigger slice of the pie.
Takeaways on Qualcomm’s AI breakthrough:
- Not Qualcomm’s first AI chip: Analyst Stacy Rasgon reminds investors the company’s been in the AI accelerator game since 2019, long before the AI inference announcement.
- From phones to data centers: The new AI systems mark a massive five-year leap from edge-level devices to rack-scale inference.
- Massive upside optionality: Rasgon says Wall Street still assigns Qualcomm’s AI business a valuation that leaves plenty of room for a re-rating.
The AI inference gamble could redraw the chip map
Qualcomm’s latest AI move has everything to do with redefining where the real AI volume lies.
Nvidia still rules the roost in training, but as models scale to millions of users, compute shifts squarely to inference, which is where trained models actually run.
That’s exactly the space Qualcomm is looking to dominate.
Related: Bank of America finds surprise twist in AI, job market
It’s chasing the TCO (total cost of ownership) wedge: cutting costs and power at rack scale to tempt hyperscalers and new AI clouds to look beyond GPUs. If it executes well, Qualcomm could evolve from “mobile-efficient” to “rack-efficient,” becoming a juggernaut in performance-per-watt across inference workloads.
Financially, the impact comes much later.
Qualcomm’s AI data center sales currently remain far smaller than its potent QCT engine (handsets, auto, IoT). Management has also hinted that a major hyperscaler win could start yielding sales in fiscal 2028, expanding the TAM and earnings mix later, not now.
Nonetheless, StreetPro veteran fund manager Chris Versace added a surprise twist on Oct. 27. Following Qualcomm’s blockbuster AI chip reveal, which sent shares soaring, he lifted his price target to $205 but also trimmed his holdings, citing a “prudent register ring.”
Versace called the move a “2026 event,” seeing long-term upside but advising near-term caution.
Related: Legendary fund manager Ray Dalio now has an AI clone


