AMD Unveils Instinct MI350P: PCIe Version Delivers Open-Source AI Compute to Existing Servers


Santa Clara, CA – February 2025 – AMD today announced the Instinct MI350P, a PCIe add-in card designed to bring high-performance open-source AI and compute capabilities to existing PCIe 5.0 air-cooled servers. The move marks a significant shift for the MI350 series, which until now relied on the Open Accelerator Module (OAM) form factor.

“The MI350P is a direct response to customer demand for flexible AI acceleration without requiring new infrastructure,” said Dr. Lisa Su, CEO of AMD. “Data centers can now slot this card into their current servers and instantly tap into AMD’s latest compute innovations.”

Industry analyst Patrick Moorhead of Moor Insights & Strategy called the announcement “a game-changer for enterprise AI adoption. It removes the friction of OAM-based integration and opens the door for faster, lower-cost deployment of open-source models.”

Background

The AMD Instinct MI350 series originally launched in the OAM form factor, optimized for dense, liquid-cooled clusters. Many data centers, however, operate air-cooled racks built around standard PCIe 5.0 slots and cannot easily accommodate OAM modules.

Enter the MI350P: a standard PCIe add-in card that slots into any compatible air-cooled server. It delivers the same compute cores as the MI350 OAM variant but in a more accessible form factor. This aligns with AMD’s commitment to open-source AI ecosystems, including ROCm support.

The company also confirmed the Instinct MI400 series remains on track for later this year, targeting next-generation AI workloads.

What This Means

For IT managers, the MI350P means they can upgrade compute capabilities without forklift server replacements. Air-cooled PCIe 5.0 racks—common in mid-to-large enterprises—can now run cutting-edge AI inference and training locally.

The card supports 8-way GPU configurations for multi-instance environments, and is fully compatible with AMD’s ROCm open-source software stack. This allows developers to run PyTorch, TensorFlow, and other AI frameworks unchanged.
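As a rough illustration of what an 8-way multi-instance setup implies on the scheduling side, the sketch below round-robins inference batches across eight device slots. The device count and the `gpu:N` naming are assumptions for illustration only; in a real ROCm/PyTorch deployment these would map to framework device handles such as `torch.device("cuda:N")` on ROCm builds.

```python
# Illustrative sketch: spreading work across an assumed 8-way
# MI350P configuration. Device IDs here are placeholders, not
# an AMD-documented API.
from itertools import cycle

NUM_GPUS = 8  # assumed 8-way configuration

def assign_batches(batches):
    """Round-robin each batch onto one of the available devices."""
    devices = cycle(range(NUM_GPUS))
    return [(batch, f"gpu:{next(devices)}") for batch in batches]

# Example: ten batches spread across eight devices; the ninth
# batch wraps back around to gpu:0.
placements = assign_batches([f"batch-{i}" for i in range(10)])
```

The round-robin policy is only one choice; load-aware schedulers would weigh per-device queue depth instead.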

“We see this as a bridge to the future,” added Forrest Norrod, SVP and GM of AMD’s Data Center Solutions Group. “The MI350P lets customers start their AI journey on their terms, then seamlessly migrate to MI400 OAM clusters when ready.”

Pricing and availability details are expected in Q3 2025. Early adopters include cloud service providers and academic research labs, according to AMD.

Key benefits at a glance:

- Drop-in PCIe 5.0 add-in card for existing air-cooled servers; no OAM infrastructure required
- Same compute cores as the MI350 OAM variant
- Support for 8-way GPU configurations in multi-instance environments
- Full compatibility with the open-source ROCm stack, so PyTorch, TensorFlow, and other frameworks run unchanged

AMD also emphasized the MI350P’s role in open-source AI, contrasting with proprietary accelerator ecosystems. “We believe in giving developers choice—and the MI350P embodies that,” Su said.
