AMD Delivers Leading AI Performance with New Instinct MI325X Accelerators
AMD (NASDAQ: AMD) unveiled its latest AI infrastructure solutions, including the AMD Instinct™ MI325X accelerators, the AMD Pensando™ Pollara 400 NIC, and the AMD Pensando Salina DPU. These technologies are designed to elevate AI performance, offering market-leading memory capacity and bandwidth alongside robust networking solutions. Supported by partners such as Dell Technologies, HPE, Lenovo, and Supermicro, the MI325X accelerators set new standards for generative AI model training and inference in data centers.
Built on AMD’s CDNA™ 3 architecture, the Instinct MI325X accelerators feature 256GB of HBM3E memory, providing 1.8x the capacity and 1.3x the bandwidth of competing products. That capacity allows the accelerators to deliver leading inference performance on models such as Mistral 7B and Llama 3.1 70B. Production shipments of the MI325X are expected in Q4 2024, with full system availability from major platform providers in Q1 2025.
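A rough back-of-the-envelope calculation shows why 256GB of on-accelerator memory matters for the models named above. The sketch below is a hypothetical estimate (not an AMD tool): it approximates weight memory as parameter count times bytes per parameter, ignoring activations and KV-cache overhead.

```python
# Sketch: estimating whether a model's weights fit in one accelerator's HBM.
# Assumption: FP16/BF16 weights at 2 bytes per parameter; real deployments
# also need memory for activations and the KV cache, not modeled here.

def weights_gb(num_params_billions: float, bytes_per_param: int = 2) -> float:
    """Approximate memory needed for model weights, in GB."""
    return num_params_billions * bytes_per_param  # 1e9 params * bytes / 1e9

MI325X_HBM_GB = 256  # per the announcement

for name, params_b in [("Mistral 7B", 7), ("Llama 3.1 70B", 70)]:
    need = weights_gb(params_b)
    print(f"{name}: ~{need:.0f} GB of weights; "
          f"fits on one MI325X: {need <= MI325X_HBM_GB}")
```

By this estimate, Llama 3.1 70B needs roughly 140GB for FP16 weights, which fits within a single MI325X's 256GB but would exceed a smaller-memory part, leaving headroom for the KV cache at long context lengths.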
In addition, AMD previewed its next-generation Instinct MI350 series accelerators, which promise a 35x improvement in inference performance over previous models, and will be available in the second half of 2025.
Advanced AI Networking Solutions
To address the growing demand for AI networking efficiency, AMD introduced the Pensando Salina DPU and Pollara 400 NIC. The Salina DPU, offering 2x generational performance, ensures high-speed data transfer for front-end AI networking, while the Pollara 400 NIC, the first Ultra Ethernet Consortium (UEC) ready AI NIC, manages backend networks, facilitating accelerator-to-accelerator communication. Both products will be available in 2025.
Expanded AI Software Ecosystem
AMD continues to strengthen its AI software capabilities with the ROCm™ 6.2 open software stack, enabling enhanced performance for generative AI models. The latest updates bring up to 2.4x performance improvements on inference and 1.8x on training for popular generative models such as Stable Diffusion and large language models (LLMs) like Llama.