AMD Enters the AI Chip Race with Rival to Nvidia’s Blackwell

AMD has introduced its latest artificial intelligence chip, the Instinct MI325X, which targets Nvidia’s dominance in the data center graphics processing unit (GPU) market. The new chip will begin production by the end of 2024, as announced at an event showcasing AMD’s ambitious plans for the AI chip sector. The MI325X arrives at a critical time: demand for AI processing power, fueled by advanced generative AI models such as OpenAI’s ChatGPT, has surged, intensifying competition in the market.

Nvidia has historically maintained a significant lead in the data center GPU space, controlling around 90% of the market, with AMD in second place. With the launch of the MI325X, AMD aims to capture a greater share of this market, which is projected to grow to $500 billion by 2028. The company sees this as an opportunity to compete head-to-head with Nvidia and capitalize on the increasing demand for AI chips across industries. CEO Lisa Su emphasized that AI investment has continued to grow beyond expectations, creating a favorable environment for companies like AMD to expand their presence.

Although AMD did not announce new major cloud or internet customers for the MI325X during the event, the company has previously secured partnerships with major players such as Meta and Microsoft, which already use its AI GPUs, and OpenAI leverages AMD’s chips for some of its applications. AMD also did not disclose pricing for the MI325X, which is typically sold as part of a complete server system.

The launch of the MI325X is part of AMD's broader strategy to release new AI chips on an annual basis, allowing the company to better compete with Nvidia’s offerings. This product follows the MI300X, which began shipping late in 2023, and will soon face off against Nvidia’s upcoming Blackwell chips, set to start shipping in 2025. AMD has outlined its future AI chip roadmap, with the MI350 slated for release in 2025 and the MI400 in 2026.

The competitive landscape for AI chips has drawn significant interest from investors, particularly as the AI boom continues to drive demand. While Nvidia’s stock has surged by over 175% in 2024, AMD’s stock has risen only about 20%, reflecting the challenges AMD faces in closing the gap with its rival. One of AMD’s key hurdles is the dominance of Nvidia’s CUDA software platform, which has become the standard for AI developers and creates a barrier for those looking to switch from Nvidia to AMD chips. To address this, AMD has been working to enhance its ROCm software stack, making it easier for developers to port their AI models to its platform.
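
AMD’s pitch with ROCm is that mainstream frameworks already hide most vendor-specific details, so much of today’s model code needs little or no change to run on its GPUs. The sketch below is a minimal illustration under that assumption, using PyTorch, whose ROCm builds expose AMD GPUs through the familiar torch.cuda API; the toy model is a placeholder rather than any real workload, and the remaining porting friction typically lives in hand-written CUDA kernels rather than code like this.

```python
# Minimal sketch: the same PyTorch code can target either vendor's GPUs.
# Assumes a ROCm (or CUDA) build of PyTorch is installed; the model and
# tensor sizes are illustrative placeholders.
import torch
import torch.nn as nn

# On an Nvidia system this resolves to a CUDA device; on an AMD system
# with a ROCm build of PyTorch, the same call picks up the AMD GPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = nn.Sequential(
    nn.Linear(4096, 4096),
    nn.ReLU(),
    nn.Linear(4096, 10),
).to(device)

# A single inference pass; no vendor-specific branches are needed,
# which is the portability story ROCm is aiming for.
with torch.no_grad():
    x = torch.randn(8, 4096, device=device)
    print(model(x).shape, "computed on", device)
```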

In terms of performance, AMD has positioned the MI325X as particularly competitive for tasks where AI models are generating content or making predictions, thanks to its advanced memory design. According to AMD, the MI325X delivers up to 40% more inference performance on Meta’s Llama 3.1 AI model compared to Nvidia’s H200 chip. This positions AMD well for specific AI workloads, although Nvidia continues to lead in broader data processing tasks.

While GPUs for AI applications are a major focus for AMD, the company’s core business remains centered around its central processing units (CPUs). During the same event, AMD also launched its EPYC 5th Gen CPU series, designed to handle a wide range of data center workloads, including AI processing. These new CPUs are available in various configurations, from low-cost, low-power chips to high-performance processors with up to 192 cores, aimed at supercomputing applications.

AMD’s new CPUs are particularly well-suited for feeding data into AI workloads, which require both CPU and GPU power; nearly all GPUs rely on a host CPU to operate, making AMD’s integrated approach important for optimizing AI performance. With its EPYC 5th Gen CPUs, AMD hopes to challenge Intel, which still dominates the data center CPU market with its Xeon line of processors. However, AMD has made significant strides: its data center sales more than doubled in the second quarter of 2024, reaching $2.8 billion, with AI chips accounting for about $1 billion of that total.
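
To make that CPU-GPU division of labor concrete, here is a minimal, hypothetical PyTorch sketch (synthetic data, not tied to any EPYC or Instinct product) of the common pattern: CPU worker processes load and batch the data while the GPU runs the model, which is why data center AI servers pair capable CPUs with their accelerators.

```python
# Sketch of a CPU-to-GPU data pipeline: CPU workers prepare batches,
# the GPU consumes them. Dataset and sizes are illustrative placeholders.
import torch
from torch.utils.data import DataLoader, TensorDataset


def main() -> None:
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

    # Synthetic dataset standing in for whatever the job actually reads from disk.
    data = TensorDataset(torch.randn(10_000, 1024),
                         torch.randint(0, 10, (10_000,)))

    # num_workers: CPU processes that load and collate batches in parallel.
    # pin_memory: stages batches in page-locked RAM so host-to-GPU copies are faster.
    loader = DataLoader(data, batch_size=256, num_workers=4, pin_memory=True)

    model = torch.nn.Linear(1024, 10).to(device)

    with torch.no_grad():
        for x, y in loader:
            # The CPU-side loader keeps the GPU fed; non_blocking overlaps the
            # copy with computation when the source tensor is pinned.
            x = x.to(device, non_blocking=True)
            y = y.to(device, non_blocking=True)
            _ = model(x)


if __name__ == "__main__":
    main()
```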

As AI and data center technologies continue to evolve, AMD’s strategy involves strengthening its position in both the GPU and CPU markets, aiming to capture a larger share of the growing demand for AI and data processing solutions. By improving its AI accelerators and expanding its CPU offerings, AMD is well-positioned to compete with both Nvidia and Intel in the increasingly competitive semiconductor industry.
