On Thursday, AMD officially released its newest AI chip, the Instinct MI325X, positioning the product directly against Nvidia’s dominant data center GPUs. The MI325X will compete with Nvidia’s upcoming Blackwell chips, which are set to begin shipping early next year. The launch is AMD’s strategic move to capture a larger share of the booming AI chip market, which Bloomberg estimates will reach $500 billion by 2028.
AMD has long been the runner-up in data center GPUs, and with the MI325X it is looking to force Nvidia’s hand. Nvidia currently holds over 90% of this market, and soaring demand for its GPUs, driven largely by AI applications such as OpenAI’s ChatGPT, has allowed it to command high margins. Credible competition from AMD could put pressure on Nvidia’s pricing.
CEO Lisa Su said that demand for AI has exceeded expectations and that investment across the industry continues to grow at a rapid pace. AMD did not announce any major new cloud customers, but it already partners with Meta and Microsoft, and its AI chips are used in some OpenAI workloads.
The biggest challenge AMD currently faces is Nvidia’s proprietary CUDA programming language, which has become the de facto standard for AI developers and effectively locks them into Nvidia’s ecosystem. To counter this, AMD has been expanding the capabilities of its ROCm software to make it easier for developers already working on Nvidia hardware to switch. AMD claims the MI325X delivers up to 40% better performance than Nvidia’s H200 when running Meta’s Llama AI models, thanks to its larger, more advanced memory.
The company also debuted its 5th Gen EPYC CPUs, further solidifying its position in the data center. Together, these announcements suggest that AMD plans to go head-to-head with both Nvidia and Intel in the AI and data center markets.