AMD Announces New AI Chip Positioned to Challenge Nvidia’s Market Control

A notable difference between AMD’s MI300X and Nvidia’s H100 is memory capacity: the AMD AI chip can use up to 192 GB of memory, while Nvidia’s chip supports 120 GB.

American multinational semiconductor company AMD (NASDAQ: AMD) has announced a new AI chip positioned to challenge its rival Nvidia (NASDAQ: NVDA). The company said on Tuesday that the MI300X is its most advanced GPU for artificial intelligence and that the chip will begin shipping to some customers later this year. On an earnings call in May, AMD CEO Lisa Su said the AI chip would start shipping in larger quantities in 2024.

Meanwhile, Nvidia currently dominates the AI chip market: analysts estimate the chipmaker holds more than 80% market share. Its H100, a roughly $10,000 chip, has become the workhorse of the artificial intelligence industry. A notable difference between AMD’s MI300X and Nvidia’s H100 is memory capacity: the MI300X can use up to 192 GB of memory, while Nvidia’s chip supports 120 GB.

In addition, AMD demonstrated the chip running Falcon, a 40-billion-parameter model. Large language models (LLMs) behind generative AI apps require a lot of memory because they run an enormous number of calculations. Su explained that “model sizes are getting much larger,” so multiple GPUs are needed to run the latest LLMs, and she noted that developers would not need as many GPUs with the added memory on AMD’s chip.
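As a rough illustration of why per-chip memory matters, the sketch below estimates how much memory a model’s weights alone occupy and how many accelerators that implies at different capacities. The 2-bytes-per-parameter assumption (fp16/bf16) and the 175-billion-parameter comparison model are illustrative choices, not figures from AMD or Nvidia, and real deployments also need memory for activations and the KV cache.

```python
# Back-of-the-envelope estimate: accelerators needed just to hold model weights.
# Assumes 2 bytes per parameter (fp16/bf16); ignores activation and KV-cache
# overhead, so real requirements are higher than shown here.
import math

def weights_gb(params_billion: float, bytes_per_param: int = 2) -> float:
    """Approximate size of the model weights in gigabytes."""
    return params_billion * bytes_per_param  # (1e9 params * bytes) / 1e9 bytes per GB

def min_gpus(params_billion: float, gpu_memory_gb: float) -> int:
    """Minimum GPU count whose combined memory can hold the weights."""
    return math.ceil(weights_gb(params_billion) / gpu_memory_gb)

for model, size_b in [("Falcon-40B", 40), ("hypothetical 175B model", 175)]:
    for capacity in (192, 120):  # MI300X vs. the H100 figure cited above
        print(f"{model}: ~{weights_gb(size_b):.0f} GB of weights -> "
              f"at least {min_gpus(size_b, capacity)} GPU(s) at {capacity} GB each")
```

Under these assumptions, Falcon-40B’s roughly 80 GB of weights fits on a single 192 GB accelerator, while a much larger model needs fewer 192 GB chips than 120 GB ones, which is the point Su was making about GPU count.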

Will AMD Overtake Nvidia’s Dominance with New AI Chip?

AMD’s announcement of its AI “accelerators” appears to be the strongest challenge Nvidia faces right now. If developers and server makers shift their attention to the new AI chip, it could open a major new market for a company best known for its traditional computer processors. It would also support AMD’s long-term growth strategy: speaking to analysts and investors last week, Lisa Su called AI the company’s “largest and most strategic long-term growth opportunity”. She added:

“We think about the data center AI accelerator [market] growing from something like $30 billion this year, at over 50% compound annual growth rate, to over $150 billion in 2027.”
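Su’s figures are roughly self-consistent: compounding $30 billion at just over 50% a year for the four years from 2023 to 2027 lands slightly above $150 billion. The quick check below uses the 50% rate and four-year horizon implied by the quote; the exact compounding is my own illustration, not AMD’s calculation.

```python
# Sanity check of the quoted forecast: $30B growing at ~50% CAGR through 2027.
market_2023_b = 30        # starting market size, in billions of dollars
cagr = 0.50               # compound annual growth rate from the quote
years = 2027 - 2023       # four compounding periods

market_2027_b = market_2023_b * (1 + cagr) ** years
print(f"Implied 2027 market size: ${market_2027_b:.1f}B")  # ~$151.9B
```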

AI chips have become the bright spot of the semiconductor industry as sales of PCs, long a major driver of semiconductor processor sales, have declined significantly.

Furthermore, AMD plans to offer an Infinity Architecture that combines eight MI300X accelerators in a single system. This is similar to systems from Nvidia and Google (NASDAQ: GOOGL) that connect eight or more GPUs in one box for AI applications.

Currently, AMD stock is trading up 1.97% at $126.98 in the pre-market session.





Ibukun is a crypto/finance writer interested in passing along relevant information in non-complex words to reach all kinds of audiences.
Apart from writing, she likes to watch movies, cook, and explore restaurants in the city of Lagos, where she resides.
