"I am encouraged with the progress that we're making on hardware and software and certainly with the customer set,” said Lisa Su. The post 2024 is the Year of AMD appeared first on Analytics India Magazine.
AMD has clearly positioned itself for a strong 2024. From partnerships with leading companies such as Microsoft, Oracle, and Meta around the MI300X, to spearheading the AI PC push with Ryzen AI, to its software play with ROCm, the company is optimistic about tapping into the AI boom.
“I think what we’ve seen is the adoption rate of our AI solutions has given us confidence in not just the Q4 revenue number, but also sort of the progression as we go through 2024,” said Lisa Su, the CEO of AMD.
Su announced on a recent earnings call that AMD expects revenue of around $400 million from GPUs in the fourth quarter, and to exceed $1 billion by the end of 2024. “This growth would make MI300 the fastest product to ramp to $1 billion in sales in AMD history,” she said.
Su now expects the data centre AI market to reach around $400 billion by 2027, roughly 2.7 times her previous estimate of $150 billion for the same period.
Su highlighted that the market is huge and that there will be multiple winners. “From our standpoint – we’re playing to win and we think the MI300 is a great product, but we also have a strong road map beyond that for the next couple of generations.”
“But overall, I would say that I am encouraged with the progress that we’re making on hardware and software and certainly with the customer set,” she said.
Ryzen AI PCs
Leading OEMs such as Acer, Asus, Dell, HP, Lenovo, and Razer are set to ship laptops with the Ryzen 8040 Series processors announced at AMD’s Advancing AI event. Alongside the chips, AMD introduced the Ryzen AI 1.0 software for deploying models on the hardware, making it more convenient for users to run AI workloads in a variety of computing scenarios.
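Ryzen AI 1.0 routes models to the on-chip AI engine through ONNX Runtime. A minimal sketch of what such a deployment could look like is below; the Vitis AI execution provider name, the `model.onnx` path, and the float32 dummy input are assumptions for illustration, not AMD’s documented workflow.

```python
# Hypothetical sketch: running a model through ONNX Runtime on a Ryzen AI PC.
# Assumes the Ryzen AI software stack exposes the Vitis AI execution provider
# and that "model.onnx" is a locally available, already-quantised model.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "model.onnx",
    providers=["VitisAIExecutionProvider", "CPUExecutionProvider"],  # CPU fallback
)

# Build a dummy input matching the model's first input signature
# (dynamic dimensions are replaced with 1; float32 is assumed).
inp = session.get_inputs()[0]
shape = [d if isinstance(d, int) else 1 for d in inp.shape]
dummy = np.random.rand(*shape).astype(np.float32)

outputs = session.run(None, {inp.name: dummy})
print("Output shapes:", [o.shape for o in outputs])
```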
The significance of the mobile CPU market is likely to grow as well, especially with formidable competitors in the space. AMD’s emphasis on chiplet technology has proven beneficial, evident in better silicon yields compared to Intel’s monolithic approach. Intel has acknowledged this advantage and introduced its first chiplet-based lineup, Meteor Lake, which is still new.
In PCs, there are now more than 50 notebook designs powered by Ryzen AI in the market, said Su. “We are working closely with Microsoft on the next generation of Windows that will take advantage of our on-chip AI Engine to enable the biggest advances in the Windows user experience in more than 20 years.”
AMD still holds the upper hand here, thanks to its longer and deeper experience with chiplet technology.
Then it’s about GPUs
“People believe that AMD GPUs are not that suited for machine learning, but the company has been increasingly proving everyone wrong,” Jungwhan Lim, head of the AI group at Moreh, told AIM. AMD GPUs have seen a surge in community adoption, proving their mettle in AI.
The testimonial from Moreh is just one. Several AI companies and startups are partnering with AMD to prove its prowess in the market. Databricks, which is giving the big players in the AI race a close run, tested AMD GPUs throughout 2023 and only disclosed it later. The same is true of Lamini, another AMD partner.
All of these companies tell a similar story: AMD is rising, in part because GPUs are in short supply. Gregory Diamos, co-founder of Lamini, said, “We have figured out how to use AMD GPUs, which gives us a relatively large supply compared to the rest of the market.”
Beyond GPUs, AMD is strategically partnering with the Ultra Ethernet Consortium (UEC) to enhance its inter-chip networking technology and challenge NVIDIA’s dominance. The collaboration involves Broadcom’s next-generation PCIe switches supporting AMD’s Infinity Fabric technology for faster data transfer between processors.
The open software approach
Su also highlighted that Lamini is enabling enterprise customers to easily deploy production-ready LLMs, fine-tuned on their own data, on Instinct MI250 GPUs with minimal code changes. Lamini’s co-founders claimed they have the most competitive performance per dollar on the market right now, “because we figured out how to use AMD GPUs to get software parity with CUDA, trending beyond CUDA.”
Then comes the software approach. It has been widely argued that NVIDIA’s real moat in AI has always been CUDA, its parallel computing framework. But AMD is closing in, slowly but surely, with ROCm, which it has made a focus. “As important as the hardware is, software is what really drives innovation,” Lisa Su said of ROCm.
For this, the company has acquired Nod.ai and Mipsology, which are helping it build a software stack on par with NVIDIA’s.
“ROCm runs out of the box from day one,” said Ion Stoica, co-founder of Databricks, noting that it was very easy to integrate within the Databricks stack after the acquisition of MosaicML, with just a little optimisation. “We have reached beyond CUDA,” said Sharon Zhou, co-founder of Lamini, adding that ROCm is production-ready.
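Part of why ROCm can feel like it runs “out of the box” for PyTorch-based stacks is that ROCm builds of PyTorch are surfaced through the familiar torch.cuda API, so code written against NVIDIA GPUs typically runs without changes. The sketch below shows that parity check; the tensor sizes are arbitrary and this is only an illustrative example, not the workflow Databricks or Lamini describe.

```python
# Minimal sketch: the same PyTorch code path works on CUDA and ROCm builds,
# because ROCm (HIP) is exposed through the torch.cuda namespace.
import torch

if torch.cuda.is_available():
    # torch.version.hip is set only on ROCm builds; it is None on CUDA builds.
    backend = "ROCm/HIP" if torch.version.hip else "CUDA"
    print(f"Accelerator backend: {backend}, device: {torch.cuda.get_device_name(0)}")
    device = torch.device("cuda")
else:
    device = torch.device("cpu")

# An unmodified "CUDA" workload: a matrix multiply on whichever GPU is present.
a = torch.randn(1024, 1024, device=device)
b = torch.randn(1024, 1024, device=device)
c = a @ b
print(c.shape, c.device)
```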
All of this comes as NVIDIA plans to ship its GH200 AI accelerators starting in the first quarter of 2024. And though CUDA can still be touted as ahead of ROCm on performance, AMD is hell-bent on making its offering better.