Lisa Su Shows Off AMD’s High-End Chips Designed for A.I.’s ‘Yotta-Scale’ Future
At CES 2026, AMD CEO Lisa Su used the industry’s biggest stage to outline where the next era of A.I. is headed. The A.I. industry, she said during her keynote yesterday (Jan. 5), is entering the era of “yotta-scale computing,” driven by unprecedented growth in both training and inference. The constraint, Su argued, is no longer the model itself but the computational foundation beneath it.
“Since the launch of ChatGPT a few years ago, we’ve gone from about a million people using A.I. to more than a billion active users,” Su said. “We see A.I. adoption growing to over five billion active users as it becomes indispensable to every part of our lives, just like the cell phone and the internet today.”
Global A.I. compute capacity, she noted, is on a path from zettaflops to yottaflops within the next five years. A yottaflop, a 1 followed by 24 zeros, is one thousand zettaflops. “Ten yottaflops is 10,000 times more computing power than we had in 2022. There has never been anything like this in the history of computing, because there has never been a technology like A.I.,” Su said.
Yet Su cautioned that the industry still lacks the computing power required to support what A.I. will ultimately enable. AMD’s response, she said, is to build that foundation end to end, positioning the company as an architect of the next phase of A.I. rather than a supplier of isolated components.
That strategy centers on Helios, a rack-scale data center platform designed for trillion-parameter A.I. training and large-scale inference. A single Helios rack delivers up to three A.I. exaflops, integrating Instinct MI455X accelerators, EPYC “Venice” CPUs, Pensando networking and the ROCm software ecosystem. The emphasis is on durability at scale, with systems built to grow alongside A.I. workloads rather than locking customers into closed, short-lived architectures.
AMD also previewed the Instinct MI500 Series, slated for launch in 2027. Built on the next-generation CDNA 6 architecture, the series targets up to a thousandfold increase in A.I. performance compared with the MI300X GPUs introduced in 2023.
Su stressed that yotta-scale computing will not be confined to data centers. A.I., she said, is becoming a local, everyday experience for billions of users. AMD announced an expansion of its on-device A.I. push with Ryzen AI Max+ platforms, capable of running models with up to 128 billion parameters in unified memory.
Beyond commercial products, Su tied AMD’s roadmap to public-sector priorities. Joined on stage by Michael Kratsios, President Trump’s science and technology advisor, who is slated to speak at CES later this week, she discussed the U.S. government’s Genesis Mission, a public-private initiative aimed at strengthening national A.I. leadership. As part of that effort, AMD-powered supercomputers Lux and Discovery are coming online at Oak Ridge National Laboratory, reinforcing the company’s role in scientific discovery and national infrastructure.
The keynote closed with a $150 million commitment to A.I. education, aligned with the U.S. A.I. Literacy Pledge—signaling that, in AMD’s view, sustaining yotta-scale ambition will depend as much on talent development as on silicon.
