LAS VEGAS, January 6, 2026 — Dr. Lisa Su, Chair and CEO of AMD, delivered one of the most anticipated keynotes at CES 2026, using the global stage to outline AMD’s expansive strategy for artificial intelligence — from data centres to PCs and edge devices — while situating the company at the forefront of the next major computing revolution.
In her opening address — the official kickoff keynote for the CES show floor — Dr. Su declared that the world has entered the “YottaScale era” of computing: an age in which AI workloads will demand computing capability measured in yottaflops (10^24 floating-point operations per second), three orders of magnitude beyond today’s zettaflop-level systems. The figure underscores how much computing power will be needed to support future AI applications spanning science, health, mobility and human-machine interaction.
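For a rough sense of the scale jump Su described, the standard SI prefixes can be compared directly (an illustrative sketch using unit definitions only, not AMD performance figures):

```python
# Standard SI prefixes applied to floating-point throughput (FLOP/s).
# Unit definitions only; none of these are AMD benchmark numbers.
EXA = 10**18    # exaflop
ZETTA = 10**21  # zettaflop: 1,000x an exaflop
YOTTA = 10**24  # yottaflop: 1,000x a zettaflop

# Moving from zettaflop- to yottaflop-class computing is a 1,000x jump.
print(YOTTA // ZETTA)  # → 1000
```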
“AI is for everyone — and it’s going to be everywhere,” Su said, framing AI not as a niche technology reserved for specialised research but as a pervasive force reshaping every industry and device.
Charting a Path to “AI Everywhere, for Everyone”
Dr. Su’s keynote emphasised AMD’s end-to-end AI portfolio and cross-industry partnerships. The company showcased its evolving infrastructure offerings, including:
- Helios rack-scale AI platform — built on AMD’s Instinct MI455X GPUs and EPYC CPUs, designed to power yotta-scale AI workloads and deliver multiple exaflops of performance in a single rack.
- New Instinct MI440X enterprise AI processors and a preview of the next-generation MI500 Series GPUs — illustrating AMD’s commitment to serving both traditional enterprise environments and cloud-scale data centres.
- Ryzen AI platforms, including the latest Ryzen AI 400 and PRO 400 Series processors — targeted at AI PC and edge AI segments — together with the Ryzen AI Halo developer platform that opens up new possibilities for AI-driven computing at the endpoint.
Dr. Su pointed to the rapid global adoption of AI — noting that usage exploded from roughly one million active users at ChatGPT’s launch to over one billion today — to highlight the urgency of scaling compute capacity. AMD projects that AI adoption could eventually extend to five billion users worldwide, a scale that will demand not only massive compute resources but also intelligent hardware and software collaboration.
Driving Compute Power and Ecosystem Collaboration
A central theme of Su’s address was how AMD intends to meet surging compute requirements. She described a future where large-scale AI models capable of advanced reasoning and simulation will rely on vastly scaled infrastructure. This includes both high-performance data-centre GPUs and optimised PC-level AI processors that bring inferencing and productivity capabilities closer to the user.
To underline AMD’s role in supporting the broader AI ecosystem, Dr. Su was joined on stage by partners — including OpenAI’s Greg Brockman — who emphasised the importance of addressing compute bottlenecks as generative AI models grow in capability and sophistication.
AI Across Devices, Experiences and Industries
Dr. Su’s keynote went beyond silicon and compute performance, framing AMD’s vision around accessible AI integration:
- AI in data centres and servers to drive enterprise-scale innovation.
- AI on consumer and creator devices, from laptops to desktops, enabling local inferencing and smarter experiences.
- AI at the edge and in embedded systems, making everyday products more intuitive and context-aware.
This holistic approach highlighted AMD’s belief that computing must evolve on multiple fronts — from backbone infrastructure supporting massive models to the devices people use daily — to truly fulfil the promise of “AI everywhere, for everyone.”
Dr. Su’s keynote set a clear directive for 2026 and beyond: the AI transformation is no longer confined to software or data centres — it is becoming an integrated, ubiquitous force across industries and devices. AMD’s strategic positioning — underpinned by ambitious yotta-scale compute goals and a broad portfolio spanning data centre to personal computing — signals how fiercely competitive and foundational the AI race has become.
