For years, we’ve heard about how artificial intelligence will change the way we use phones and computers. Instead of tapping on apps and clicking on programs ourselves, AI will look through our devices itself to answer our questions. These so-called AI agents have felt just out of reach for years — but with the latest generation of phone chips, that might finally change.
At Snapdragon Summit 2025 in Maui, I sat down with Qualcomm’s AI chief Durga Malladi, who explained how the company’s newly announced Snapdragon 8 Elite Gen 5 chip will power what could be the first phones capable of running large language models approaching true AI agent performance.
“I believe we are on the cusp of more personalized assistants coming,” Malladi said. He referenced two LLMs, Paage.AI and Anything.AI, that held demos during the summit, saying that both could be used as personal assistants.
“It actually harnesses any of the local documents that you have and pictures and whatnot, everything that’s stored on-device, and just taps into it and gets some additional information,” Malladi noted.
Lining up the right LLMs and hardware is part of paving the way for AI agents. But getting people to use them is another story.
At Snapdragon Summit 2025, Qualcomm says its new chips make products ready for AI agents.
Are people ready to use AI agents?
Last year, the first laptops packing Qualcomm’s Snapdragon X Elite chips for PCs debuted. Despite not having a dramatically different UI, they still had the rudiments of generative AI interfaces with ChatGPT-like prompts. Malladi gave an example: asking his computer what the last email he sent to Qualcomm CEO Cristiano Amon was. He’s now gotten used to using prompts to control his computer.
“I think people will get used to that kind of behavior, and there’s no going back from there once you start getting used to it,” Malladi said.
But it’ll take a lot of familiarization for people to understand how to use AI agents and become accustomed to them.
“As people get more used to it,” Malladi said of AI, “you realize how much time we spend on actually doing a lot of mundane things — it’s a distraction from the main task that you have in mind.”
AI agents could help people become power users of their devices without digging deep into their settings. Malladi pointed out how often we discover features we didn’t know our phone had; phones are such complex devices that it’s no wonder we don’t use them to their full potential. Easing access to more capabilities by simply asking AI agents to handle them could be an accessibility boon, too, Malladi said.
The vision is clear: Automate the mundane and cover for moments when human error unwittingly introduces new problems. Yet generative AI has had its own struggles with accuracy, from ChatGPT including nonexistent books on a summer reading list, to hallucinating court cases that never happened, to Google’s AI Overviews telling people to eat rocks. Malladi asserted these were issues from devices with LLMs two years ago, though; guardrails have since been put in place, both by Qualcomm at the chip level and at the LLM level, he said.
We’ll have to wait and see if Malladi is right about AI agents. The technology is already here, showing up on phones and laptops using Qualcomm’s chips. And whether it arrives this year or next, AI agents’ potential impact is significant: They’ll change how we use the machines that let us interact with and contribute to the world.
“Whether it’s school and college-going kids who are growing up in this agentic AI environment, I wonder whether they’ll ever know what it used to be before that,” Malladi mused.
There’s no other way to prepare people for AI agents than to give them devices packing the technology. And at long last, phones may finally be capable of supporting them, partly because of advancements in Qualcomm’s mobile chips.
Dr. Vinesh Sukumar, senior director of AI product management at Qualcomm, trumpeted the new Snapdragon 8 Elite Gen 5 chip for its AI advancements.
Qualcomm’s chips: Advancing silicon improves AI agents
Two years ago, the company’s Snapdragon 8 Gen 3 could handle 15 tokens per second, a metric for how quickly AI can process requests. The new Snapdragon 8 Elite Gen 5 can handle up to 220 tokens per second. In short, this should allow AI assistants to process requests faster and consult more data for better responses.
“The difference between 150 tokens, which was last year, to 220 tokens this year has several seconds of latency,” said Dr. Vinesh Sukumar, senior director of AI product management at Qualcomm. “Anything beyond a second impacts the human-machine interface. We believe that’s not good, because at some point of time, the user loses interest.”
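Sukumar’s point about latency comes down to simple arithmetic: at a fixed response length, throughput in tokens per second determines how long a user waits. A quick back-of-envelope sketch, using the article’s figures and an assumed 500-token response (the response length is illustrative, not from Qualcomm):

```python
# Back-of-envelope latency from token throughput.
# Throughput figures (150 and 220 tokens/sec) are from the article;
# the 500-token response length is an illustrative assumption.
def response_latency(num_tokens: int, tokens_per_second: float) -> float:
    """Seconds to generate a response of num_tokens at a given throughput."""
    return num_tokens / tokens_per_second

last_year = response_latency(500, 150)   # last year's chip
this_year = response_latency(500, 220)   # Snapdragon 8 Elite Gen 5
print(f"150 tok/s: {last_year:.2f} s")
print(f"220 tok/s: {this_year:.2f} s")
print(f"time saved: {last_year - this_year:.2f} s")
```

For longer responses the gap widens, which is why Sukumar frames throughput gains in terms of keeping the wait under the roughly one-second threshold where users lose interest.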
In the last two years, Qualcomm switched its mobile chip from Arm-based Kryo to its own Oryon central processing units, and boosted graphical capability and neural performance. Even improvements over last year’s Snapdragon 8 Elite — 37% better CPU performance — enable AI capabilities that are “actually a perceptible difference that we’ll see,” noted Malladi.
“We’ll see a lot more always-on capabilities that will actually help you,” Malladi said.
The company’s new class of Oryon CPU debuted on its Snapdragon X Elite chip for PCs, which made its way into laptops starting in June 2024. With the latest Snapdragon X2 Elite and X2 Elite Extreme chips unveiled at this year’s Snapdragon Summit, Malladi pointed to an AI benchmark, trillions of operations per second (TOPS), to highlight the upgrade. The new chip delivers 80 TOPS, nearly double the 45 TOPS achieved by the first X Elite.
Whether those AI capabilities, combined with performance improvements over last year’s chip, will herald an era of AI agents on PCs has yet to be seen. Malladi admits that laptops with the older X Elite had the same look and feel as conventional computers: despite impressive performance and battery life, the UI didn’t change that much. Qualcomm aims for the X2 Elite and X2 Elite Extreme’s performance and TOPS increases to be enough for companies to take the leap and transition their laptops to an AI agent interface.
In other words, the hardware is no longer the roadblock to creating AI agent experiences on PCs and mobile devices — it’s now up to manufacturers to integrate them. In these very early days, there’s hardly a consensus on what AI agents will do. Device makers are still figuring out how to take advantage of what Qualcomm’s chips can do. Much is still up in the air, from how many and what size LLMs will live on a device (or connect through the cloud) to how users will actually see and interact with AI agents.
Despite that uncertainty, Malladi thinks that the AI agent starting gun has been fired.
“Are we waiting for something? No, probably not. Agentic AI is here, but you have to see it to believe it,” Malladi said. “But if you see it, you’d be like, oh yeah, I get it now.”
The Humain Horizon Pro PC has an AI-focused homescreen.
The first AI agent PC revealed at Snapdragon Summit
A day after Qualcomm unveiled its new mobile and PC chips at its Snapdragon Summit 2025 keynote, CEO Cristiano Amon sat down with Humain’s chief executive. In the bright Maui morning, sunlight spilling over the western mountains, both men focused on the silver laptop between them: Humain’s first PC with what they described as an AI agent interface.
That interface was more like a browser window with eight prompt templates ranging from the specific (stock advice) to the general (a ChatGPT-like “ask anything”). In other words, it looked more like the generative AI assistants we’re already familiar with and less like the broader AI agent vision Qualcomm has been pitching at this and previous summits. Qualcomm’s goal is simple: a centralized prompt window where people can ask questions, and the agent can search across apps and personal data to deliver answers. (Disclosure: Ziff Davis, CNET’s parent company, filed a lawsuit against OpenAI in April, alleging that the company infringed on Ziff Davis copyrights in training and operating its AI systems.)
If nothing else, it will be practice for people to start familiarizing themselves with an agent-forward interface rather than hunting for programs and answers themselves. Whether they’re typing in questions on their PC keyboard or voicing them on phones, people will have to adjust to a new paradigm of using devices.
AI agents will be used differently between PC and mobile platforms, Malladi predicts.
People will ask questions on phones, and the AI agent will run through one or more LLMs to get the answers. But the power constraints on a mobile device will limit its capabilities to integer-based processing, mainly since it’ll run on-device. The LLMs are also limited to comparatively smaller models, with parameter counts in the single-digit billions (1-, 3-, 5- or 7-billion-parameter models are the most common), which means less precision.
On PCs, the agents will focus on cloud-based models, ones run off the device. These will likely be floating-point-based rather than integer-based models, and more appealing for enterprise applications. Ultimately, they’ll be used to enhance productivity for work tasks, like generating lines of code, for example.
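The integer-versus-floating-point distinction Malladi draws is about quantization: to fit within a phone’s power budget, a model’s floating-point weights are stored as small integers, which only approximate the originals. A toy sketch of that trade-off (illustrative only; real quantization schemes used on-device are more sophisticated):

```python
# Toy sketch of the precision trade-off behind integer-based on-device
# models: float weights stored as int8 values only approximate the
# originals. Weights here are made up for illustration.
weights = [0.123, -0.456, 0.789, -0.012]

scale = max(abs(w) for w in weights) / 127          # map float range to int8
quantized = [round(w / scale) for w in weights]     # integers in [-127, 127]
recovered = [q * scale for q in quantized]          # close, but not exact

for w, q, r in zip(weights, quantized, recovered):
    print(f"{w:+.3f} -> int8 {q:+4d} -> {r:+.3f}")
```

The recovered values land near the originals but carry rounding error, which is the “less precision” Malladi refers to; cloud-hosted models on PCs can skip this step and compute in floating point directly.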
“Half the time you’re basically writing the same code that you’ve written before, and then you unnecessarily introduce some bugs,” Malladi said. AI could generate a lot of that repeated code so that programmers could focus on the novel approaches and problems that need to be solved.