**Smartphones to Become Proactive Assistants with AI**
Arm, whose chip designs power 99% of smartphones, envisions a future in which AI brings a new wave of breakthroughs to our handsets. The company outlined this vision following the release of Llama 3.2, Meta’s first open-source models to process both images and text, which Arm says run “seamlessly” on its compute platform.
The smaller, text-only LLMs (Llama 3.2 1B and 3B) are optimized for Arm-based mobile chips, delivering faster, more responsive user experiences on smartphones. Processing more AI at the edge also saves energy and cost. Together, these optimizations open new opportunities to scale, allowing more AI to run directly on the smartphone rather than in the cloud.
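As a rough illustration of what on-device inference with one of these smaller models looks like, the sketch below uses the open-source llama-cpp-python bindings to run a quantized Llama 3.2 1B model on an Arm CPU. The GGUF filename, quantization level, and thread count are assumptions for illustration, not Arm tooling.

```python
# Minimal sketch: running a quantized Llama 3.2 1B model locally with the
# open-source llama-cpp-python bindings. The model filename and settings
# below are illustrative assumptions, not Arm-provided tooling.
from llama_cpp import Llama

llm = Llama(
    model_path="llama-3.2-1b-instruct-q4_k_m.gguf",  # hypothetical local file
    n_ctx=2048,   # modest context window to limit memory use on a phone
    n_threads=4,  # match the number of performance cores on the device
)

# Chat-style completion runs entirely on-device: no network round trip.
response = llm.create_chat_completion(
    messages=[
        {"role": "user", "content": "Summarize my last three notifications in one sentence."}
    ],
    max_tokens=128,
)
print(response["choices"][0]["message"]["content"])
```

Aggressive quantization (4-bit weights in this sketch) is what keeps a 1B-parameter model's memory footprint and latency within mobile-class budgets.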
By making LLMs more efficient, Arm can accelerate innovation for developers, paving the way for new mobile apps that perform tasks on the user’s behalf. Such apps could draw on the user’s location, schedule, and preferences to automate routine tasks and deliver personalized recommendations, all on-device.
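The sketch below shows one hypothetical way such an app might ground an on-device model in local context. The context fields, values, and prompt wording are illustrative assumptions; none of this is an Arm or Meta API, and a real app would read these values from the OS.

```python
# Hypothetical sketch: grounding an on-device assistant in local context.
# In practice these values would come from OS services (location, calendar),
# not hard-coded literals as shown here.
device_context = {
    "location": "Cambridge, UK",              # e.g. from the location service
    "next_event": "Design review at 14:00",   # e.g. from the calendar provider
    "preferences": "concise answers, metric units",
}

system_prompt = (
    "You are a personal assistant running entirely on this phone. "
    f"Location: {device_context['location']}. "
    f"Next event: {device_context['next_event']}. "
    f"User preferences: {device_context['preferences']}. "
    "Keep all data on-device."
)

# This string would be passed as the system message to the on-device model,
# e.g. via the create_chat_completion call in the previous sketch.
print(system_prompt)
```

Because the prompt is assembled and consumed locally, the personal data it contains never has to leave the device.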
Arm aims to accelerate this evolution, providing the foundation for AI everywhere. The company expects phones to evolve from command-and-control tools into proactive assistants. On an ambitious timetable, Arm wants more than 100 billion Arm-based devices to be AI-ready by 2025.
Key benefits of this evolution include:
- Faster, more responsive user experiences from LLMs optimized for Arm-based mobile chips.
- Energy and cost savings from processing more AI at the edge rather than in the cloud.
- Personalized, on-device assistance that draws on the user’s location, schedule, and preferences.
- A broad base of AI-ready hardware, with more than 100 billion Arm-based devices targeted by 2025.