Arm: AI will turn smartphones into ‘proactive assistants’

The British chip giant supported the development of Meta's latest AI models


Arm wants to upgrade the brains inside our devices. The chip designer — whose architectures power 99% of smartphones — envisions AI bringing a new wave of breakthroughs to our handsets.

The company outlined this plan after the release of Llama 3.2, Meta's first open-source models to process both images and text. Arm said the models run “seamlessly” on its compute platforms.

The smaller, text-based LLMs — Llama 3.2 1B and 3B — are optimised for Arm-based mobile chips. Consequently, the models can deliver faster user experiences on smartphones. Processing more AI at the edge can also save energy and costs.
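To picture what running one of these smaller models on a handset might look like, here is a minimal sketch (not taken from Arm's or Meta's announcement) using the open-source llama-cpp-python runtime to load a quantised Llama 3.2 1B checkpoint and generate text entirely on-device. The model filename and generation settings are illustrative assumptions.

# Minimal on-device inference sketch with llama-cpp-python (illustrative only).
# Assumes a quantised GGUF export of Llama 3.2 1B has already been downloaded;
# the filename below is hypothetical.
from llama_cpp import Llama

llm = Llama(
    model_path="llama-3.2-1b-instruct-q4_0.gguf",  # hypothetical local file
    n_ctx=2048,    # context window shared by the prompt and the reply
    n_threads=4,   # run inference on the device's CPU cores
)

result = llm(
    "Suggest a short reply to: 'Are we still on for lunch tomorrow?'",
    max_tokens=48,
)
print(result["choices"][0]["text"].strip())

Because inference happens locally, no prompt or personal data has to leave the device, which is where the energy and cost savings of edge processing come from.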

These enhancements offer new opportunities to scale. By increasing the efficiency of LLMs, Arm can run more AI directly on smartphones. For developers, that could lead to faster innovation.

Arm expects endless new mobile apps to emerge as a result.

LLMs will perform tasks on your behalf by understanding your location, schedule, and preferences. Routine tasks will be automated and recommendations personalised on-device. Your phone will evolve from a command-and-control tool to a “proactive assistant.”

Arm aims to accelerate this evolution. The UK-based business wants its CPUs to provide “the foundation for AI everywhere.”

Arm has an ambitious timetable for this strategy. By 2025, the chip giant wants more than 100 billion Arm-based devices to be “AI ready.”
