
Google Turns Every Android App into an AI Agent, Launches “Always On Memory” Tool on

Written by
Renn Alvarado
AI News

Photo by Greg Bulla (unsplash.com/@gregbulla) on Unsplash

Google unveiled AppFunctions, a framework that turns every Android app into an AI-driven agent, and introduced an "Always On Memory" tool that lets users command apps such as Uber through Gemini instead of navigating them manually, according to a recent report.

Key Facts

  • Key company: Google

Google's AppFunctions framework extends the Model Context Protocol (MCP) used in Google Cloud to the mobile sphere, allowing Android developers to mark methods with an @AppFunction annotation that registers them as callable "tools" for Gemini agents. When a user issues a natural-language command—"order a coffee from Starbucks" or "set a reminder for tomorrow at 9 am"—Gemini discovers the matching function via a /tools/list endpoint and invokes it with structured arguments through /tools/call, bypassing the app's UI entirely. The approach mirrors Google's existing cloud-side tool-exposure model, as noted in the PolicyLayer report, and is already live on Pixel 10 and Galaxy S26 devices [PolicyLayer].
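The list-then-call flow can be illustrated with a small in-process sketch. This is not Google's actual API: the registry, schema shapes, and function names below are assumptions chosen to mirror the /tools/list and /tools/call pattern the report describes.

```python
# Illustrative stand-in for the MCP-style flow: register a function as a
# "tool", let an agent enumerate tools, then dispatch a structured call.
TOOL_REGISTRY = {}

def app_function(name, schema):
    """Register a plain function as a callable tool (decorator)."""
    def wrap(fn):
        TOOL_REGISTRY[name] = {"schema": schema, "handler": fn}
        return fn
    return wrap

@app_function("set_reminder", {"title": "str", "when": "str"})
def set_reminder(title, when):
    return f"Reminder '{title}' set for {when}"

def tools_list():
    # Analogous to the /tools/list endpoint: advertise name + argument schema.
    return [{"name": n, "schema": t["schema"]} for n, t in TOOL_REGISTRY.items()]

def tools_call(name, arguments):
    # Analogous to /tools/call: dispatch structured arguments, no UI involved.
    return TOOL_REGISTRY[name]["handler"](**arguments)
```

The key property is that the agent never touches the screen: it matches intent to a declared schema and passes structured arguments straight to the handler.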

For apps that have not adopted the annotation API, Google supplies a fallback UI‑automation layer that lets Gemini drive the app’s graphical interface autonomously. The agent records the screen, identifies tappable elements, and simulates user gestures to complete tasks such as filling out a ride‑hailing form in Uber. This “UI‑automation” path eliminates the need for developers to retrofit legacy code, but it also reproduces the security concerns that have plagued MCP’s server‑side tool calls, according to the same PolicyLayer analysis. Because the agent can execute arbitrary UI actions, the risk of malicious or unintended behavior hinges on the device’s permission model and the vetting of the underlying Gemini model.
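The fallback path amounts to a perceive-act loop. The sketch below is purely hypothetical: the real agent's screen-reading and gesture APIs are not public, so every class and method here is a mocked stand-in that only shows the loop's shape.

```python
class MockScreen:
    """Hypothetical stand-in for the agent's screen-reading layer."""
    def __init__(self, buttons, goal_button):
        self.buttons = buttons
        self.goal_button = goal_button
        self.tapped = []

    def visible_elements(self):
        return [{"label": b} for b in self.buttons]  # tappable elements

    def tap(self, element):
        self.tapped.append(element["label"])         # simulated gesture

    def task_complete(self, goal):
        return self.goal_button in self.tapped

def ui_automation_loop(goal, screen, max_steps=10):
    """Capture screen, find a matching element, tap, repeat until done."""
    for _ in range(max_steps):
        elements = screen.visible_elements()
        target = next((e for e in elements if e["label"] in goal), None)
        if target is None:
            return False          # nothing actionable for this goal
        screen.tap(target)
        if screen.task_complete(goal):
            return True
    return False
```

The security concern the article raises is visible even in this toy: nothing in the loop itself constrains *which* elements the agent may tap, so safety rests entirely on the surrounding permission model.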

The accompanying “Always On Memory” agent, released on GitHub, provides a lightweight on‑device memory store that can be used by small organizations to avoid provisioning a separate vector database for context retrieval [GitHub]. The repository includes a Python wrapper that persists recent interaction embeddings in a local SQLite file and exposes them to Gemini via the /agents/call API. While the code is designed for Gemini 3.1 Flash‑Lite, the readme does not prescribe a mandatory cloud endpoint, leaving open the possibility of swapping the remote Gemini API for a locally hosted model. However, the current implementation relies on the Gemini client library to handle token limits and streaming, and no documented interface exists for plugging in an arbitrary LLM without modifying the core wrapper.
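A store of this kind can be sketched in a few lines of standard-library Python. The table layout, class name, and cosine-similarity retrieval below are assumptions for illustration, not the repository's actual schema.

```python
import sqlite3, json, math

class MemoryStore:
    """Sketch of an on-device memory store: embeddings in local SQLite."""
    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS memory "
            "(id INTEGER PRIMARY KEY, text TEXT, embedding TEXT)")

    def add(self, text, embedding):
        # Persist the interaction text with its embedding (JSON-encoded).
        self.db.execute("INSERT INTO memory (text, embedding) VALUES (?, ?)",
                        (text, json.dumps(embedding)))
        self.db.commit()

    def search(self, query_embedding, k=3):
        # Rank stored memories by cosine similarity to the query embedding.
        def cosine(a, b):
            dot = sum(x * y for x, y in zip(a, b))
            na = math.sqrt(sum(x * x for x in a))
            nb = math.sqrt(sum(x * x for x in b))
            return dot / (na * nb) if na and nb else 0.0
        rows = self.db.execute("SELECT text, embedding FROM memory").fetchall()
        scored = [(cosine(query_embedding, json.loads(e)), t) for t, e in rows]
        return [t for _, t in sorted(scored, reverse=True)[:k]]
```

For a small organization, the appeal is that this replaces a provisioned vector database with a single local file; the linear scan over rows is the obvious scaling limit.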

Gemini 3.1 Flash-Lite itself supports a 1 million-token input context and a 64K-token output window, which is sufficient for most on-device memory use cases. If developers replace the remote API with a local model that respects the same token budget, the "Always On Memory" agent could continue to function, trading network round-trips for on-device compute and removing network exposure entirely. The catch is that the local model must handle the same schema-driven function calls and maintain the security sandbox that Google's cloud service enforces. As of now, no official guidance or benchmark confirms that a self-hosted model can meet these requirements, and the GitHub issue tracker shows only speculative discussion of such modifications.
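Respecting that token budget is the one constraint a drop-in local model must honor. A minimal sketch, using the article's window sizes and a crude whitespace "tokenizer" purely for illustration:

```python
INPUT_BUDGET = 1_000_000   # incoming context window (figure from the article)
OUTPUT_BUDGET = 64_000     # output window (figure from the article)

def fit_context(memories, prompt, budget=INPUT_BUDGET):
    """Evict oldest memories until prompt + memories fit the input budget."""
    count = lambda s: len(s.split())   # stand-in for a real tokenizer
    kept = list(memories)
    while kept and count(prompt) + sum(count(m) for m in kept) > budget:
        kept.pop(0)                    # drop the oldest memory first
    return kept
```

Oldest-first eviction is one plausible policy; a real implementation would likely weigh relevance scores from the memory store instead.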

Overall, AppFunctions and the Always On Memory agent represent a significant step toward making Android a first‑class platform for generative‑AI‑driven automation. By exposing app capabilities as structured tools and providing a fallback UI‑automation path, Google reduces friction for developers and end‑users alike. Yet the security model remains a critical open question, especially for the UI‑automation fallback that could be abused if not tightly controlled. The community will likely see rapid iteration on both the on‑device memory store and the integration of local LLMs, as small enterprises test the limits of a fully offline AI assistant on Android devices.

Sources

Primary source

No primary source found (coverage-based)

Other signals
  • Dev.to AI Tag
  • Reddit - r/LocalLLaMA New

This article was created using AI technology and reviewed by the SectorHQ editorial team for accuracy and quality.
