Gemini is taking another step toward directly controlling other applications on Android phones. Google previewed the feature at Google I/O 2025 under the name Project Astra.
The goal is for the assistant to see the screen, scroll, and tap buttons on the user's behalf, handling everyday tasks and reducing manual intervention in repetitive actions.

What Project Astra is and what Google showed
During Google I/O 2025, the company showed Gemini interacting with a phone in real time. The assistant not only interpreted text and images, but also executed actions on the device.
This approach marks the leap toward AI agents: instead of merely answering queries, Gemini starts to do things for the user.
"Screen automation": the name the feature would carry
New text strings appeared in beta version 17.4.66 of the Google app for Android. They mention the internal codename "bonobo" and the term "screen automation".
According to the description, Gemini will be able to place orders or book trips using apps installed on the phone. For now, the associated link still points to a generic support page.
What tasks Gemini could perform on your phone
Screen automation would be designed for simple actions, such as placing online orders or booking rides in transportation apps like Uber or Lyft.