The real magic of AI happens when a model stops merely describing the world and starts interacting with it. One such interaction mechanism is tool use: the ability to predict and invoke function calls, for example to open apps or change system settings.
Moving tool use on-device lets developers build interactions that respond instantly while remaining fully functional regardless of connectivity. This allows, for instance, a natural-language voice assistant to instantly create a calendar entry or navigate to a destination while you're driving.
However, bringing this level of capability to mobile remains a formidable task. Traditional function calling has historically required large models with memory footprints far exceeding mobile hardware constraints. The real engineering challenge is to compress these models into a mobile footprint while maintaining accuracy and without draining the battery.
Today, we're excited to announce several major updates to Google's on-device AI showcase app, Google AI Edge Gallery:
- Building on the cross-platform capabilities of Google AI Edge, we're delighted to bring AI Edge Gallery to iOS in addition to Android, allowing developers to explore the same high-performance, on-device AI use cases powered by Gemma and other open-weight models directly within the iOS ecosystem.
- We've brought the out-of-the-box agentic experiences, Mobile Actions and Tiny Garden, directly to Google AI Edge Gallery, showcasing how Google's efficient FunctionGemma model translates natural language directly into function calls on device in just 270M parameters.
- Building on the state-of-the-art performance benchmarks recently shared in our latest LiteRT announcement, we have now integrated benchmarking as a feature directly into the Google AI Edge Gallery app so you can measure and experience LiteRT's leading CPU and GPU performance on your own devices.
Experience Mobile Actions and Tiny Garden
The Mobile Actions demo leverages FunctionGemma to reimagine assistant interaction as a fully offline capability. It allows the model to parse natural language commands—such as “Show me the San Francisco airport on a map,” “Create a calendar event for 2:30 PM tomorrow for cooking class,” or “Turn on the flashlight”—and identify the correct OS function or app intent to execute the command.
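At its core, this pattern means the model emits a structured function call that the app routes to the right handler. Here is a minimal sketch of that dispatch step; the function names, argument fields, and JSON shape are illustrative assumptions, not FunctionGemma's actual output schema.

```python
import json

# Hypothetical handlers that would fire the corresponding OS intent.
# Names and signatures are assumptions for illustration only.
def show_on_map(location: str) -> str:
    return f"intent://maps?q={location}"

def set_flashlight(enabled: bool) -> str:
    return f"flashlight={'on' if enabled else 'off'}"

HANDLERS = {"show_on_map": show_on_map, "set_flashlight": set_flashlight}

def dispatch(model_output: str) -> str:
    """Parse a JSON function call emitted by the model and execute it."""
    call = json.loads(model_output)
    handler = HANDLERS[call["name"]]
    return handler(**call["args"])

# For "Show me the San Francisco airport on a map" the model might emit:
result = dispatch('{"name": "show_on_map", "args": {"location": "SFO"}}')
print(result)  # → intent://maps?q=SFO
```

Because the model only produces the structured call, the app retains full control over which actions are actually allowed to execute.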
The Tiny Garden demo is an interactive mini-game that lets players manage a virtual plot of land using voice commands. For example, a command like “Plant sunflowers in the top row and water them” is decomposed by the model into specific app functions (like plantCrop or waterCrop) targeting grid coordinates. This demonstrates how Google's compact 270M FunctionGemma model can adapt to highly specific, custom game or app logic directly on a mobile phone, without requiring any server pings.
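The decomposition described above can be sketched as follows. The plantCrop and waterCrop names come from the demo; the grid size, argument shapes, and game state are assumptions made for this illustration.

```python
# Hypothetical game-state sketch: one voice command is decomposed by the
# model into a sequence of function calls that the game applies to its grid.
GRID_WIDTH = 4

garden = {}  # (row, col) -> {"crop": str, "watered": bool}

def plantCrop(crop: str, row: int, col: int) -> None:
    garden[(row, col)] = {"crop": crop, "watered": False}

def waterCrop(row: int, col: int) -> None:
    garden[(row, col)]["watered"] = True

# "Plant sunflowers in the top row and water them" might decompose into:
calls = (
    [("plantCrop", {"crop": "sunflower", "row": 0, "col": c}) for c in range(GRID_WIDTH)]
    + [("waterCrop", {"row": 0, "col": c}) for c in range(GRID_WIDTH)]
)

dispatch_table = {"plantCrop": plantCrop, "waterCrop": waterCrop}
for name, args in calls:
    dispatch_table[name](**args)

print(garden[(0, 0)])  # → {'crop': 'sunflower', 'watered': True}
```

The key point is that a single utterance fans out into several concrete calls, each targeting explicit coordinates, so the game logic never has to interpret free-form text itself.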
See Mobile Actions in action on Android!
You can try this on Android and, starting today, also on iOS!
Now that you've seen these demo experiences in action, you may want to adapt this approach for your own custom use cases. For that, you can fine-tune your own version and implement function calling in the app.
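Fine-tuning for function calling is, at heart, supervised training on pairs of user utterances and the structured calls you want the model to emit. The sketch below shows one plausible shape for such a training example; the exact prompt and target format FunctionGemma expects is defined by the official fine-tuning recipes, and the function name and fields here are hypothetical.

```python
import json

# One hypothetical supervised fine-tuning example: a user utterance paired
# with the structured function call the tuned model should produce.
example = {
    "user": "Remind me to call the vet at 9 AM on Friday",
    "target": {
        "name": "create_reminder",  # hypothetical app function
        "args": {"title": "Call the vet", "time": "Friday 09:00"},
    },
}

# Serialize the target as the string the model is trained to emit, so it
# can later be parsed back with json.loads at inference time.
target_str = json.dumps(example["target"])
print(target_str)
```

Keeping targets as strict JSON makes the model's output mechanically parseable, which is what lets the app route calls without any fuzzy matching.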
Google AI Edge Gallery – now on iOS too
Building on the cross-platform capabilities of Google AI Edge, we're thrilled to bring the full experience of our Android app to the iOS ecosystem with the launch of the Google AI Edge Gallery, available in the App Store. Now, iOS developers and enthusiasts can explore the same rich, on-device features, including multi-turn AI Chat, Ask Image queries, and Audio Scribe for local transcription. Most importantly, the iOS app includes our agentic demonstrations, Mobile Actions and Tiny Garden, showcasing how sophisticated tool calling and function calling can perform seamlessly on Apple hardware. By leveraging the unified power of the Google AI Edge stack, we're ensuring that the best of on-device performance, privacy, and offline reliability is accessible to everyone, regardless of their mobile platform.
See Mobile Actions in action on iOS!
Test model performance in app
Want to see this speed in action? You can now benchmark these models directly within the Gallery app on your own devices (available on Android; coming to iOS soon).
Using Mobile Actions as an example, the performance is blazingly fast on CPU—clocking in at 1916 tokens/sec (prefill) and 142 tokens/sec (decode) on a Pixel 7 Pro.
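To put those throughput numbers in perspective, here is a back-of-the-envelope latency estimate. The prompt and response lengths are illustrative assumptions; only the tokens/sec figures come from the measurement above.

```python
# Measured CPU throughput from the Pixel numbers quoted above.
PREFILL_TPS = 1916  # tokens/sec while processing the prompt
DECODE_TPS = 142    # tokens/sec while generating output

def response_latency(prompt_tokens: int, output_tokens: int) -> float:
    """Approximate seconds to process the prompt and generate the response."""
    return prompt_tokens / PREFILL_TPS + output_tokens / DECODE_TPS

# A 256-token prompt producing a 32-token function call:
latency = response_latency(256, 32)
print(f"{latency:.2f} s")  # → 0.36 s
```

In other words, even with a sizeable prompt, a short structured function call comes back well under half a second on CPU alone.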
Here is how to run your own benchmarking tests:
- Open the menu: Tap the hamburger icon in the top-left corner of the Gallery app.
- Select Models: Tap the Models tile to see the full list of downloadable models.
- Benchmark: Hit the benchmark button and experiment with configurations—adjust prefill/decode tokens or the number of runs—to see exactly how FunctionGemma performs on your specific hardware.
Try it now on Android and see how the model performs on your phone!
Get started today
Ready to build your first local agent? Here's how you can dive in:
- Explore the demos: Download the Google AI Edge Gallery app (Android and iOS) to see Mobile Actions and Tiny Garden in action.
- Build your own: Use the fine-tuning recipes to adapt FunctionGemma to your specific app logic and customize AI Edge Gallery with your own functions.
- Join the conversation: We're constantly iterating. Check out the AI Edge Gallery on GitHub to follow our progress, report issues, or contribute to the future of on-device AI use cases!
We can't wait to see the agentic features you'll bring to life. Happy coding!
Acknowledgements
We'd like to extend a special thanks to our key contributors for their foundational work on this project: Francesco Visin, Hriday Chhabria, Jiageng Zhang, Jing Jin, Kat Black, Marissa Ikonomidis, Matthew Chan, Ravin Kumar, Rishika Sinha, Sahil Dua, Xu Chen, Na Li, Yinghao Sun, Yishuang Pang
We also gratefully acknowledge the essential contributions from the following team members: Byungchul Kim, Deepak Nagaraj Halliyavar, Fengwu Yao, Jae Yoo, Jenn Lee, Sahil Dua, Weiyi Wang, Xiaoming Hu, Yasir Modak, Yi-Chun Kuo, Yu-hui Chen, Zhe Chen
This effort was made possible by the guidance and support from our leadership: Cormac Brick, Kathleen Kenealy, Matthias Grundmann, Ram Iyengar, Sachin Kotwani






