Johannes Bechberger 11/5/2025

Running an LLM on an Android Phone


This technical article details the creation of an open-source Android app that runs AI models, such as Google's Gemma, directly on a device using the MediaPipe API. It explains how to download and install models, test the app's web endpoints, and use it for tasks like on-device object detection and prompting LLMs, comparing results from different models.
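The core idea the article covers can be sketched with MediaPipe's LLM Inference API for Android. This is a minimal, hedged sketch, not the article's actual code: it assumes a Gemma `.task` model file has already been downloaded to the device, and the model path shown is hypothetical.

```kotlin
import android.content.Context
import com.google.mediapipe.tasks.genai.llminference.LlmInference

// Loads an on-device LLM and runs a single prompt through it.
// Assumes the model file was fetched beforehand; the path is illustrative.
fun promptOnDevice(context: Context, prompt: String): String {
    val options = LlmInference.LlmInferenceOptions.builder()
        .setModelPath("/data/local/tmp/llm/gemma-2b-it.task") // hypothetical location
        .build()

    // Model loading is expensive; a real app would create this once
    // and reuse it across prompts instead of per call.
    val llm = LlmInference.createFromOptions(context, options)
    val response = llm.generateResponse(prompt)
    llm.close()
    return response
}
```

Because inference blocks for seconds on phone hardware, an app would typically call this from a background coroutine rather than the main thread.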

