Running an LLM on an Android Phone
This technical article walks through building an open-source Android app that runs AI models, such as Google's Gemma, directly on the device using the MediaPipe API. It covers downloading and installing models, testing the app's web endpoints, and using the app for tasks such as on-device object detection and prompting LLMs, comparing results across different models.
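For readers unfamiliar with the MediaPipe LLM Inference API mentioned above, the core of on-device prompting looks roughly like the following Kotlin sketch. This is a minimal illustration, not the article's actual code; the model filename and path are assumptions (any compatible model bundle pushed to the device would do), and it requires the `com.google.mediapipe:tasks-genai` dependency and an Android `Context`:

```kotlin
import android.content.Context
import com.google.mediapipe.tasks.genai.llminference.LlmInference

// Hypothetical helper: load a model from local storage and run one prompt.
// The model path below is an assumption for illustration.
fun promptOnDevice(context: Context, prompt: String): String {
    val options = LlmInference.LlmInferenceOptions.builder()
        .setModelPath("/data/local/tmp/llm/gemma-2b-it-gpu-int4.bin") // assumed path
        .setMaxTokens(512) // cap combined input + output length
        .build()

    // Loads the model weights; this is the expensive step, so in a real
    // app the LlmInference instance would be created once and reused.
    val llm = LlmInference.createFromOptions(context, options)

    // Runs inference synchronously on the device; no network call is made.
    return llm.generateResponse(prompt)
}
```

Because model loading dominates latency, a real app would typically hold the `LlmInference` instance in a long-lived scope (e.g. a ViewModel or service) rather than recreating it per request.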