# on-device-ai

Here are 18 public repositories matching this topic...

On-device AI SDK powering ToolNeuron — LLM chat & tool calling (llama.cpp), Stable Diffusion image generation (QNN/MNN), image processing (upscale, segment, inpaint, depth, style), and TTS. Native C++ + Kotlin JNI. Fork or clone to use in your own app.

  • Updated Mar 6, 2026
  • C++

Run powerful AI models entirely on your Android device. 100% offline, private, no API keys. Built with Kotlin, Jetpack Compose, Material 3, and llama.cpp. Download GGUF models from Hugging Face. On-device LLM inference for Android.

  • Updated Mar 12, 2026
  • C++

onenm_local_llm is a Flutter plugin that simplifies on-device language model inference on Android using llama.cpp. It removes the complexity of setting up native runtimes, model loading, and inference pipelines, so developers can integrate local AI into their apps through a simple API.

  • Updated Mar 19, 2026
  • C++
