
local-inference

Here are 84 public repositories matching this topic...

Mano-P: Open-source GUI-VLA agent for edge devices. #1 on OSWorld (specialized, 58.2%). Runs locally on an Apple M4 Mac mini/MacBook, or via a compute stick; no data leaves your device. Enables purely vision-driven, cross-platform GUI automation with fully local data processing, and supports planning and executing complex multi-step tasks.

  • Updated Apr 20, 2026

Modern desktop application (Rust + Tauri v2 + Svelte 5 + Candle (HF)) that runs completely locally on your computer for communicating with AI models. No subscriptions, no data sent to the internet; just you and your personal AI assistant.

  • Updated Mar 24, 2026
  • Rust

MindSpark: ThoughtForge, a rune-forged conversation engine by RuneForgeAI, built for tiny GPT-Nothing-class minds. Through guided memory, lean cognition, and relentless refinement, it gives small local models depth, presence, and will, bringing powerful AI to edge devices, low-power hardware, and the Third Path beyond bloated machine empires.

  • Updated Apr 24, 2026
  • Python
