local-llm
Here are 17 public repositories matching this topic...
Nous: A privacy-focused personal knowledge assistant using local LLMs to securely interact with your documents and enhance information retrieval.
Updated Sep 3, 2024 - Go
🤖 The free, open-source alternative to OpenAI, Claude, and others. Self-hosted and local-first. Drop-in replacement for OpenAI, running on consumer-grade hardware with no GPU required. Runs gguf, transformers, diffusers, and many other model architectures. Features: text, audio, video, and image generation, voice cloning, and distributed P2P inference.
Updated Jan 7, 2025 - Go
Stop paying for AI APIs during development. LocalCloud runs everything locally - GPT-level models, databases, all free.
Updated Jul 8, 2025 - Go
A high-performance task queue management system for Ollama models, built in Go.
Updated Aug 24, 2025 - Go
Read PGS (Blu-ray) and VobSub (DVD) image subtitles and extract their text using external vision-language models.
Updated Oct 31, 2025 - Go
AI-Native Autoscaler for Docker Compose — built with cagent + MCP + Model Runner.
Updated Nov 1, 2025 - Go
Kubernetes operator for GPU-accelerated LLM inference - air-gapped, edge-native, and production-ready.
Updated Dec 8, 2025 - Go
Local agentic CLI coding assistant
Updated Dec 7, 2025 - Go
🚀 Enterprise-grade AI coding assistant with local AI processing, GitHub integration, and web scraping capabilities. Built with Go and powered by Ollama.
Updated Dec 9, 2025 - Go
EduSphere is an AI-powered academic assistant that turns raw transcripts into personalized insights, course paths, and scholarship matches, powered by local LLM inference. Built with a Golang Fiber backend and a React (Vite) frontend, it integrates generative AI reasoning, natural language interaction, and real-world data into a cohesive, production-grade application.
Updated Dec 13, 2025 - Go
Local LLM proxy, DevOps-friendly
Updated Dec 14, 2025 - Go