Production-ready RAG system starter kit with local LLM inference, hybrid search, and intelligent document processing - deploy AI that learns from your knowledge base in minutes.
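For illustration, a minimal sketch of the hybrid search such a starter kit typically relies on: lexical BM25 scores fused with dense embedding similarity. The libraries, model name, and fusion weight below are assumptions for the sketch, not details taken from the repository.

    # Hybrid retrieval sketch: fuse BM25 (lexical) and embedding (semantic) scores.
    # Assumes `pip install rank-bm25 sentence-transformers`; the model choice is illustrative.
    from rank_bm25 import BM25Okapi
    from sentence_transformers import SentenceTransformer, util

    docs = [
        "Ollama serves local LLMs over a simple HTTP API.",
        "BM25 ranks documents by term frequency and inverse document frequency.",
        "Sentence embeddings capture semantic similarity between texts.",
    ]

    bm25 = BM25Okapi([d.lower().split() for d in docs])
    encoder = SentenceTransformer("all-MiniLM-L6-v2")
    doc_vecs = encoder.encode(docs, convert_to_tensor=True)

    def hybrid_search(query, alpha=0.5):
        # Lexical scores, min-max normalized to [0, 1].
        lex = bm25.get_scores(query.lower().split())
        lex = (lex - lex.min()) / (lex.max() - lex.min() + 1e-9)
        # Semantic scores: cosine similarity between query and document embeddings.
        sem = util.cos_sim(encoder.encode(query, convert_to_tensor=True), doc_vecs)[0].cpu().numpy()
        # Weighted fusion of the two signals.
        fused = alpha * lex + (1 - alpha) * sem
        return sorted(zip(docs, fused), key=lambda x: -x[1])

    print(hybrid_search("how do local models rank documents?"))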
PHP frontend for hosting local LLMs (run it via VSCode or any basic PHP execution method, or add it to an existing project).
Langer is a lightweight desktop tool for translating text with multiple LLMs and evaluating them using standard metrics. It provides an easy Python/Tkinter interface, JSON batch translation, plugin-based evaluators, and support for both cloud and local LLMs.
JV-Archon is my personal offline LLM ecosystem.
Desktop AI assistant that runs 100% locally, powered by Ollama. Private and personalized, with RAG, 30+ models, dedicated support, and industry-specific SLMs for enterprise.
ScreenSafe: privacy protection and on-device PII detection.
An automated system designed to replace manual Excel tracking. It securely connects to Gmail via OAuth2 and uses a local LLM to analyze recruiter emails and update application statuses without sending data to third parties. Built with FastAPI and asyncio to handle concurrent email processing efficiently.
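As a sketch of that concurrency pattern (not the project's actual code): a few email bodies classified in parallel against a local Ollama model, with a semaphore bounding in-flight requests. The Gmail/OAuth2 plumbing is omitted, and the model name, prompt, and status labels are assumptions.

    # Concurrent classification of recruiter emails against a local Ollama model.
    # Email bodies are hard-coded stand-ins; Gmail/OAuth2 fetching is omitted.
    import asyncio
    from ollama import AsyncClient  # pip install ollama

    PROMPT = "Classify this recruiter email as one of: applied, interview, offer, rejected.\n\n{body}"

    async def classify(client, sem, body):
        async with sem:  # cap concurrent requests to the local model
            resp = await client.chat(
                model="llama3",  # assumed model name
                messages=[{"role": "user", "content": PROMPT.format(body=body)}],
            )
            return resp["message"]["content"].strip()

    async def main():
        emails = [
            "Thanks for applying, we'd like to schedule a phone screen next week.",
            "Unfortunately we have decided to move forward with other candidates.",
        ]
        client, sem = AsyncClient(), asyncio.Semaphore(4)
        results = await asyncio.gather(*(classify(client, sem, e) for e in emails))
        for email, status in zip(emails, results):
            print(status, "<-", email[:50])

    asyncio.run(main())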
Video summarizer using a local LLM (facebook/bart-large-cnn). Submit a YouTube URL and get an AI-generated summary of the video.
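A rough sketch of the summarization step with facebook/bart-large-cnn via Hugging Face transformers; the transcript string below is a stand-in for whatever text the tool actually extracts from the YouTube URL.

    # Local summarization with facebook/bart-large-cnn via the transformers pipeline.
    # Transcript retrieval from YouTube is omitted; a hard-coded string stands in.
    from transformers import pipeline  # pip install transformers

    summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

    transcript = (
        "In this video we walk through setting up a local language model, "
        "downloading the weights, and wiring it into a small web interface. "
        "We then compare latency on CPU versus GPU and discuss quantization."
    )

    summary = summarizer(transcript, max_length=60, min_length=15, do_sample=False)
    print(summary[0]["summary_text"])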
A high-performance task queue management system for Ollama models, built in Go.
Chatbot UI powered by local LLaMA3 using Ollama + Streamlit. 100% offline, no API keys needed.
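A minimal sketch of what this Ollama + Streamlit pattern usually looks like; the model name and wording are assumptions, and you would run it with streamlit run app.py.

    # Minimal Streamlit chat UI over a local Ollama model; history lives in session state.
    import ollama           # pip install ollama
    import streamlit as st  # pip install streamlit

    st.title("Local Llama 3 chat")

    if "history" not in st.session_state:
        st.session_state.history = []

    # Replay the conversation so far.
    for msg in st.session_state.history:
        st.chat_message(msg["role"]).write(msg["content"])

    if prompt := st.chat_input("Ask something"):
        st.session_state.history.append({"role": "user", "content": prompt})
        st.chat_message("user").write(prompt)

        # Send the full history to the local model and display the reply.
        reply = ollama.chat(model="llama3", messages=st.session_state.history)
        answer = reply["message"]["content"]

        st.session_state.history.append({"role": "assistant", "content": answer})
        st.chat_message("assistant").write(answer)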
An AI chatbot built with DeepScaleR and deployed locally using Ollama, FastAPI, and Gradio.
A toolkit of local, privacy-focused AI applications built with Python. Includes a RAG-powered research assistant for PDFs, various chatbots, and visualization scripts.
Meeting Mate is a local tool for transcribing and summarizing meetings conducted in Norwegian.
🤖 The free, open-source alternative to OpenAI, Claude, and others. Self-hosted and local-first. Drop-in replacement for OpenAI, running on consumer-grade hardware; no GPU required. Runs gguf, transformers, diffusers, and many other model architectures. Features: text, audio, video, and image generation, voice cloning, and distributed P2P inference.
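"Drop-in replacement" in practice means pointing the official OpenAI client at the local endpoint instead of api.openai.com. A sketch, assuming a LocalAI server on its default port and an illustrative model name taken from your own configuration:

    # Use the official OpenAI Python client against a local OpenAI-compatible server.
    # The URL, port, and model name are assumptions that depend on your setup.
    from openai import OpenAI  # pip install openai

    client = OpenAI(
        base_url="http://localhost:8080/v1",  # local server instead of api.openai.com
        api_key="not-needed",                 # the key is ignored by a local server
    )

    resp = client.chat.completions.create(
        model="llama-3-8b-instruct",  # whatever model name your server exposes
        messages=[{"role": "user", "content": "Say hello from a local model."}],
    )
    print(resp.choices[0].message.content)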
Free, offline OCR using local LLMs with Ollama. Convert images to text with vision-enabled models running entirely on your machine — no cloud, no API costs, full privacy.
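A minimal sketch of this kind of local OCR call through the Ollama Python client; the vision model (llava), prompt, and file name are assumptions.

    # OCR-style text extraction with a vision-capable model served by Ollama.
    # Runs entirely locally; requires `pip install ollama` and `ollama pull llava`.
    import ollama

    def image_to_text(image_path: str) -> str:
        response = ollama.chat(
            model="llava",  # assumed vision-enabled model
            messages=[{
                "role": "user",
                "content": "Transcribe all text visible in this image, preserving line breaks.",
                "images": [image_path],  # the client reads and encodes the file
            }],
        )
        return response["message"]["content"]

    if __name__ == "__main__":
        print(image_to_text("scanned_receipt.png"))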