local-llm
Here are 364 public repositories matching this topic...
A Python project that deploys a local RAG chatbot using the Ollama and vLLM APIs. It refines answers with an internal RAG knowledge base, using both embedding and rerank models to improve the accuracy of the context provided to the LLM.
Updated Sep 23, 2025 - Python
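The embed-then-rerank flow the entry above describes can be sketched in a few lines. This is a stdlib-only illustration, not the repository's code: the chunk texts, embedding vectors, and term-overlap reranker are hypothetical stand-ins for real embedding and rerank models.

```python
import math

# Hypothetical document chunks with precomputed embeddings
# (a real pipeline would get these from an embedding model).
CHUNKS = [
    ("Ollama serves local models over an HTTP API.", [0.9, 0.1, 0.0]),
    ("vLLM is a high-throughput inference engine.", [0.1, 0.9, 0.0]),
    ("RAG augments prompts with retrieved context.", [0.2, 0.2, 0.9]),
]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def retrieve(query_vec, k=2):
    # Stage 1: embedding similarity narrows the candidate set.
    ranked = sorted(CHUNKS, key=lambda c: cosine(query_vec, c[1]), reverse=True)
    return ranked[:k]

def rerank(query_terms, candidates):
    # Stage 2: stand-in reranker; real systems score (query, chunk)
    # pairs with a cross-encoder model instead of term overlap.
    return sorted(candidates,
                  key=lambda c: sum(t in c[0].lower() for t in query_terms),
                  reverse=True)

query_vec = [0.1, 0.3, 0.8]  # pretend embedding of the user question
best = rerank(["rag", "context"], retrieve(query_vec))[0]
print(best[0])  # the chunk handed to the LLM as context
```

The two-stage shape is the point: cheap vector similarity prunes first, then a more expensive reranker orders the survivors before they reach the model's prompt.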
💻🔒 A local-first full-stack app to analyze medical PDFs with an AI model (Apollo2-2B), ensuring privacy & patient-friendly insights — no external APIs or cloud involved.
Updated Sep 23, 2025 - Python
A fully offline NLP pipeline for extracting, chunking, embedding, querying, summarizing, and translating research documents using local LLMs. Inspired by the fictional mystery of Dr. X, the system supports multi-format files, local RAG-based Q&A, Arabic translation, and ROUGE-based summarization — all without cloud dependencies.
Updated Sep 23, 2025 - Python
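Chunking, the first stage such extract-chunk-embed pipelines run after text extraction, can be sketched with a sliding character window. A stdlib-only sketch; the size and overlap values are illustrative, not the repository's settings:

```python
def chunk_text(text, size=40, overlap=10):
    """Split text into fixed-size character windows with overlap,
    so sentences cut at a boundary still appear whole in one chunk."""
    if size <= overlap:
        raise ValueError("size must exceed overlap")
    step = size - overlap
    return [text[i:i + size] for i in range(0, max(len(text) - overlap, 1), step)]

doc = "Dr. X vanished. The notes were encrypted. Only the pipeline could read them."
chunks = chunk_text(doc)
print(len(chunks), chunks[0])
```

Each chunk would then be embedded and stored; the overlap is what keeps a sentence that straddles a window boundary retrievable.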
A Python agent connecting a local LLM to a local SearxNG instance for web search
Updated Sep 23, 2025 - Python
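An agent like the one above typically talks to SearxNG through its JSON search API (`/search?format=json`, which the instance must have enabled in its settings). A minimal sketch of building the query URL; the base URL and engine names are placeholders, and the actual fetch plus feeding results into the LLM prompt is left out:

```python
from urllib.parse import urlencode

def searxng_query_url(base, query, engines=None):
    # SearxNG exposes a JSON search API at /search?format=json.
    params = {"q": query, "format": "json"}
    if engines:
        params["engines"] = ",".join(engines)  # e.g. restrict to specific engines
    return f"{base.rstrip('/')}/search?{urlencode(params)}"

url = searxng_query_url("http://localhost:8080", "local llm inference")
print(url)
```

The agent would fetch this URL (e.g. with urllib.request), then pass the returned result titles and snippets to the local LLM as grounding context.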
🔍 Establish unique identities for AI agents using Solana blockchain and NFTs, enhancing security and accountability in the digital landscape.
Updated Sep 23, 2025
🤖 Bootstrap a self-hosted AI and low code environment with Ollama, Open WebUI, and Supabase for seamless local LLM development and integration.
Updated Sep 23, 2025 - Python
🚀 Extract and vectorize your Cursor chat history, enabling efficient search through a Dockerized FastAPI API with LanceDB integration.
Updated Sep 23, 2025 - Python
QuietPrompt is a local-first AI tool for coding. Capture screen text, voice, or typed prompts and run them offline with your LLM; no cloud 🐙
Updated Sep 23, 2025 - C#
AetherShell is an AI-driven Linux assistant that executes natural language commands offline using a local LLM. Ideal for seamless shell interaction. 🐙💻
Updated Sep 23, 2025 - Python
Local Deep Research achieves ~95% on SimpleQA benchmark (tested with GPT-4.1-mini). Supports local and cloud LLMs (Ollama, Google, Anthropic, ...). Searches 10+ sources - arXiv, PubMed, web, and your private documents. Everything Local.
Updated Sep 23, 2025 - Python
The all-in-one Desktop & Docker AI application with built-in RAG, AI agents, No-code agent builder, MCP compatibility, and more.
Updated Sep 23, 2025 - JavaScript
Run open-source/open-weight LLMs locally with OpenAI-compatible APIs
Updated Sep 23, 2025 - Rust
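Because servers like this speak the OpenAI chat-completions wire format, any client can target them by just switching the base URL. A minimal sketch of the request shape; the model name and port are placeholders for whatever the local server actually exposes, and the send itself is commented out so the sketch runs without a server:

```python
import json

def chat_request(model, user_msg, base_url="http://localhost:1234/v1"):
    """Build an OpenAI-style chat-completions request for a local server."""
    url = f"{base_url}/chat/completions"
    body = {
        "model": model,
        "messages": [{"role": "user", "content": user_msg}],
        "temperature": 0.2,
    }
    # To actually send (requires a running local server):
    # import urllib.request
    # req = urllib.request.Request(url, data=json.dumps(body).encode(),
    #                              headers={"Content-Type": "application/json"})
    # with urllib.request.urlopen(req) as resp:
    #     reply = json.load(resp)["choices"][0]["message"]["content"]
    return url, json.dumps(body)

url, payload = chat_request("local-model", "Say hello.")
print(url)
```

The same payload works unchanged against any OpenAI-compatible endpoint, which is what makes these local servers drop-in swaps for hosted APIs.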
A curated list of awesome platforms, tools, practices, and resources that help you run LLMs locally
Updated Sep 23, 2025
'afm' CLI: a macOS server and single-prompt mode that exposes Apple's Foundation Models through OpenAI-compatible API endpoints. Also supports Apple Vision and single-command (non-server) inference with piping.
Updated Sep 23, 2025 - Swift
💻 A simple, practical, and lightweight local AI chat client, written in Tauri 2.0 & Next.js.
Updated Sep 23, 2025 - TypeScript
LM Studio MCP with Expert Prompts and Custom Prompting Capability
Updated Sep 23, 2025 - TypeScript