🌌 Create a simple MCP Server using the Star Wars API to access characters, planets, and films for testing and integration purposes (a minimal server sketch follows this list).
A local-first, privacy-focused autonomous AI companion with Council architecture, long-term memory, Spotify, YouTube, image generation, email, calendar, and MCP integration
AI Coding Assistant CLI for offline enterprise environments - Local LLM platform with Plan & Execute architecture, Supervised Mode, and auto-update system
Personal AI notebooks: organize files and webpages, and generate notes from them. Open source, local and open data, with open model choice (including local models).
Transform natural language questions into interactive dashboards using local LLMs
A local, privacy-first résumé builder using LLMs and Markdown to generate ATS-ready DOCX files with Pandoc — no cloud, no tracking.
PDF Assistant provides tools to parse, extract, annotate, summarize, and query PDF documents. Supports OCR, split/merge, conversion and searchable exports to help build document workflows and automation.
ScreenSafe: privacy protection and on-device PII detection.
A multi-purpose document database using SQLite.
Run open-source/open-weight LLMs locally behind OpenAI-compatible APIs (see the request sketch at the end of this list).
BrainDrive Plugin for managing your Ollama servers and models.
UNOFFICIAL Simple LM Studio Web UI (Docker)
Run AI conversations locally with Ollama, LM Studio, or Amazon Bedrock
💻 A simple, practical, and lightweight local AI chat client, written in Tauri 2.0 & Next.js.
Yet another (unofficial) Ollama GUI
🌌 Advanced AI Coding Assistant for VS Code - Local LLM powered development companion
🤖 Visual AI agent workflow automation platform with local LLM integration - build intelligent workflows using drag-and-drop interface, no cloud dependencies required.
Graph RAG visualization platform with interactive knowledge graphs and real-time parameter tuning for local LLMs
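The first entry above describes an MCP server over the Star Wars API (SWAPI). As a rough illustration of what such a server involves, here is a minimal sketch using the official `@modelcontextprotocol/sdk` TypeScript package and the public `https://swapi.dev` endpoint; the tool name, parameter schema, and response shape are assumptions for illustration, not the linked project's actual code.

```typescript
// Minimal MCP server sketch (assumed layout, not the linked project's code).
// Exposes one tool that searches SWAPI for a character by name.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "swapi-server", version: "0.1.0" });

// Hypothetical tool: look up a Star Wars character via the public SWAPI.
server.tool(
  "search_character",
  { name: z.string().describe("Character name to search for") },
  async ({ name }) => {
    const res = await fetch(
      `https://swapi.dev/api/people/?search=${encodeURIComponent(name)}`
    );
    const data = await res.json();
    // Return the raw JSON results as text content for the client LLM.
    return {
      content: [
        { type: "text", text: JSON.stringify(data.results ?? [], null, 2) },
      ],
    };
  }
);

// Serve over stdio so MCP clients (e.g., desktop assistants) can launch it.
await server.connect(new StdioServerTransport());
```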
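Several entries above (the OpenAI-compatible local runner, the LM Studio web UI, and the Ollama GUIs) build on the same pattern: a local server that speaks the OpenAI chat-completions wire format. A minimal client-side sketch follows; the base URL and model name are assumptions (Ollama's OpenAI-compatible endpoint defaults to port 11434, LM Studio's local server to port 1234), so adjust them for your runner.

```typescript
// Minimal client sketch for a local OpenAI-compatible server.
// BASE_URL and MODEL are assumptions; point them at your own runner.
const BASE_URL = "http://localhost:11434/v1"; // assumed local endpoint (Ollama default)
const MODEL = "llama3.2"; // assumed locally pulled model

async function chat(prompt: string): Promise<string> {
  const res = await fetch(`${BASE_URL}/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: MODEL,
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!res.ok) throw new Error(`HTTP ${res.status}: ${await res.text()}`);
  const data = await res.json();
  // The OpenAI chat-completions format returns choices[].message.content.
  return data.choices[0].message.content;
}

console.log(await chat("Say hello in one sentence."));
```

Because the wire format is the same across these tools, swapping runners is usually just a matter of changing `BASE_URL` and `MODEL`.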