ScreenSafe: privacy protection and on-device PII detection.
UNOFFICIAL Simple LM Studio Web UI (Docker)
🌌 Create a simple MCP Server using the Star Wars API to access characters, planets, and films efficiently for testing and integration purposes.
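A tool handler for such a server might look like the sketch below, which queries the public Star Wars API (swapi.dev). The `peopleSearchUrl` and `searchCharacters` helpers are illustrative assumptions, not taken from the repository itself.

```typescript
// Sketch of an MCP-style tool handler backed by the public Star Wars API.
// The helper names and the response shape handling are assumptions for
// illustration; swapi.dev's /people/?search= endpoint is the real API.

const SWAPI_BASE = "https://swapi.dev/api";

// Build the search URL for the /people endpoint (pure, easy to test).
function peopleSearchUrl(query: string): string {
  return `${SWAPI_BASE}/people/?search=${encodeURIComponent(query)}`;
}

interface SwapiPerson {
  name: string;
  films: string[];
}

// Tool handler: take the tool's arguments, return text for the model.
async function searchCharacters(query: string): Promise<string> {
  const res = await fetch(peopleSearchUrl(query));
  if (!res.ok) throw new Error(`SWAPI request failed: ${res.status}`);
  const data = (await res.json()) as { results: SwapiPerson[] };
  return data.results
    .map((p) => `${p.name} (${p.films.length} films)`)
    .join("\n");
}
```

A real MCP server would register a handler like `searchCharacters` as a named tool via the MCP SDK; the fetch-and-format core stays the same.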
Graph RAG visualization platform with interactive knowledge graphs and real-time parameter tuning for local LLMs
A calm AI job-search workspace to prepare your profile, explore promising roles, and turn insights into confident action. Runs as a web app or as a packaged desktop app via Tauri. No server-side dependencies; API keys are entered in-app.
A Chrome extension that enhances Zendesk support workflows through intelligent AI-powered ticket analysis, providing support agents with streamlined, actionable insights.
A simple Retrieval-Augmented Generation system designed for sensitive documents. Uses Ollama for secure embedding and inference. All processing happens locally with no external dependencies.
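The local-only retrieval step of such a system can be sketched as: embed each chunk through a local Ollama server, then rank chunks by cosine similarity to the query embedding. The `/api/embeddings` endpoint shape matches Ollama's documented API; the model name and helper functions are assumptions, not this repository's code.

```typescript
// Embed text via a local Ollama server. The endpoint and request body
// follow Ollama's documented /api/embeddings API; "nomic-embed-text"
// is an assumed model choice.
async function embed(text: string): Promise<number[]> {
  const res = await fetch("http://localhost:11434/api/embeddings", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "nomic-embed-text", prompt: text }),
  });
  const { embedding } = (await res.json()) as { embedding: number[] };
  return embedding;
}

// Cosine similarity between two equal-length vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Rank pre-embedded chunks against a query embedding, best match first.
function rank(query: number[], chunks: { text: string; vec: number[] }[]) {
  return [...chunks].sort((x, y) => cosine(query, y.vec) - cosine(query, x.vec));
}
```

The top-ranked chunks would then be pasted into the prompt sent to the local inference model, so no document text ever leaves the machine.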
Transform natural language questions into interactive dashboards using local LLMs
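One common pattern for this kind of tool is to ask the local model to emit a machine-readable chart specification as JSON. The sketch below uses Ollama's `/api/generate` endpoint (real, including its `format: "json"` option); the prompt wording, model name, and `ChartSpec` shape are assumptions for illustration.

```typescript
// Sketch: turn a natural-language question about tabular data into a
// chart spec by asking a local model for JSON. ChartSpec is an assumed
// schema, not taken from any specific project.

interface ChartSpec {
  type: "bar" | "line" | "pie";
  x: string;
  y: string;
}

// Pure prompt builder, listing the available columns for the model.
function buildPrompt(question: string, columns: string[]): string {
  return [
    `Columns: ${columns.join(", ")}`,
    `Question: ${question}`,
    `Reply with JSON only: {"type": "bar"|"line"|"pie", "x": <column>, "y": <column>}`,
  ].join("\n");
}

async function questionToChart(question: string, columns: string[]): Promise<ChartSpec> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3",            // assumed local model
      prompt: buildPrompt(question, columns),
      format: "json",             // ask Ollama to constrain output to JSON
      stream: false,
    }),
  });
  const { response } = (await res.json()) as { response: string };
  return JSON.parse(response) as ChartSpec;
}
```

The returned spec can then drive any client-side charting library, keeping the data and the model entirely local.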
PDF Assistant provides tools to parse, extract, annotate, summarize, and query PDF documents. Supports OCR, split/merge, conversion and searchable exports to help build document workflows and automation.
ChatGPT in your VS Code.
AI-powered, offline VS Code extension to auto-add comments/docstrings to Python code.
BrainDrive Plugin for managing your Ollama servers and models.
LMWebUI: a privacy-focused web UI for local Ollama models. Run language models locally with a beautiful web interface.
A multi-purpose document database using SQLite.
Workday Copilot: A privacy-focused Chrome extension that automates job applications on Workday using local LLMs. Built with WXT, it intelligently fills forms, manages resume data, and handles navigation - all while keeping your data on your machine.