ScreenSafe: Privacy Protection and on-device PII detection.
Updated Nov 30, 2025 · TypeScript
Vibecode Editor is a blazing-fast, AI-integrated web IDE built entirely in the browser using Next.js App Router, WebContainers, Monaco Editor, and local LLMs via Ollama. It offers real-time code execution, an AI-powered chat assistant, and support for multiple tech stacks — all wrapped in a stunning developer-first UI.
A multi-purpose document database using SQLite.
UNOFFICIAL Simple LM Studio Web UI (Docker)
Run AI conversations locally with Ollama, LM Studio, or Amazon Bedrock
A local-first, privacy-focused autonomous AI companion with Council architecture, long-term memory, Spotify, YouTube, image generation, email, calendar, and MCP integration
🌌 Create a simple MCP Server using the Star Wars API to access characters, planets, and films efficiently for testing and integration purposes.
A Chrome extension that enhances Zendesk support workflows through intelligent AI-powered ticket analysis, providing support agents with streamlined, actionable insights.
An AI-powered mobile app for disaster preparedness and mental wellbeing
AI Coding Assistant CLI for offline enterprise environments - Local LLM platform with Plan & Execute architecture, Supervised Mode, and auto-update system
Graph RAG visualization platform with interactive knowledge graphs and real-time parameter tuning for local LLMs
A calm AI job-search workspace to prepare your profile, explore promising roles, and turn insights into confident action. Runs as a web app or packaged desktop app via Tauri. No server-side deps; keys entered in-app.
CLI tool for evaluating and comparing AI models across Google, Ollama, OpenRouter, LM Studio, and GitHub Models. Features robust error handling, cost tracking, memory-augmented chat, and dynamic test coverage.
The Script Summarizer AI App is cross-platform desktop software (macOS, Windows, Linux) that gives film directors a secure, local environment for analyzing scripts with Large Language Models *offline*.
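Several of the projects above (the Ollama/LM Studio chat clients and the model-evaluation CLI) share one pattern: they talk to a locally running model server over HTTP. A minimal sketch of that pattern against Ollama's `/api/chat` endpoint, assuming the default server address `http://localhost:11434` and an already-pulled model named `llama3` (both are assumptions, not details from any repo above):

```typescript
// Shape of a request body for Ollama's /api/chat endpoint.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

interface ChatRequest {
  model: string;
  messages: ChatMessage[];
  stream: boolean;
}

// Build a non-streaming chat request for a local model.
function buildChatRequest(model: string, prompt: string): ChatRequest {
  return {
    model,
    messages: [{ role: "user", content: prompt }],
    stream: false, // ask for a single JSON response instead of a token stream
  };
}

// Sending the request requires a running Ollama server, so this
// function is defined but not invoked here.
async function chat(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildChatRequest("llama3", prompt)),
  });
  const data = await res.json();
  return data.message.content;
}

console.log(JSON.stringify(buildChatRequest("llama3", "Hello")));
```

Because the server speaks plain JSON over HTTP, the same request shape works from a CLI, a browser extension, or a Tauri desktop shell, which is why so many of these projects can stay fully local.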