Code with AI in VS Code, bring your own AI.
A local, privacy-first résumé builder using LLMs and Markdown to generate ATS-ready DOCX files with Pandoc — no cloud, no tracking.
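The core of such a pipeline is a single Pandoc invocation over the generated Markdown. As a minimal sketch (file names and the Node wrapper are illustrative, not taken from the project), the conversion step might look like this in TypeScript:

```ts
// Minimal sketch: shell out to Pandoc to convert a Markdown résumé to DOCX.
// Assumes Pandoc is installed and on PATH; file names are placeholders.
import { execFile } from "node:child_process";
import { promisify } from "node:util";

const run = promisify(execFile);

async function markdownToDocx(input: string, output: string): Promise<void> {
  // Add "--reference-doc" with a template DOCX to control styling; defaults are used here.
  await run("pandoc", [input, "-f", "markdown", "-o", output]);
}

markdownToDocx("resume.md", "resume.docx")
  .then(() => console.log("Wrote resume.docx"))
  .catch((err) => console.error("Pandoc failed:", err));
```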
Chrome extension to summarize or chat with web pages and local documents using locally running LLMs. Keep all of your data and conversations private. 🔐
Vibecode Editor is a fullstack, web-based IDE built with Next.js and Monaco Editor. It features real-time code execution using WebContainers, AI-powered code suggestions via locally running Ollama models, multi-stack templates, an integrated terminal, and a developer-focused UI for seamless coding in the browser.
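For context on the local-suggestion path such editors describe, here is a sketch of requesting a completion from Ollama's HTTP API in TypeScript (the model name and prompt are assumptions, not the project's actual code):

```ts
// Sketch: ask a locally running Ollama server for a code completion.
// Assumes Ollama is serving on its default port 11434; the model name is an example.
async function suggestCode(prefix: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "codellama",               // any locally pulled model
      prompt: `Complete this code:\n${prefix}`,
      stream: false,                    // return a single JSON response
    }),
  });
  const data = await res.json();
  return data.response;                 // Ollama returns the completion in `response`
}

suggestCode("function add(a: number, b: number) {")
  .then(console.log)
  .catch(console.error);
```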
💻 A simple, practical, and lightweight local AI chat client, written in Tauri 2.0 & Next.js.
A simple, locally hosted Web Search MCP server for use with Local LLMs
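As a rough illustration of what an MCP tool server looks like, here is a minimal sketch using the official TypeScript SDK (the tool name and the search backend are placeholders, and details may vary with SDK versions):

```ts
// Sketch of a stdio MCP server exposing one "web_search" tool for local LLM clients.
// Assumes @modelcontextprotocol/sdk and zod are installed; the search backend is a placeholder.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "web-search", version: "0.1.0" });

server.tool(
  "web_search",
  { query: z.string() },               // tool input schema
  async ({ query }) => {
    // Placeholder: call whatever local search backend the server wraps.
    const results = `No backend wired up; you searched for: ${query}`;
    return { content: [{ type: "text", text: results }] };
  }
);

// MCP-aware chat clients connect to the server over stdio.
await server.connect(new StdioServerTransport());
```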
Run local LLM from Huggingface in React-Native or Expo using onnxruntime.
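A bare-bones sketch of that inference path is below; the model path, input names, and shapes are illustrative, and real LLM inference also needs tokenization and a decoding loop around the forward pass:

```ts
// Sketch: load an ONNX model exported from Hugging Face and run one forward pass
// inside React Native / Expo. Paths, input names, and shapes are placeholders.
import { InferenceSession, Tensor } from "onnxruntime-react-native";

async function runModel(modelPath: string, inputIds: number[]): Promise<Tensor> {
  const session = await InferenceSession.create(modelPath);

  // Many HF-exported decoder models take int64 `input_ids` of shape [batch, seq_len].
  const feeds = {
    input_ids: new Tensor(
      "int64",
      BigInt64Array.from(inputIds.map(BigInt)),
      [1, inputIds.length]
    ),
  };

  const outputs = await session.run(feeds);
  return outputs[session.outputNames[0]]; // e.g. logits for the next token
}
```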
Chrome extension that summarizes web page contents using Gemini Nano. Features include secure in-browser summarization, markdown output, and translation options.
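A rough sketch of in-browser summarization with Chrome's built-in (Gemini Nano) Summarizer API follows; the API is experimental and its surface has changed across Chrome releases, so treat the names and options here as assumptions rather than the extension's actual code:

```ts
// Sketch: summarize page text with Chrome's built-in Summarizer API (Gemini Nano).
// Experimental API; availability and option names may differ by Chrome version.
declare const Summarizer: any; // provided by the browser when the feature is enabled

async function summarizePage(): Promise<string> {
  const text = document.body.innerText;

  const summarizer = await Summarizer.create({
    type: "key-points",   // other types such as "tldr" or "headline" are also defined
    format: "markdown",
    length: "medium",
  });

  return summarizer.summarize(text);
}

summarizePage().then(console.log).catch(console.error);
```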
Open Source Local Data Analysis Assistant.
Workday Copilot: A privacy-focused Chrome extension that automates job applications on Workday using local LLMs. Built with WXT, it intelligently fills forms, manages resume data, and handles navigation - all while keeping your data on your machine.
Structured inference with Llama 2 in your browser
🤖 Visual AI agent workflow automation platform with local LLM integration - build intelligent workflows with a drag-and-drop interface, no cloud dependencies required.
🌌 Advanced AI Coding Assistant for VS Code - a local LLM-powered development companion.
Unlicode is an open-source IDE with local LLMs — unlimited tokens, zero request caps, and complete privacy, so you can build without boundaries.
Yet another (unofficial) Ollama GUI
ScribePal is an Open Source intelligent browser extension that leverages AI to empower your web experience by providing contextual insights, efficient content summarization, and seamless interaction while you browse.
📒 A proof-of-concept app that transcribes lecture recordings into text and generates markdown academic notes using a local LLM
Vibecode Editor is a blazing-fast, AI-integrated web IDE built entirely in the browser using Next.js App Router, WebContainers, Monaco Editor, and local LLMs via Ollama. It offers real-time code execution, an AI-powered chat assistant, and support for multiple tech stacks — all wrapped in a stunning developer-first UI.