Starred repositories
A context-aware AI assistant for your desktop, ready to respond intelligently while seamlessly integrating multiple LLMs and MCP tools.
Capture system loopback audio on macOS 12.3+, Windows and Linux
The headless rich text editor framework for web artisans.
Copy-paste Liquid Glass shader with SVG
Next.js running in Electron: the best way to develop desktop apps
Accelerate local LLM inference and finetuning (LLaMA, Mistral, ChatGLM, Qwen, DeepSeek, Mixtral, Gemma, Phi, MiniCPM, Qwen-VL, MiniCPM-V, etc.) on Intel XPU (e.g., local PC with iGPU and NPU, discr…
A simple CLI tool for interacting with multiple remote Ollama servers, no Ollama installation required
Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforce a JSON schema on the model output at the generation level
MCP server for fetching web page content using a Playwright headless browser.
A Model Context Protocol server for working with JetBrains IDEs: IntelliJ, PyCharm, WebStorm, etc. It also works with Android Studio
A high-performance image compression microservice based on MCP (Model Context Protocol)
A Model Context Protocol server for Excel file manipulation
Model Context Protocol Servers
🐬 DeepChat - A smart assistant that connects powerful AI to your personal world
A lightweight, powerful framework for multi-agent workflows
🍒 Cherry Studio is a desktop client that supports multiple LLM providers.
📚 This is an adapted version of Jina AI's Reader for local deployment using Docker. Convert any URL to an LLM-friendly input with a simple prefix http://127.0.0.1:3000/https://website-to-scrape.com/
Convert any URL to an LLM-friendly input with a simple prefix https://r.jina.ai/
The Open-Source Multimodal AI Agent Stack: Connecting Cutting-Edge AI Models and Agent Infra
Vibe Workflow Platform for Non-technical Creators.
FastSend is a peer-to-peer file transfer tool based on WebRTC, supporting fast directory synchronization and file transfer. Share files securely and efficiently right from the browser.
Type-safe, self-documenting APIs for Next.js
The fastest way to develop full-stack web apps with React & Node.js.