Stars
An open-source AI agent that brings the power of Gemini directly into your terminal.
Autonomous coding agent right in your IDE, capable of creating/editing files, executing commands, using the browser, and more with your permission every step of the way.
Jan is an open source alternative to ChatGPT that runs 100% offline on your computer.
🍒 Cherry Studio is a desktop client that supports multiple LLM providers.
Perplexica is an AI-powered answering engine and an open-source alternative to Perplexity AI.
Invoke is a leading creative engine for Stable Diffusion models, empowering professionals, artists, and enthusiasts to generate and create visual media using the latest AI-driven technologies.
The AI Browser Automation Framework
High-performance In-browser LLM Inference Engine
AGENTS.md — a simple, open format for guiding coding agents
Open source codebase powering the HuggingChat app
Copilot Chat extension for VS Code
Use Hugging Face with JavaScript
Cross-Platform, GPU Accelerated Whisper 🏎️
Run AI models locally on your machine with Node.js bindings for llama.cpp. Enforces a JSON schema on the model output at the generation level.
Example apps for the Apps SDK
Build, enrich, and transform datasets using AI models with no code
JavaScript Gaussian Splatting library.
osanseviero / InstantCoder
Forked from Nutlope/llamacoder. Create apps with Gemini.
A mini CLI search engine for your docs, knowledge bases, meeting notes, and more. Tracks current SOTA approaches while staying fully local.
MCP server to use Hugging Face Spaces, with easy configuration and a Claude Desktop mode.
An infinite canvas image editor using fal.ai
A lightweight Express.js server implementing OpenAI’s Responses API, built on top of Chat Completions and powered by Hugging Face Inference Providers.
Run LLMs in the Browser with MLC / WebLLM ✨
A VSCode extension to use Hugging Face Inference Providers in Copilot Chat
Sample app to get started using the Video API with Sora
Patches to run AI Town (https://github.com/a16z-infra/ai-town) on Hugging Face Spaces.