The all-in-one Desktop & Docker AI application with built-in RAG, AI agents, No-code agent builder, MCP compatibility, and more.
Chat with your notes & see links to related content with AI embeddings. Use local models, or 100+ models via APIs such as Claude, Gemini, ChatGPT & Llama 3.
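For context, a minimal sketch of how "related notes via embeddings" generally works: embed each note, then rank by cosine similarity. The endpoint, the `nomic-embed-text` model name, and the `embed`/`cosine` helpers below are illustrative assumptions (Ollama shown as one local option), not this project's actual code.

```js
// Sketch only: embed notes locally with Ollama and rank them by cosine similarity.
async function embed(text) {
  const res = await fetch("http://localhost:11434/api/embeddings", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "nomic-embed-text", prompt: text }),
  });
  const { embedding } = await res.json();
  return embedding;
}

function cosine(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

const notes = ["Quarterly budget review", "Llama 3 fine-tuning notes", "Grocery list"];
const vectors = await Promise.all(notes.map(embed));
const query = await embed("LLM training");

notes
  .map((note, i) => ({ note, score: cosine(query, vectors[i]) }))
  .sort((a, b) => b.score - a.score)
  .forEach(({ note, score }) => console.log(score.toFixed(3), note));
```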
A Chinese (🇨🇳) ↔ English (🇺🇸) translator for your command line
A Node.js CLI that uses Ollama and LM Studio models (LLaVA, Gemma, Llama, etc.) to intelligently rename files based on their contents
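A rough sketch of the underlying idea: ask a local Ollama server to propose a filename from a file's contents. The model name, prompt wording, and the `suggestName` helper are illustrative assumptions under the default Ollama port, not the repository's actual implementation.

```js
// Sketch only: ask a local Ollama model to suggest a filename for a file's contents.
import { readFile } from "node:fs/promises";

async function suggestName(filePath) {
  const text = (await readFile(filePath, "utf8")).slice(0, 2000); // keep the prompt short
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3",
      prompt: `Suggest a short, descriptive filename (no extension) for this text:\n\n${text}`,
      stream: false,
    }),
  });
  const { response } = await res.json();
  return response.trim();
}

suggestName("./notes.txt").then(console.log);
```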
AWS AI Stack – A ready-to-use, full-stack boilerplate project for building serverless AI applications on AWS
Edgex is a Gen Z-driven AI mentorship project that helps students navigate emotional stress and career confusion with smart, empathetic AI. 💬🎯
LLM Chat is an open-source serverless alternative to ChatGPT.
💻 Hackable UI generator with LLMs. Build quick MVP UIs with HTML, Tailwind, Font Awesome, Placehold.co, and Groq for super-fast generation ⚡
CVortex, an AI-based resume analysis and resume building tool
Prospera is an AI-powered budgeting app that helps users manage their finances by analyzing spending patterns and providing personalized recommendations. It integrates real-time financial data and features a secure chatbot for quick financial insights.
🔀 Bedrock Proxy Endpoint ⇢ Spin up your own custom OpenAI API server endpoint for easy AWS Bedrock inference (using the standard baseUrl and apiKey params)
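For context, pointing a standard OpenAI client at a self-hosted proxy generally looks like the sketch below. The URL, API key, and model ID are placeholders, not values from this project; the `baseURL`/`apiKey` options are the openai npm package's standard way to redirect requests.

```js
// Sketch only: point the official openai npm client at a self-hosted proxy endpoint.
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://your-proxy.example.com/v1", // placeholder proxy endpoint
  apiKey: "your-proxy-api-key",                 // placeholder key
});

const completion = await client.chat.completions.create({
  model: "anthropic.claude-3-sonnet-20240229-v1:0", // a Bedrock model ID, for illustration
  messages: [{ role: "user", content: "Hello from the proxy!" }],
});

console.log(completion.choices[0].message.content);
```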
[Not working now] A free Llama 3.1 405B API server that routes any OpenAI-compatible chat API request to SambaNova AI. Deploy with Cloudflare Workers, Docker, or Node.js
User-friendly LLM chat application built on the OpenRouter API; currently limited to predefined free models
Distress-support chatbot using Llama 3 8B