Starred repositories
Define and run multi-container applications with Docker
Python backtesting library for trading strategies
mcp-use is the easiest way to interact with MCP servers with custom agents
🍒 Cherry Studio is a desktop client that supports multiple LLM providers.
Model Context Protocol Servers
Examples and demonstrations for using the Eino framework
Fully local web research and report writing assistant. This repo is a TypeScript edition of the Ollama Deep Researcher.
SGLang is a fast serving framework for large language models and vision language models.
A repo to illustrate the usage of Transformers in Chinese
AI Native Data App Development framework with AWEL(Agentic Workflow Expression Language) and Agents
Call the AnythingLLM API via Spring Boot.
A trivial programmatic Llama 3 jailbreak. Sorry Zuck!
RAGFlow is a leading open-source Retrieval-Augmented Generation (RAG) engine that fuses cutting-edge RAG with Agent capabilities to create a superior context layer for LLMs
Unlock the power of secure object storage by seamlessly integrating MinIO with KES and HashiCorp Vault as the Key Management Service (KMS). This step-by-step guide provides…
A docker image for Stable Diffusion WebUI Forge
User-friendly AI Interface (Supports Ollama, OpenAI API, ...)
The all-in-one Desktop & Docker AI application with built-in RAG, AI agents, No-code agent builder, MCP compatibility, and more.
An AI knowledge base/agent built with .NET 9, AntBlazor, Semantic Kernel, and Kernel Memory, supporting local offline AI large models. It can run offline without an internet connection. Supports As…
A Docker Compose to run a local ChatGPT-like application using Ollama, Ollama Web UI, Mistral NeMo & DeepSeek R1.
A minimal LLM chat app that runs entirely in your browser
A client that brings together the capabilities of multiple large language models, with rich personalization features. Now supports: OpenAI, Ollama, Google Gemini, iFlytek Spark, Baidu ERNIE Bot, Alibaba Tongyi, Tiangong, Moonshot (Kimi), Zhipu, StepFun, DeepSeek 🎉🎉🎉
UI tester for interactions with Ollama via ollama4j