llama.cpp Rust bindings
-
Updated
Jun 27, 2024 - Rust
A production-grade agentic chatbot server built in Rust with multi-provider LLM support, tool calling, RAG, MCP integration, and advanced research capabilities
Smart HTTP router for local LLMs (Ollama, LM Studio, llama.cpp). Rule-based + LLM-powered routing, health checks, load balancing, Prometheus metrics. Rust-native, zero-overhead.
Ask LLaMA about the image in your clipboard
Let LLMs code on your repos without breaking stuff — sandboxed execution, atomic patches & full audit trail
Download, manage, and chat with LLMs, completely private and local.