TensorZero is an open-source LLMOps platform that unifies an LLM gateway, observability, evaluation, optimization, and experimentation.
⚙️🦀 Build modular and scalable LLM Applications in Rust
Plano is an AI-native proxy and data plane for agentic apps — with built-in orchestration, safety, observability, and smart LLM routing so you stay focused on your agent's core logic.
Bionic is an on-premise replacement for ChatGPT, offering the advantages of Generative AI while maintaining strict data confidentiality.
AICI: Prompts as (Wasm) Programs
Open-source LLM load balancer and serving platform for self-hosting LLMs at scale 🏓🦙 An alternative to projects like llm-d and Docker Model Runner, but with fewer moving parts and simple deployments built around the ggml ecosystem. Runs on CPU and GPU.
Scalable, fast, and disk-friendly vector search in Postgres, the successor of pgvecto.rs.
High-scale LLM gateway, written in Rust. OpenTelemetry-based observability included
Simple, Composable, High-Performance, Safe and Web3 Friendly AI Agents and LazAI Gateway for Everyone
Govern & Secure your AI
🧬 The adaptive model routing system for exploration and exploitation.
A production-ready framework for composing AI agents from declarative TOML configuration, with MCP tool integration, RAG pipelines, and an OpenAI-compatible web API.
Code intelligence for humans, machines, and LLMs: receipts, metrics, and insights from your codebase.
Zero-code LLM security & observability proxy. Real-time prompt injection detection, PII scanning, and cost control for OpenAI-compatible APIs. Built in Rust.
Robot VLM and VLA (Vision-Language-Action) inference API helping you manage multimodal prompts, RAG, and location metadata
Burgonet Gateway is an enterprise LLM gateway that provides secure access and compliance controls for AI systems