traceroot-ai/traceroot

TraceRoot is an open-source observability platform for AI agents: capture traces, then debug with AI that sees your source code and GitHub history.


Features

Agentic Debugging - Root Cause Analysis

  • Tracing: Capture LLM calls, agent actions, and tool usage via an OpenTelemetry-compatible SDK. Intelligently surfaces the traces that matter, with noise filtered and signal prioritized.

  • Agentic Debugging: AI that sees all your traces, connects to a sandbox with your production source code, identifies the exact failing line, and correlates the failure with your GitHub commits, PRs, and issues. BYOK support for any model provider.

Why TraceRoot?

  • Traces alone don't scale.

    As AI agent systems grow more complex, manually sifting through every trace is unsustainable. TraceRoot selectively screens your traces — filtering noise and surfacing only the ones that actually need attention, so you spend time fixing problems, not hunting for them.

  • Debugging AI agent systems is painful.

    Root-causing failures across agent hallucinations, tool call instabilities, and version changes is hard. TraceRoot's AI connects to a sandbox running your production source code, identifies the exact failing line, cross-references your GitHub history (commits, PRs, and open issues), and creates a PR to fix it.

  • Fully open source, no vendor lock-in.

    Both the observability platform and the AI debugging layer are open source. BYOK support for any model provider — OpenAI, Anthropic, Gemini, xAI, DeepSeek, OpenRouter, Kimi, GLM and more.
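
The screening idea described above can be sketched in plain Python. This is only an illustrative heuristic, not TraceRoot's actual scoring logic; the `Trace` fields and the threshold are hypothetical:

```python
# Illustrative sketch: rank traces so errors and latency outliers surface first.
# This is NOT TraceRoot's real algorithm; field names here are hypothetical.
from dataclasses import dataclass

@dataclass
class Trace:
    trace_id: str
    has_error: bool
    latency_ms: float

def screen(traces, latency_threshold_ms=2000.0, top_n=5):
    """Return the top_n traces that most likely need attention."""
    def score(t: Trace) -> float:
        s = 0.0
        if t.has_error:
            s += 10.0  # failures always surface first
        if t.latency_ms > latency_threshold_ms:
            s += t.latency_ms / latency_threshold_ms  # slow outliers
        return s

    # Drop zero-score traces (the noise), highest score first.
    flagged = [t for t in traces if score(t) > 0]
    return sorted(flagged, key=score, reverse=True)[:top_n]

traces = [
    Trace("a", False, 120.0),
    Trace("b", True, 90.0),
    Trace("c", False, 5000.0),
]
print([t.trace_id for t in screen(traces)])  # → ['b', 'c']
```

The point of the sketch is the shape of the workflow: most traces are filtered out entirely, and only the flagged minority is handed to a human or to the debugging agent.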

Documentation

Full documentation available at traceroot.ai/docs.

Getting Started

TraceRoot Cloud

The fastest way to get started. Ample storage and LLM tokens for testing, with no credit card needed. Sign up here!

Self-Hosting

  • Developer mode: Run TraceRoot locally to contribute.

    # Get a copy of the latest repo
    git clone https://github.com/traceroot-ai/traceroot.git
    cd traceroot
    
    # Host the infrastructure in Docker and run the app itself locally
    make dev

    For more details, see CONTRIBUTING.md.

  • Local docker mode: Run TraceRoot locally to test.

    # Get a copy of the latest repo
    git clone https://github.com/traceroot-ai/traceroot.git
    cd traceroot
    
    # Host everything in Docker
    make prod
  • Terraform (AWS): Run TraceRoot on Kubernetes with Helm and Terraform. This is for production hosting and is still experimental.

Integrations

Model Providers

  • OpenAI (Python, JS/TS): Automated instrumentation of the Chat Completions and Responses APIs.
  • Anthropic (Python, JS/TS): Automated instrumentation of the Messages API.
  • Google Gemini (Python): Automated instrumentation via the Google GenAI SDK.

Agent Frameworks

  • LangChain & LangGraph (Python, JS/TS): Automated instrumentation by passing a callback handler to your LangChain application.
  • LangChain DeepAgents (Python, JS/TS): Automated instrumentation by passing a callback handler to your DeepAgents pipeline.
  • CrewAI (Python): Automated instrumentation of multi-agent collaborative workflows and task executions.
  • AutoGen (Python): Automated instrumentation of multi-agent conversations, agent loops, and tool calls.
  • LlamaIndex (Python): Automated instrumentation of RAG pipelines, document ingestion, retrieval, and LLM synthesis.
  • Mastra (JS/TS): Automated instrumentation via the TraceRoot OTLP exporter.
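
Several of these integrations work by handing the framework a callback handler that records each step as a span. A dependency-free sketch of that pattern (the handler class and the `on_start`/`on_end` hook names are hypothetical, not the real SDK API):

```python
# Sketch of the callback-handler pattern used by framework integrations.
# Hook names (on_start/on_end) are hypothetical, not the TraceRoot SDK API.
import time

class RecordingHandler:
    """Collects (name, duration_seconds) tuples for each completed step."""

    def __init__(self):
        self.spans = []
        self._open = {}

    def on_start(self, name: str) -> None:
        self._open[name] = time.monotonic()

    def on_end(self, name: str) -> None:
        started = self._open.pop(name)
        self.spans.append((name, time.monotonic() - started))

def run_pipeline(query: str, handler: RecordingHandler) -> str:
    # A framework would invoke these hooks around each step it executes.
    handler.on_start("llm_call")
    answer = f"echo: {query}"  # stand-in for a real model call
    handler.on_end("llm_call")
    return answer

handler = RecordingHandler()
run_pipeline("hello", handler)
print([name for name, _ in handler.spans])  # → ['llm_call']
```

Because the framework owns the call sites, instrumentation stays automatic: you pass the handler once and every step it executes is recorded without further code changes.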

Don't see your framework or provider? Request an integration.

SDK

  • Python: traceroot-py
  • TypeScript: traceroot-ts

Python SDK Quickstart

pip install traceroot openai

import traceroot
from traceroot import Integration, observe
from openai import OpenAI

traceroot.initialize(integrations=[Integration.OPENAI])
client = OpenAI()

@observe(name="my_agent", type="agent")
def my_agent(query: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": query}],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    my_agent("What's the weather in SF?")
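
Conceptually, `@observe` wraps the function in a named span. A minimal, dependency-free sketch of that mechanism (an illustration only, not the SDK's implementation, which exports via OpenTelemetry):

```python
# Minimal sketch of an @observe-style decorator: record a span around a call.
# Illustrative only; not the real SDK implementation.
import functools
import time

SPANS = []  # collected (name, type, duration_seconds) tuples

def observe(name: str, type: str):
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.monotonic()
            try:
                return fn(*args, **kwargs)
            finally:
                # Record the span even if the wrapped call raises.
                SPANS.append((name, type, time.monotonic() - start))
        return wrapper
    return decorator

@observe(name="my_agent", type="agent")
def my_agent(query: str) -> str:
    return f"answered: {query}"

print(my_agent("ping"))  # → answered: ping
print(SPANS[0][:2])      # → ('my_agent', 'agent')
```

Recording in a `finally` block is the key design choice: failed calls still produce spans, which is exactly what a debugger needs to see.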

TypeScript SDK Quickstart

npm install @traceroot-ai/traceroot openai

import OpenAI from 'openai';
import { TraceRoot, observe } from '@traceroot-ai/traceroot';

TraceRoot.initialize({ instrumentModules: { openAI: OpenAI } });
const openai = new OpenAI();

const myAgent = observe({ name: 'my_agent', type: 'agent' }, async (query: string) => {
  const response = await openai.chat.completions.create({
    model: 'gpt-4o',
    messages: [{ role: 'user', content: query }],
  });
  return response.choices[0].message.content;
});

async function main() {
  try {
    await myAgent("What's the weather in SF?");
  } finally {
    await TraceRoot.shutdown();
  }
}

main().catch(console.error);

Security & Privacy

Your data security and privacy are our top priorities. Learn more in our Security and Privacy documentation.

Community

Special thanks to the pi-mono project, which powers the foundation of our agentic debugging runtime!

Contributing 🤝: If you're interested in contributing, you can check out our guide here. All types of help are appreciated :)

Support 💬: If you need support, we're typically most responsive on our Discord channel, but feel free to email us at founders@traceroot.ai too!

License

This project is licensed under Apache 2.0 with additional Enterprise features.

Contributors