traceAI Logo

traceAI

OpenTelemetry-native instrumentation for AI applications
Standardized observability across LLMs, agents, and frameworks

License Python TypeScript OpenTelemetry

Documentation • Examples • Slack • PyPI Packages • npm Packages


🚀 What is traceAI?

traceAI provides drop-in OpenTelemetry instrumentation for popular AI frameworks and LLM providers. It automatically captures traces, spans, and attributes from your AI workflows, whether you're using OpenAI, Anthropic, LangChain, LlamaIndex, or 20+ other frameworks.

  • Zero-config tracing for OpenAI, Anthropic, LangChain, LlamaIndex, and more
  • OpenTelemetry-native: works with any OTel-compatible backend (Jaeger, Datadog, Future AGI, etc.)
  • Semantic conventions for LLM calls, agents, tools, and retrieval
  • Python + TypeScript support with consistent APIs


✨ Key Features

| Feature | Description |
|---|---|
| 🎯 Standardized Tracing | Maps AI workflows to consistent OpenTelemetry spans & attributes |
| 🔌 Zero-Config Setup | Drop-in instrumentation with minimal code changes |
| 🌐 Multi-Framework | 20+ integrations across Python & TypeScript |
| 📊 Vendor Agnostic | Works with any OpenTelemetry-compatible backend |
| 🔍 Rich Context | Captures prompts, completions, tokens, model params, tool calls, and more |
| ⚡ Production Ready | Async support, streaming, error handling, and performance optimization |

🎯 Quickstart

Python Quickstart

1. Install

pip install traceai-openai

2. Instrument your application

import os
from fi_instrumentation import register
from fi_instrumentation.fi_types import ProjectType
from traceai_openai import OpenAIInstrumentor
import openai

# Set up environment variables
os.environ["FI_API_KEY"] = "<your-api-key>"
os.environ["FI_SECRET_KEY"] = "<your-secret-key>"
os.environ["OPENAI_API_KEY"] = "<your-openai-key>"

# Register tracer provider
trace_provider = register(
    project_type=ProjectType.OBSERVE,
    project_name="my_ai_app"
)

# Instrument OpenAI
OpenAIInstrumentor().instrument(tracer_provider=trace_provider)

# Use OpenAI as normal - tracing happens automatically!
response = openai.chat.completions.create(
    model="gpt-4.1",
    messages=[{"role": "user", "content": "Hello!"}]
)

💡 Tip: Swap traceai-openai for any supported framework (e.g., traceai-langchain, traceai-anthropic)
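For example, swapping in the Anthropic integration might look like the sketch below. The `traceai-anthropic` package name comes from the package table later in this README; the `AnthropicInstrumentor` class name is an assumption modeled on the OpenAI pattern above.

```python
# Sketch only: assumes traceai-anthropic mirrors the traceai-openai pattern;
# the AnthropicInstrumentor class name is an assumption, not confirmed here.
from fi_instrumentation import register
from fi_instrumentation.fi_types import ProjectType
from traceai_anthropic import AnthropicInstrumentor  # pip install traceai-anthropic

trace_provider = register(
    project_type=ProjectType.OBSERVE,
    project_name="my_ai_app",
)
AnthropicInstrumentor().instrument(tracer_provider=trace_provider)
# Anthropic SDK calls made after this point are traced automatically.
```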


TypeScript Quickstart

1. Install

npm install @traceai/openai @traceai/fi-core

2. Instrument your application

import { register, ProjectType } from "@traceai/fi-core";
import { OpenAIInstrumentation } from "@traceai/openai";
import { registerInstrumentations } from "@opentelemetry/instrumentation";
import OpenAI from "openai";

// Register tracer provider
const tracerProvider = register({
  projectName: "my_ai_app",
  projectType: ProjectType.OBSERVE,
});

// Register OpenAI instrumentation (before creating client!)
registerInstrumentations({
  tracerProvider,
  instrumentations: [new OpenAIInstrumentation()],
});

// Use OpenAI as normal - tracing happens automatically!
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

const response = await openai.chat.completions.create({
  model: "gpt-4.1",
  messages: [{ role: "user", content: "Hello!" }],
});

💡 Tip: Works with Anthropic, LangChain, Vercel AI SDK, and more TypeScript frameworks


📦 Supported Frameworks

Python

| Package | Description | Version |
|---|---|---|
| traceAI-openai | traceAI Instrumentation for OpenAI | PyPI |
| traceAI-anthropic | traceAI Instrumentation for Anthropic | PyPI |
| traceAI-llamaindex | traceAI Instrumentation for LlamaIndex | PyPI |
| traceAI-langchain | traceAI Instrumentation for LangChain | PyPI |
| traceAI-mistralai | traceAI Instrumentation for MistralAI | PyPI |
| traceAI-vertexai | traceAI Instrumentation for VertexAI | PyPI |
| traceAI-crewai | traceAI Instrumentation for CrewAI | PyPI |
| traceAI-haystack | traceAI Instrumentation for Haystack | PyPI |
| traceAI-litellm | traceAI Instrumentation for liteLLM | PyPI |
| traceAI-groq | traceAI Instrumentation for Groq | PyPI |
| traceAI-autogen | traceAI Instrumentation for Autogen | PyPI |
| traceAI-guardrails | traceAI Instrumentation for Guardrails | PyPI |
| traceAI-openai-agents | traceAI Instrumentation for OpenAI Agents | PyPI |
| traceAI-smolagents | traceAI Instrumentation for SmolAgents | PyPI |
| traceAI-dspy | traceAI Instrumentation for DSPy | PyPI |
| traceAI-bedrock | traceAI Instrumentation for AWS Bedrock | PyPI |
| traceAI-instructor | traceAI Instrumentation for Instructor | PyPI |
| traceAI-google-genai | traceAI Instrumentation for Google Generative AI | PyPI |
| traceAI-google-adk | traceAI Instrumentation for Google ADK | PyPI |
| traceAI-pipecat | traceAI Instrumentation for Pipecat | PyPI |
| traceAI-portkey | traceAI Instrumentation for Portkey | PyPI |
| traceAI-mcp | traceAI Instrumentation for Model Context Protocol | PyPI |

TypeScript

| Package | Description | Version |
|---|---|---|
| @traceai/openai | traceAI Instrumentation for OpenAI | npm |
| @traceai/anthropic | traceAI Instrumentation for Anthropic | npm |
| @traceai/langchain | traceAI Instrumentation for LangChain | npm |
| @traceai/llamaindex | traceAI Instrumentation for LlamaIndex | npm |
| @traceai/bedrock | traceAI Instrumentation for AWS Bedrock | npm |
| @traceai/vercel | traceAI Instrumentation for Vercel AI SDK | npm |
| @traceai/mastra | traceAI Instrumentation for Mastra | npm |
| @traceai/mcp | traceAI Instrumentation for Model Context Protocol | npm |

🔧 Compatibility Matrix

| Category | Framework | Python | TypeScript |
|---|---|---|---|
| LLM Providers | OpenAI | ✅ | ✅ |
| | Anthropic | ✅ | ✅ |
| | AWS Bedrock | ✅ | ✅ |
| | Google Vertex AI | ✅ | - |
| | Google Generative AI | ✅ | - |
| | Mistral AI | ✅ | - |
| | Groq | ✅ | - |
| | LiteLLM | ✅ | - |
| Agent Frameworks | LangChain | ✅ | ✅ |
| | LlamaIndex | ✅ | ✅ |
| | CrewAI | ✅ | - |
| | AutoGen | ✅ | - |
| | OpenAI Agents | ✅ | - |
| | Smol Agents | ✅ | - |
| | Mastra | - | ✅ |
| Tools & Libraries | Haystack | ✅ | - |
| | DSPy | ✅ | - |
| | Guardrails AI | ✅ | - |
| | Instructor | ✅ | - |
| | Portkey | ✅ | - |
| | Pipecat | ✅ | - |
| | Vercel AI SDK | - | ✅ |
| Standards | Model Context Protocol (MCP) | ✅ | ✅ |

Legend: ✅ Supported | - Not yet available


πŸ—οΈ Architecture

traceAI is built on top of OpenTelemetry and follows standard OTel instrumentation patterns. This means you get:

🔌 Full OpenTelemetry Compatibility

  • Works with any OTel-compatible backend
  • Standard OTLP exporters (HTTP/gRPC)
  • Compatible with existing OTel setups

⚙️ Bring Your Own Configuration

You can use traceAI with your own OpenTelemetry setup:

Python: Custom TracerProvider & Exporters
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from traceai_openai import OpenAIInstrumentor

# Set up your own tracer provider
tracer_provider = TracerProvider()
trace.set_tracer_provider(tracer_provider)

# Add custom exporters (example with Future AGI)
# HTTP endpoint
otlp_exporter = OTLPSpanExporter(
    endpoint="https://api.futureagi.com/tracer/v1/traces",
    headers={
        "X-API-KEY": "your-api-key",
        "X-SECRET-KEY": "your-secret-key"
    }
)
# For gRPC transport, use the OTLPSpanExporter from
# opentelemetry.exporter.otlp.proto.grpc.trace_exporter instead
tracer_provider.add_span_processor(BatchSpanProcessor(otlp_exporter))

# Instrument with traceAI
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)
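The gRPC comment in the example glosses over a detail: the gRPC exporter lives in a separate package from the HTTP one. A sketch is below; whether the Future AGI backend accepts OTLP over gRPC at that host is assumed from the comment above, and the header names are carried over from the HTTP example.

```python
# gRPC variant: requires the opentelemetry-exporter-otlp-proto-grpc package,
# which provides a same-named OTLPSpanExporter under a different module path.
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor

tracer_provider = TracerProvider()
exporter = OTLPSpanExporter(
    endpoint="grpc.futureagi.com:443",  # gRPC endpoints are host:port, no scheme
    # gRPC metadata keys are lowercased; header names follow the HTTP example
    headers=(("x-api-key", "your-api-key"), ("x-secret-key", "your-secret-key")),
)
tracer_provider.add_span_processor(BatchSpanProcessor(exporter))
```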

TypeScript: Custom TracerProvider, Span Processors & Headers
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";
import { BatchSpanProcessor } from "@opentelemetry/sdk-trace-base";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-http";
import { Resource } from "@opentelemetry/resources";
import { registerInstrumentations } from "@opentelemetry/instrumentation";
import { OpenAIInstrumentation } from "@traceai/openai";

// Create custom tracer provider
const provider = new NodeTracerProvider({
  resource: new Resource({
    "service.name": "my-ai-service",
  }),
});

// Add custom OTLP exporter with headers (example with Future AGI)
// HTTP endpoint
const exporter = new OTLPTraceExporter({
  url: "https://api.futureagi.com/tracer/v1/traces",
  headers: {
    "X-API-KEY": process.env.FI_API_KEY!,
    "X-SECRET-KEY": process.env.FI_SECRET_KEY!,
  },
});
// For gRPC transport, use the exporter from @opentelemetry/exporter-trace-otlp-grpc instead

// Add span processor
provider.addSpanProcessor(new BatchSpanProcessor(exporter));
provider.register();

// Register traceAI instrumentation
registerInstrumentations({
  tracerProvider: provider,
  instrumentations: [new OpenAIInstrumentation()],
});

📊 What Gets Captured

traceAI automatically captures rich telemetry data:

  • Prompts & Completions: Full request/response content
  • Token Usage: Input, output, and total tokens
  • Model Parameters: Temperature, top_p, max_tokens, etc.
  • Tool Calls: Function/tool names, arguments, and results
  • Streaming: Individual chunks with delta tracking
  • Errors: Detailed error context and stack traces
  • Timing: Latency at each step of the AI workflow

All data follows OpenTelemetry Semantic Conventions for GenAI.
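As a concrete illustration, the attributes on a single chat-completion span might look roughly like this. The attribute names below follow the OpenTelemetry GenAI semantic conventions; the exact keys and values traceAI emits are not confirmed here.

```python
# Illustrative only: attribute names follow the OpenTelemetry GenAI semantic
# conventions; the exact keys traceAI emits may differ.
example_span_attributes = {
    "gen_ai.system": "openai",
    "gen_ai.request.model": "gpt-4.1",
    "gen_ai.request.temperature": 0.7,
    "gen_ai.usage.input_tokens": 12,
    "gen_ai.usage.output_tokens": 48,
}

# Token totals can be derived from the usage attributes.
total_tokens = (
    example_span_attributes["gen_ai.usage.input_tokens"]
    + example_span_attributes["gen_ai.usage.output_tokens"]
)
print(total_tokens)  # 60
```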

🤝 Contributing

We welcome contributions from the community!

📖 Read our Contributing Guide for detailed instructions on:

  • Setting up your development environment (Python & TypeScript)
  • Running tests and code quality checks
  • Submitting pull requests
  • Adding new framework integrations

Quick Start:

  1. Fork the repository
  2. Create a feature branch (git checkout -b feature/amazing-feature)
  3. Commit your changes (git commit -m 'Add amazing feature')
  4. Push to the branch (git push origin feature/amazing-feature)
  5. Open a Pull Request

Found a bug? Please open an issue with:

  • Framework version
  • traceAI version
  • Minimal reproduction code
  • Expected vs actual behavior

Request a feature? Please open an issue with:

  • Use case or problem you're trying to solve
  • Proposed solution or feature description
  • Any relevant examples or mockups
  • Priority level (nice-to-have vs critical)

📚 Resources

| Resource | Description |
|---|---|
| 🌐 Website | Learn more about Future AGI |
| 📖 Documentation | Complete guides and API reference |
| 👨‍🍳 Cookbooks | Step-by-step implementation examples |
| 📝 Changelog | All release notes and updates |
| 🤝 Contributing Guide | How to contribute to traceAI |
| 💬 Slack | Join our community |
| 🐛 Issues | Report bugs or request features |

🌍 Connect With Us

Website LinkedIn Twitter Reddit Substack


Built with ❤️ by the Future AGI team

⭐ Star us on GitHub | 🐛 Report Bug | 💡 Request Feature
