Tracing

trajectory lets you trace functions, tools, and LLM calls with minimal changes to your codebase.

Create a Tracer

tracer.py
import os
from trajectory import Tracer, wrap
from openai import OpenAI

tracer = Tracer(
    api_key=os.getenv("TRAJECTORY_API_KEY") or os.getenv("JUDGMENT_API_KEY"),
)

# Wrap your LLM client to automatically track API calls
client = wrap(OpenAI(api_key=os.getenv("OPENAI_API_KEY")))

Trace tools and functions

Use @tracer.observe(span_type="...") to trace any callable. Mark tools with span_type="tool" and business logic with span_type="function".
tools.py
from trajectory import Tracer

tracer = Tracer(project_name="tools_demo")

@tracer.observe(span_type="tool")
def get_current_time() -> str:
    # datetime.utcnow() is deprecated; use a timezone-aware timestamp
    from datetime import datetime, timezone
    return datetime.now(timezone.utc).isoformat().replace("+00:00", "Z")

@tracer.observe(span_type="tool")
def add_numbers(a: float, b: float) -> float:
    return a + b

@tracer.observe(span_type="function")
def format_question(q: str) -> str:
    return f"Question: {q}"
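Traced functions nest naturally: when one observed callable invokes another, the inner call shows up as a child span. A minimal sketch composing the tools above under a parent span — the answer function and the broad except fallback are illustrative additions so the snippet runs standalone, not part of the SDK:

```python
try:
    from trajectory import Tracer
    observe = Tracer(project_name="tools_demo").observe
except Exception:  # no-op stand-in so the sketch runs without the SDK
    def observe(span_type=None):
        def decorator(fn):
            return fn
        return decorator

@observe(span_type="tool")
def add_numbers(a: float, b: float) -> float:
    return a + b

@observe(span_type="function")
def format_question(q: str) -> str:
    return f"Question: {q}"

@observe(span_type="function")
def answer(q: str) -> str:
    # Both calls below become child spans of the `answer` span
    return f"{format_question(q)} Answer: {add_numbers(2, 3)}"
```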

Create spans and log data

Functions decorated with @tracer.observe create spans automatically; once a trace completes, you can view it in the 'Traces' section of your dashboard. If you are running in an async context such as FastAPI, see the FastAPI integration below, as the setup differs slightly.

Examples

FastAPI Integration

Instrument a FastAPI chatbot with trajectory tracing.

LangGraph Integration

Trace LangGraph nodes and runs using JudgevalCallbackHandler.