User Metrics
We support tracking user metrics to judge the real-world performance of your AI agent. For a coding agent, these metrics could include the number of accepted code changes. You can record any custom metric you want to track using the `trace.log_metric()` function.
These metrics show up on each trace in the dashboard, and an aggregate view appears on the main dashboard.
Log metrics
basic_metrics.py
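The snippet below is a minimal sketch of logging a custom metric on a trace. The package name `your_sdk`, the `Tracer()` constructor, and the `tracer.trace(...)` context manager are illustrative assumptions; check your SDK for the actual entry points. `trace.log_metric(...)` is the call described above.

```python
from your_sdk import Tracer  # hypothetical import; substitute your SDK's package

tracer = Tracer()

# Open a trace for one agent operation; the context-manager form is assumed.
with tracer.trace("code_change") as trace:
    accepted = True  # stand-in for your acceptance check
    # Numeric values keep the dashboard aggregations meaningful.
    trace.log_metric("code_change_accepted", 1 if accepted else 0)
    trace.log_metric("tool_latency_ms", 182.4)
```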
Attribute to conversations and users
Wrap your operations in a conversation context to consistently associate metrics with a conversation and a specific user. The `user_id` should be your application's stable identifier (for example, `request.user.id`).
attribute_to_user.py
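Here is a sketch of the pattern. It assumes the SDK exposes a `tracer.conversation(...)` context manager; that name and its parameters are assumptions for illustration, while the `user_id` attribution is what the text above describes.

```python
from your_sdk import Tracer  # hypothetical import; substitute your SDK's package

tracer = Tracer()

def run_agent(message: str) -> str:
    return f"echo: {message}"  # stand-in for your agent call

def handle_message(request, message: str) -> str:
    # Everything logged inside this block is attributed to one conversation
    # and one stable application user.
    with tracer.conversation(
        conversation_id=str(request.session_id),  # stable per-conversation id
        user_id=str(request.user.id),             # your app's stable user identifier
    ):
        with tracer.trace("agent_turn") as trace:
            reply = run_agent(message)
            trace.log_metric("response_chars", len(reply))
        return reply
```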
- Use a stable `user_id` (e.g., `request.user.id`) to attribute metrics and traces to real users of your agent.
- In async apps (e.g., FastAPI), set `trace_across_async_contexts=True` on your `Tracer` and on `wrap(...)` so conversation context (and thus user attribution) propagates correctly.
Example: FastAPI (metrics-only excerpt)
fastapi_metrics.py
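A minimal FastAPI sketch of the async setup described above. `trace_across_async_contexts=True` on the `Tracer` and on `wrap(...)` comes from the note in the previous list; the package name `your_sdk`, the `wrap` signature, and the `tracer.conversation(...)` / `tracer.trace(...)` context managers are assumptions for illustration.

```python
from fastapi import FastAPI

from your_sdk import Tracer, wrap  # hypothetical import; substitute your SDK's package

app = FastAPI()

# Enable async propagation so conversation context (and user attribution)
# survives FastAPI's await points.
tracer = Tracer(trace_across_async_contexts=True)

async def run_agent(message: str) -> str:
    return f"echo: {message}"  # stand-in for your agent call

# Assumed wrap signature; the flag mirrors the Tracer setting above.
agent = wrap(run_agent, trace_across_async_contexts=True)

@app.post("/chat")
async def chat(payload: dict):
    # Attribute this request's metrics to one conversation and a stable user id.
    with tracer.conversation(
        conversation_id=str(payload["session_id"]),
        user_id=str(payload["user_id"]),  # your app's stable identifier
    ):
        with tracer.trace("chat_turn") as trace:
            reply = await agent(payload["message"])
            trace.log_metric("response_chars", len(reply))
    return {"reply": reply}
```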
Best practices
- Name metrics clearly: e.g., `tool_latency_ms`, `retrieval_hits`, `tokens_prompt`.
- Keep value types numeric; use `tags` and `properties` for context.
- Avoid high-cardinality tag values (e.g., entire prompts) to keep analytics fast.
- Log only what you’ll monitor or analyze later (alerting, funnels, cohort analysis).
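To make these conventions concrete, a short sketch reusing the hypothetical `tracer` from the examples above; the `tags=` and `properties=` keyword arguments are assumptions based on the list, not a confirmed signature.

```python
with tracer.trace("retrieval_step") as trace:  # tracer from the sketches above
    # Clear names, numeric values, context in tags/properties.
    trace.log_metric("tool_latency_ms", 182.4, tags={"tool": "search"})
    trace.log_metric("retrieval_hits", 3, properties={"query_type": "semantic"})
    # Avoid: trace.log_metric("latency", "fast")  -> non-numeric value
    # Avoid: tags={"prompt": full_prompt}         -> high-cardinality tag value
```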

