Tracing API
Tracy provides a comprehensive tracing API designed to capture the execution flow of AI-powered applications. It implements the OpenTelemetry Generative AI Semantic Conventions, ensuring that your traces are compatible with industry standards and various observability backends.
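To make the semantic-conventions claim concrete, the snippet below lists a few representative attribute names from the OpenTelemetry Generative AI Semantic Conventions that a traced LLM call's span might carry. The attribute names come from the OpenTelemetry specification; the exact set and values Tracy records are an assumption here, shown only for illustration.

```kotlin
// Representative OpenTelemetry GenAI semantic-convention attributes.
// The exact attributes Tracy emits per span may differ.
val genAiSpanAttributes: Map<String, Any> = mapOf(
    "gen_ai.system" to "openai",          // which LLM provider served the call
    "gen_ai.request.model" to "gpt-4o",   // model requested by the client
    "gen_ai.usage.input_tokens" to 42,    // tokens consumed by the prompt
    "gen_ai.usage.output_tokens" to 128   // tokens produced in the completion
)

fun main() {
    genAiSpanAttributes.forEach { (key, value) -> println("$key = $value") }
}
```

Because the attribute names follow the shared convention, any OpenTelemetry-compatible backend can group, filter, and aggregate these spans without custom mapping.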
Core Concepts
The Tracing API is divided into three main categories:
- LLM Client Autotracing: Automatically capture spans for all calls made via supported LLM clients (OpenAI, Anthropic, Gemini, etc.).
- Function Tracing (Annotation-based): Use the `@Trace` annotation to trace any Kotlin function, capturing its inputs, outputs, and duration.
- Manual Tracing: Manually create and manage spans using the `withSpan` function for fine-grained control or for use in Java.
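A minimal sketch of what the annotation-based and manual styles might look like side by side. The names `@Trace` and `withSpan` come from this API, but the signatures and the stand-in definitions below are assumptions made so the example is self-contained; Tracy's real declarations will differ.

```kotlin
// Stand-in declaration so the sketch compiles on its own;
// Tracy provides the real @Trace annotation.
@Target(AnnotationTarget.FUNCTION)
annotation class Trace

// A block-based helper in the style of withSpan: opens a span, runs the
// block, and closes the span even if the block throws (sketch only).
fun <T> withSpan(name: String, block: () -> T): T {
    println("span start: $name")
    try {
        return block()
    } finally {
        println("span end: $name")
    }
}

// Annotation-based tracing: inputs, outputs, and duration would be
// captured automatically for an annotated function like this one.
@Trace
fun summarize(text: String): String = text.take(20)

fun main() {
    // Manual tracing: the caller controls the span boundaries explicitly.
    val result = withSpan("summarize-call") { summarize("A long document about tracing") }
    println(result)
}
```

The annotation style keeps application code clean when one span per function is enough; the block style is the fallback when span boundaries do not align with function boundaries, or when calling from Java.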
Key Components
- `TracingManager`: The central point for configuring and controlling tracing at runtime.
- `instrument()`: A function used to wrap LLM clients with tracing capabilities. Multiple overloads exist for different LLM clients; for example, see `instrument` for the `OpenAIClient`.
- `@Trace`: An annotation for automatic instrumentation of Kotlin functions.
- `withSpan`: A block-based API for manual span management.
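One plausible shape for `instrument()` is a decorator that wraps a client so every call is recorded as a span. The client interface, the fake implementation, and the `instrument` body below are all stand-ins invented for illustration; Tracy's real overloads (e.g., for `OpenAIClient`) will look different.

```kotlin
// Stand-in client interface; Tracy targets real clients such as OpenAIClient.
interface ChatClient {
    fun complete(prompt: String): String
}

// Fake client so the sketch runs without network access.
class FakeClient : ChatClient {
    override fun complete(prompt: String) = "echo: $prompt"
}

// A sketch of instrument() as a decorator: same interface in and out,
// with a span opened around each call (assumed shape, not Tracy's code).
fun instrument(client: ChatClient): ChatClient = object : ChatClient {
    override fun complete(prompt: String): String {
        println("span start: chat.complete")
        try {
            return client.complete(prompt)
        } finally {
            println("span end: chat.complete")
        }
    }
}

fun main() {
    // The traced client is a drop-in replacement for the original.
    val traced = instrument(FakeClient())
    println(traced.complete("hi"))
}
```

The decorator shape explains why autotracing requires no changes at call sites: the instrumented client satisfies the same interface as the original, so only the construction site changes.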
See the following sections for more details: