Tracing API

Tracy provides a comprehensive tracing API designed to capture the execution flow of AI-powered applications. It implements the OpenTelemetry Generative AI Semantic Conventions, ensuring that your traces are compatible with industry standards and various observability backends.
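Concretely, spans emitted under these conventions carry well-known attribute keys. A few of the keys defined by the OpenTelemetry GenAI semantic conventions are shown below; the values are illustrative placeholders, not Tracy defaults:

```kotlin
// Attribute keys from the OpenTelemetry GenAI semantic conventions.
// The values here are example data only.
val exampleAttributes = mapOf(
    "gen_ai.operation.name" to "chat",       // kind of GenAI operation
    "gen_ai.request.model" to "gpt-4o",      // model requested by the caller
    "gen_ai.usage.input_tokens" to 42,       // prompt tokens consumed
    "gen_ai.usage.output_tokens" to 128      // completion tokens produced
)
```

Because the keys are standardized, any OTel-compatible backend can aggregate and query these spans without Tracy-specific configuration.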

Core Concepts

The Tracing API is divided into three main categories:

  1. LLM Client Autotracing: Automatically capture spans for all calls made via supported LLM clients (OpenAI, Anthropic, Gemini, etc.).
  2. Function Tracing (Annotation-based): Use the @Trace annotation to trace any Kotlin function, capturing its inputs, outputs, and duration.
  3. Manual Tracing: Manually create and manage spans with the withSpan function, for fine-grained control or for use from Java.
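The annotation-based and manual styles can be sketched as follows. The stand-in @Trace annotation and withSpan function below are illustrative approximations so the snippet is self-contained; Tracy's actual signatures may differ:

```kotlin
// Hypothetical stand-ins -- Tracy's real @Trace and withSpan may differ.
@Target(AnnotationTarget.FUNCTION)
annotation class Trace

class Span(val name: String) {
    val attributes = mutableMapOf<String, Any>()
    fun setAttribute(key: String, value: Any) { attributes[key] = value }
}

// Minimal block-based span: times the block and reports when it completes.
inline fun <T> withSpan(name: String, block: (Span) -> T): T {
    val span = Span(name)
    val start = System.nanoTime()
    try {
        return block(span)
    } finally {
        val ms = (System.nanoTime() - start) / 1_000_000
        println("span=$name tookMs=$ms attrs=${span.attributes}")
    }
}

// Annotation-based style: in Tracy, this function's inputs, output,
// and duration would be captured automatically.
@Trace
fun summarize(text: String): String = text.take(20)

// Manual style: an explicit span around an arbitrary block of work.
fun main() {
    val summary = withSpan("summarize") { span ->
        span.setAttribute("input.chars", 34)
        summarize("A long document about tracing APIs")
    }
    println(summary)
}
```

The trade-off mirrors most tracing libraries: @Trace is less code for whole-function spans, while withSpan lets you scope a span to part of a function and attach attributes mid-flight.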

Key Components

  • TracingManager: The central point for configuring and controlling tracing at runtime.
  • instrument(): Wraps an LLM client with tracing; overloads exist for each supported client (e.g., see instrument for the OpenAIClient).
  • @Trace: An annotation for automatic instrumentation of Kotlin functions.
  • withSpan: A block-based API for manual span management.
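To show how client autotracing fits together, here is a decorator-style sketch of what instrument() could look like. The ChatClient interface and all names below are assumptions for illustration, not Tracy's actual types:

```kotlin
// Hypothetical minimal client interface; Tracy's overloads target real
// clients such as OpenAIClient, so treat these names as placeholders.
interface ChatClient {
    fun complete(prompt: String): String
}

// Decorator-style instrumentation: every call is wrapped in a span-like
// record of timing and payload sizes before delegating to the real client.
fun instrument(client: ChatClient): ChatClient = object : ChatClient {
    override fun complete(prompt: String): String {
        val start = System.nanoTime()
        val response = client.complete(prompt)
        val ms = (System.nanoTime() - start) / 1_000_000
        println("span=chat.complete tookMs=$ms " +
                "promptChars=${prompt.length} responseChars=${response.length}")
        return response
    }
}

fun main() {
    // A fake client standing in for a real LLM client.
    val fake = object : ChatClient {
        override fun complete(prompt: String) = "echo: $prompt"
    }
    val traced = instrument(fake)  // all calls through `traced` now emit spans
    println(traced.complete("hi"))
}
```

Because instrument() returns the same interface it receives, the traced client is a drop-in replacement: calling code does not change, which is what makes autotracing transparent.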

See the following sections for more details: