Get Started
This guide will help you set up Tracy in your Kotlin project and create your first trace.
Requirements
- Kotlin: 2.0.0 through 2.3.0
- Java: 17+
- OpenTelemetry (if already set up in your project): 1.2+
Supported LLM Client SDKs
- OpenAI SDK: 1.*–4.*
- Anthropic SDK: 1.*–2.*
- Gemini SDK: 1.8.*–1.38.*
Installation
1. Configure Repositories
Add the Maven repository that hosts Tracy artifacts to your project. In Gradle, declare it either in settings.gradle.kts / settings.gradle (inside dependencyResolutionManagement) or in build.gradle.kts / build.gradle (inside the repositories block).
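Because Tracy artifacts are published to Maven Central, declaring mavenCentral() is sufficient. A minimal settings.gradle.kts sketch (adjust to your project's existing repository setup):

```kotlin
// settings.gradle.kts — minimal sketch; merge with your existing configuration
dependencyResolutionManagement {
    repositories {
        // Tracy artifacts are published to Maven Central
        mavenCentral()
    }
}
```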
Note: No additional repository configuration is required for Maven. Tracy artifacts are published to Maven Central, which is used by default.
2. Apply the Plugin and Add Dependencies
build.gradle.kts
plugins {
id("org.jetbrains.ai.tracy") version "0.1.0"
}
dependencies {
// Core module (required)
implementation("org.jetbrains.ai.tracy:tracy-core:0.1.0")
// Client-specific auto-tracing (add the ones you need)
implementation("org.jetbrains.ai.tracy:tracy-openai:0.1.0")
implementation("org.jetbrains.ai.tracy:tracy-anthropic:0.1.0")
implementation("org.jetbrains.ai.tracy:tracy-gemini:0.1.0")
implementation("org.jetbrains.ai.tracy:tracy-ktor:0.1.0")
}
build.gradle
plugins {
id 'org.jetbrains.ai.tracy' version '0.1.0'
}
dependencies {
// Core module (required)
implementation 'org.jetbrains.ai.tracy:tracy-core:0.1.0'
// Client-specific auto-tracing (add the ones you need)
implementation 'org.jetbrains.ai.tracy:tracy-openai:0.1.0'
implementation 'org.jetbrains.ai.tracy:tracy-anthropic:0.1.0'
implementation 'org.jetbrains.ai.tracy:tracy-gemini:0.1.0'
implementation 'org.jetbrains.ai.tracy:tracy-ktor:0.1.0'
}
pom.xml
<plugins>
<plugin>
<groupId>org.jetbrains.kotlin</groupId>
<artifactId>kotlin-maven-plugin</artifactId>
<configuration>
<jvmTarget>19</jvmTarget>
</configuration>
<version>${kotlin.version}</version>
<executions>
<execution>
<goals>
<goal>compile</goal>
</goals>
</execution>
</executions>
<dependencies>
<dependency>
<groupId>org.jetbrains.ai.tracy</groupId>
<!-- Match your Kotlin version (e.g., 2.1.0, 2.0.20) -->
<artifactId>tracy-compiler-plugin-2.1.0-jvm</artifactId>
<version>0.1.0</version>
</dependency>
</dependencies>
</plugin>
</plugins>
<dependencies>
<dependency>
<groupId>org.jetbrains.ai.tracy</groupId>
<artifactId>tracy-core-jvm</artifactId>
<version>0.1.0</version>
</dependency>
<!-- Client-specific auto-tracing (add the ones you need) -->
<dependency>
<groupId>org.jetbrains.ai.tracy</groupId>
<artifactId>tracy-openai-jvm</artifactId>
<version>0.1.0</version>
</dependency>
<dependency>
<groupId>org.jetbrains.ai.tracy</groupId>
<artifactId>tracy-anthropic-jvm</artifactId>
<version>0.1.0</version>
</dependency>
<dependency>
<groupId>org.jetbrains.ai.tracy</groupId>
<artifactId>tracy-gemini-jvm</artifactId>
<version>0.1.0</version>
</dependency>
<dependency>
<groupId>org.jetbrains.ai.tracy</groupId>
<artifactId>tracy-ktor-jvm</artifactId>
<version>0.1.0</version>
</dependency>
</dependencies>
Quick Example
Here's a minimal example to verify your setup:
@Trace
fun greet(name: String) = println("Hello, $name!")

fun main() {
    // Enable tracing via the `IS_TRACY_ENABLED` environment variable
    // or programmatically, as shown below:
    TracingManager.isTracingEnabled = true

    // 1. Configure SDK with console exporter
    val sdk = configureOpenTelemetrySdk(ConsoleExporterConfig())

    // 2. Set SDK in TracingManager
    TracingManager.setSdk(sdk)

    // 3. Call a traced function
    greet("Tracy")

    // 4. Flush traces before exit
    TracingManager.flushTraces()
}
This example uses:
- @Trace: annotation that enables automatic tracing for the function
- configureOpenTelemetrySdk: creates an OpenTelemetry SDK with the specified exporter
- ConsoleExporterConfig: configuration for exporting traces to the console
- TracingManager: central point for configuring and controlling tracing
Run your application, and you'll see trace output in the console.
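Alternatively, tracing can be switched on without code changes via the IS_TRACY_ENABLED environment variable mentioned in the example. Assuming a standard Gradle application setup, for instance:

```shell
# Enable Tracy at launch; assumes the `run` task from the Gradle application plugin
IS_TRACY_ENABLED=true ./gradlew run
```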
More Examples
For complete, runnable examples covering various Tracy features, see the examples on GitHub.
What Can You Trace?
Tracy provides three ways to add tracing to your application:
LLM Client Auto-Tracing
Automatically capture spans for all calls made via supported LLM clients (OpenAI, Anthropic, Gemini, Ktor, OkHttp).
Simply wrap your client with instrument():
// Create an OpenAI client instance and wrap it with instrument()
val instrumentedClient: OpenAIClient = OpenAIOkHttpClient.builder()
    .apiKey(apiKey)
    .build()
    .let { instrument(it) }
Learn more about LLM auto-tracing
Annotation-Based Tracing
Use the @Trace annotation to trace any Kotlin function, capturing its inputs, outputs, and duration:
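For example, a sketch modeled on the quick-start snippet above (the function name and body are illustrative, not part of the Tracy API):

```kotlin
// A hypothetical traced helper; Tracy records its arguments,
// return value, and duration as a span
@Trace
fun buildPrompt(topic: String): String =
    "Summarize the following topic in one paragraph: $topic"
```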
Learn more about annotation-based tracing
Manual Tracing
For fine-grained control or Java interoperability, use the withSpan function:
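As a sketch, assuming withSpan takes a span name and a block whose result becomes the function's return value (check the manual-tracing page for the exact signature):

```kotlin
// Hypothetical usage: the block's work is recorded under the named span
fun countWords(text: String): Int =
    withSpan("count-words") {
        text.split(Regex("\\s+")).count { it.isNotBlank() }
    }
```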
Learn more about manual tracing
Next Steps
- Configure Exporters: Send traces to Langfuse, Weave, Jaeger, and more
- Tracing API Overview: Deep dive into tracing concepts
- Sensitive Content: Control what data is captured in traces