instrument
Instruments an OkHttpClient with OpenTelemetry tracing for LLM provider API calls, returning a new, instrumented copy of the provided OkHttpClient.
This function adds automatic tracing capabilities to an OkHttp client by injecting an OpenTelemetryOkHttpInterceptor configured with the provided LLMTracingAdapter. All HTTP requests made through the instrumented client to LLM provider APIs will be automatically traced, including request/response bodies, token usage, model parameters, tool calls, and errors. Supports both streaming and non-streaming requests.
This is a lower-level instrumentation function that works with raw OkHttp clients. For provider-specific clients (OpenAI, Anthropic, Gemini), consider using the provider-specific instrument() functions instead.
Use Cases
Basic OpenAI API Request
```kotlin
TracingManager.setSdk(configureOpenTelemetrySdk(ConsoleExporterConfig()))
TracingManager.traceSensitiveContent()

val apiToken = System.getenv("OPENAI_API_KEY")

val requestBodyJson = buildJsonObject {
    put("model", JsonPrimitive("gpt-4o-mini"))
    put("messages", buildJsonArray {
        add(buildJsonObject {
            put("role", JsonPrimitive("user"))
            put("content", JsonPrimitive("Generate polite greeting and introduce yourself"))
        })
    })
    put("temperature", JsonPrimitive(1.0))
}

val client = OkHttpClient()
val instrumentedClient = instrument(client, OpenAILLMTracingAdapter())

val requestBody = Json { prettyPrint = true }
    .encodeToString(requestBodyJson)
    .toRequestBody("application/json".toMediaType())

val request = Request.Builder()
    .url("https://api.openai.com/v1/chat/completions")
    .addHeader("Authorization", "Bearer $apiToken")
    .addHeader("Content-Type", "application/json")
    .post(requestBody)
    .build()

instrumentedClient.newCall(request).execute().use { response ->
    println("Result: ${response.body?.string()}")
}

TracingManager.flushTraces()
```
Using with Different Providers
```kotlin
// OpenAI
val openAiClient = instrument(OkHttpClient(), OpenAILLMTracingAdapter())

// Anthropic
val anthropicClient = instrument(OkHttpClient(), AnthropicLLMTracingAdapter())

// Gemini
val geminiClient = instrument(OkHttpClient(), GeminiLLMTracingAdapter())
```
Streaming Requests
```kotlin
val client = instrument(OkHttpClient(), OpenAILLMTracingAdapter())

val request = Request.Builder()
    .url("https://api.openai.com/v1/chat/completions")
    .addHeader("Authorization", "Bearer $apiToken")
    .post(streamingRequestBody)
    .build()

client.newCall(request).execute().use { response ->
    response.body?.source()?.let { source ->
        // Read the streaming response line by line
        while (!source.exhausted()) {
            val line = source.readUtf8Line()
            // Process streaming data
        }
    }
}
// Streaming data is automatically captured and traced
```
Notes
- This function is idempotent: calling instrument() multiple times on the same client will not result in duplicate interceptors.
- Tracing can be controlled globally via TracingManager.isTracingEnabled.
- The original client is not modified; a new client instance with instrumentation is returned.
- Content capture policies (TracingManager.contentCapturePolicy) can be configured to redact sensitive data.
- Error responses are automatically captured with error status and messages.
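The idempotency and immutability guarantees above can be illustrated with a small stand-in sketch in plain Kotlin. Note that Interceptor, Client, and instrumentOnce here are hypothetical stand-ins for illustration only, not the real OkHttp or instrumentation API:

```kotlin
// Hypothetical stand-in types; the real implementation works on OkHttpClient
// and its interceptor list via newBuilder().
class Interceptor(val name: String)

class Client(val interceptors: List<Interceptor> = emptyList())

// Returns a new Client: never mutates the original, never adds a duplicate.
fun instrumentOnce(client: Client, interceptor: Interceptor): Client =
    if (client.interceptors.any { it.name == interceptor.name }) client
    else Client(client.interceptors + interceptor)

fun main() {
    val tracing = Interceptor("OpenTelemetryOkHttpInterceptor")
    val base = Client()
    val once = instrumentOnce(base, tracing)
    val twice = instrumentOnce(once, tracing)
    println(base.interceptors.size)   // 0 -- original untouched
    println(once.interceptors.size)   // 1
    println(twice.interceptors.size)  // 1 -- still one interceptor: idempotent
}
```

Because instrumentation checks for an existing interceptor before adding one, wrapping an already-instrumented client is harmless; the returned client always carries exactly one tracing interceptor.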
Return
A new OkHttpClient instance with OpenTelemetry tracing enabled (the original client remains unmodified)
Parameters
The OkHttp client to instrument
The provider-specific LLMTracingAdapter to use for tracing (e.g. OpenAILLMTracingAdapter, AnthropicLLMTracingAdapter, GeminiLLMTracingAdapter)