
Tracing, monitoring and evaluating AI features via a unified API that captures structured traces, follows OpenTelemetry Generative AI semantics, auto-instruments popular AI clients, and exports to observability backends.
Tracy helps you trace, monitor, and evaluate AI-powered features directly from your Kotlin or Java projects.
It provides a unified API to capture structured traces and is fully compatible with observability platforms such as Langfuse and Weights & Biases (W&B).
> [!NOTE]
> This project uses the Tracy Official YouTrack Project for issue tracking. Please file bug reports and feature requests there. Issue templates and additional details are available in the Contributing Guidelines.
Standards: This library implements the OpenTelemetry Generative AI Semantic Conventions for span attributes and event naming, ensuring your traces remain compatible with any OpenTelemetry-compliant backend.
Select the build system that matches your setup.

Add dependencies to the build.gradle.kts file:

```kotlin
plugins {
    id("org.jetbrains.ai.tracy") version "0.1.0"
}

dependencies {
    // Core module with shared functionality
    implementation("org.jetbrains.ai.tracy:tracy-core:0.1.0")

    // Client-specific auto tracing (select the ones you need)
    implementation("org.jetbrains.ai.tracy:tracy-anthropic:0.1.0")
    implementation("org.jetbrains.ai.tracy:tracy-gemini:0.1.0")
    implementation("org.jetbrains.ai.tracy:tracy-ktor:0.1.0")
    implementation("org.jetbrains.ai.tracy:tracy-openai:0.1.0")
}
```

Make sure that you have mavenCentral() in both your plugin management and project repositories sections:

```kotlin
// build.gradle.kts
repositories {
    mavenCentral()
}

// settings.gradle.kts
pluginManagement {
    repositories {
        mavenCentral()
    }
}
```

Add dependencies to the build.gradle file:
```groovy
plugins {
    id 'org.jetbrains.ai.tracy' version '0.1.0'
}

dependencies {
    implementation 'org.jetbrains.ai.tracy:tracy-core:0.1.0'

    // Client-specific Auto Tracing
    implementation 'org.jetbrains.ai.tracy:tracy-anthropic:0.1.0'
    implementation 'org.jetbrains.ai.tracy:tracy-gemini:0.1.0'
    implementation 'org.jetbrains.ai.tracy:tracy-ktor:0.1.0'
    implementation 'org.jetbrains.ai.tracy:tracy-openai:0.1.0'
}
```

Make sure that you have mavenCentral() in both your plugin management and project repositories sections:

```groovy
// build.gradle
repositories {
    mavenCentral()
}

// settings.gradle
pluginManagement {
    repositories {
        mavenCentral()
    }
}
```

Add dependencies to the pom.xml file:
```xml
<project>
    <!-- ... other project config (groupId, artifactId, version, etc.) ... -->
    <build>
        <plugins>
            <plugin>
                <groupId>org.jetbrains.kotlin</groupId>
                <artifactId>kotlin-maven-plugin</artifactId>
                <version>${kotlin.version}</version>
                <configuration>
                    <jvmTarget>19</jvmTarget>
                </configuration>
                <executions>
                    <execution>
                        <goals>
                            <goal>compile</goal>
                        </goals>
                    </execution>
                </executions>
                <dependencies>
                    <dependency>
                        <groupId>org.jetbrains.ai.tracy</groupId>
                        <!-- Use the Kotlin version matching your setup (e.g., 2.0.0 or 2.0.20).
                             Keep the same major and minor numbers but adjust the patch version
                             to the nearest supported value, either 0 or 20.
                             Examples:
                             - 2.1.19 -> 2.1.0
                             - 2.1.23 -> 2.1.20 -->
                        <artifactId>tracy-compiler-plugin-2.1.0-jvm</artifactId>
                        <version>0.1.0</version>
                    </dependency>
                </dependencies>
            </plugin>
        </plugins>
    </build>
    <dependencies>
        <dependency>
            <groupId>org.jetbrains.ai.tracy</groupId>
            <artifactId>tracy-core-jvm</artifactId>
            <version>0.1.0</version>
        </dependency>
        <!-- Client-specific Auto Tracing -->
        <dependency>
            <groupId>org.jetbrains.ai.tracy</groupId>
            <artifactId>tracy-anthropic-jvm</artifactId>
            <version>0.1.0</version>
        </dependency>
        <dependency>
            <groupId>org.jetbrains.ai.tracy</groupId>
            <artifactId>tracy-gemini-jvm</artifactId>
            <version>0.1.0</version>
        </dependency>
        <dependency>
            <groupId>org.jetbrains.ai.tracy</groupId>
            <artifactId>tracy-ktor-jvm</artifactId>
            <version>0.1.0</version>
        </dependency>
        <dependency>
            <groupId>org.jetbrains.ai.tracy</groupId>
            <artifactId>tracy-openai-jvm</artifactId>
            <version>0.1.0</version>
        </dependency>
    </dependencies>
</project>
```
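The patch-version mapping described in the comment above can be sketched as follows. This is a hypothetical helper, not part of Tracy: it keeps the major and minor components and snaps the patch component down to the nearest supported value, 0 or 20.

```java
public class TracyPluginVersion {
    // Snap a Kotlin version's patch component down to the nearest supported
    // compiler-plugin patch (0 or 20), keeping major and minor unchanged.
    static String supportedKotlinVersion(String kotlinVersion) {
        String[] parts = kotlinVersion.split("\\.");
        int patch = Integer.parseInt(parts[2]);
        int snapped = (patch >= 20) ? 20 : 0;
        return parts[0] + "." + parts[1] + "." + snapped;
    }

    public static void main(String[] args) {
        System.out.println(supportedKotlinVersion("2.1.19")); // 2.1.0
        System.out.println(supportedKotlinVersion("2.1.23")); // 2.1.20
    }
}
```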
Compatible with Kotlin 2.0.0+ and Java 17+.
If you already have OpenTelemetry set up in your project, versions 1.2+ are supported.
Supports integration with:
- 1.*–4.*
- 1.*–2.*
- 1.8.*–1.38.* (earlier versions unsupported)

> [!TIP]
> See the examples directory for complete and runnable examples!
Tracy provides tracing integrations for multiple clients—not only OpenAI, but also Gemini,
Anthropic,
Ktor, and OkHttp.
All clients can be instrumented in a similar way using the instrument(...) function.
According to the OTel semantic conventions, capturing sensitive data (e.g., user messages or assistant replies) should be disabled by default (read more here). Therefore, by default Tracy redacts sensitive content by replacing it with the placeholder "REDACTED", preserving the trace structure and recorded attributes.
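The redaction behavior can be illustrated with a small conceptual sketch (not Tracy's actual implementation; the attribute names are made up for the example): content-bearing attribute values are replaced with the placeholder while every attribute key survives, so the trace keeps its shape.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class RedactionSketch {
    // Replace the values of content-bearing attributes with "REDACTED",
    // keeping every attribute key so the trace structure is preserved.
    static Map<String, String> redact(Map<String, String> attributes, boolean captureContent) {
        Map<String, String> out = new LinkedHashMap<>();
        for (Map.Entry<String, String> e : attributes.entrySet()) {
            boolean sensitive = e.getKey().endsWith(".content");
            out.put(e.getKey(), (sensitive && !captureContent) ? "REDACTED" : e.getValue());
        }
        return out;
    }
}
```

With capture disabled, a map containing a prompt-content attribute and a model attribute keeps both keys, but only the model value remains readable.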
You can enable tracing of inputs and outputs independently in either of the following ways.

Via environment variables:

```
TRACY_CAPTURE_INPUT=true|false
TRACY_CAPTURE_OUTPUT=true|false
```

Or programmatically:

```kotlin
// Call a helper method:
TracingManager.traceSensitiveContent()

// Or equivalently:
TracingManager.withCapturingPolicy(
    ContentCapturePolicy(
        captureInputs = true,
        captureOutputs = true,
    )
)
```

Below is a minimal OpenAI example. For others, check the examples directory:
```kotlin
// Enable tracing via the `IS_TRACY_ENABLED` environment variable
// or programmatically, as shown below:
TracingManager.isTracingEnabled = true

// Initialize tracing and export spans to the console
val sdk: OpenTelemetrySdk = configureOpenTelemetrySdk(
    exporterConfig = ConsoleExporterConfig()
)
TracingManager.setSdk(sdk)

// Permit tracing of sensitive content in requests AND responses
TracingManager.traceSensitiveContent()

// Create an OpenAI client and instrument it with tracing capabilities
val instrumentedClient: OpenAIClient = OpenAIOkHttpClient.builder()
    .baseUrl(url)
    .apiKey(apiKey)
    .timeout(timeout)
    .build()
    .apply { instrument(this) }

// Make a request and receive the response
val request = ChatCompletionCreateParams.builder()
    .addUserMessage("Generate a polite greeting and introduce yourself.")
    .model(ChatModel.GPT_4O_MINI)
    .temperature(0.0)
    .build()
val response = instrumentedClient.chat().completions().create(request)
println("OpenAI response: ${response.choices().first().message().content().get()}")

// Traces are automatically flushed based on ExporterCommonSettings:
// - Periodically via flushIntervalMs and flushThreshold
// - On JVM shutdown if flushOnShutdown = true (default)
// For manual control, use TracingManager.flushTraces()
```

This example uses simple console tracing for demonstration. For information about other tracing backends, see Tracing Backends.
Full example: OpenAI Client Auto Tracing Example
Runtime toggle: Spans are emitted only if both conditions are met: an SDK has been set via TracingManager.setSdk(...), and TracingManager.isTracingEnabled is true.

Runtime tracing is controlled through TracingManager.isTracingEnabled, which defaults to the value of the IS_TRACY_ENABLED environment variable. If the variable is not set, tracing is disabled by default. You can also change this value programmatically at any time.

Compile-time plugin toggle: controlled by the enableTracyPlugin Gradle property. The property itself is optional: if you don't define it, the plugin is enabled by default, but you can turn it off whenever needed by setting enableTracyPlugin=false in gradle.properties or by passing it on the command line with -PenableTracyPlugin=false. When the plugin is disabled at compile time, annotated functions are not instrumented, regardless of TracingManager.isTracingEnabled.

You can trace regular functions (not only client calls) using the @Trace annotation.
Make sure to apply the org.jetbrains.ai.tracy plugin in your build.
The Kotlin compiler plugin automatically instruments annotated functions, capturing execution details such as start and end time, duration, inputs, and outputs.
⚠️ Annotation-based tracing is supported only in Kotlin. For Java, use Manual Tracing instead.
```kotlin
@Trace(name = "GreetUser")
fun greetUser(name: String): String {
    println("Hello, $name!")
    return "Greeting sent to $name"
}

fun main() {
    // Enable tracing via the `IS_TRACY_ENABLED` environment variable
    // or programmatically, as shown below:
    TracingManager.isTracingEnabled = true
    TracingManager.setSdk(configureOpenTelemetrySdk(ConsoleExporterConfig()))

    greetUser("Alice")

    // Traces are automatically flushed based on ExporterCommonSettings:
    // - Periodically via flushIntervalMs and flushThreshold
    // - On JVM shutdown if flushOnShutdown = true (default)
    // For manual control, use TracingManager.flushTraces()
}
```

Nested calls between annotated functions are traced automatically.
When one traced function invokes another, a hierarchical trace structure is created: the outer call is recorded as the parent span and the inner call as its child span.
This structure provides a clear visual representation of call relationships and execution flow.
See the NestedSpansExample.kt for a
demonstration of hierarchical tracing.
The @Trace annotation is automatically propagated through interfaces and class hierarchies.
If a method in an interface or superclass is annotated with
@Trace, all
overriding or implementing methods
inherit tracing behavior automatically, even if the annotation is not explicitly declared again.
This approach ensures consistent and reusable tracing across an entire inheritance chain without code duplication.
Refer to the
TracingPropagationExample.kt for a
complete example.
The tracing system offers flexible customization through the
SpanMetadataCustomizer
interface. Implement it as a Kotlin object (classes are not supported) and reference it in the
@Trace annotation.
With a custom SpanMetadataCustomizer, you can control the metadata recorded for a span.
See an example implementation in
MetadataCustomizerExample.kt.
You can enrich your traces with contextual metadata by adding custom tags.
Tags help categorize and filter traces based on your business logic, for example by user type, feature, or environment. Use the addLangfuseTagsToCurrentTrace function to attach tags dynamically within any traced function.
These tags appear in Langfuse or other tracing tools, making it easier to group and analyze trace data.
An example demonstrating how to provide custom tags can be found in LangfuseTagExample.kt.
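As a conceptual sketch of this tagging pattern (a hypothetical helper, not Tracy's actual addLangfuseTagsToCurrentTrace implementation), tags can be accumulated per execution context and attached to the active trace when it is exported:

```java
import java.util.ArrayList;
import java.util.List;

public class TraceTags {
    // Tags collected for the trace running on the current thread.
    private static final ThreadLocal<List<String>> CURRENT =
            ThreadLocal.withInitial(ArrayList::new);

    // Attach tags to the current trace; a backend like Langfuse can then
    // group and filter traces by these values.
    static void addTagsToCurrentTrace(List<String> tags) {
        CURRENT.get().addAll(tags);
    }

    static List<String> currentTags() {
        return List.copyOf(CURRENT.get());
    }
}
```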
In addition to annotation-based tracing, you can manually create and manage spans wherever you need them in your code.
This approach gives you full control over what is traced and what metadata is recorded.
Use the withSpan function to define a traceable block of code.
It automatically creates and closes a span around the specified code section, allowing you to attach structured metadata
such as inputs, outputs, or custom attributes.
Manual tracing is especially useful in Java projects where annotation-based tracing is not available.
An example demonstrating manual tracing can be found in
ManualTracingExample.kt.
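The withSpan pattern boils down to: open a span, run the block, always close the span. A minimal language-level sketch (hypothetical signatures; Tracy's real withSpan records OpenTelemetry spans rather than printing):

```java
import java.util.function.Supplier;

public class ManualTracingSketch {
    // Run a block inside a "span": record the start time, execute the block,
    // and always finish the span, even if the block throws.
    static <T> T withSpan(String name, Supplier<T> block) {
        long start = System.nanoTime();
        try {
            return block.get();
        } finally {
            long durationMs = (System.nanoTime() - start) / 1_000_000;
            System.out.println("span '" + name + "' took " + durationMs + " ms");
        }
    }

    public static void main(String[] args) {
        int result = withSpan("compute", () -> 6 * 7);
        System.out.println("result = " + result);
    }
}
```

The try/finally shape is the important part: it guarantees the span is closed on both the success and failure paths.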
Tracy provides integrations with multiple tracing backends, allowing you to choose the one that best fits your workflow. Currently supported backends include:
Create an SDK by providing the desired configuration:

```kotlin
// Langfuse
val sdk = configureOpenTelemetrySdk(LangfuseExporterConfig())

// Weave
val sdk = configureOpenTelemetrySdk(WeaveExporterConfig())

// Console-only
val sdk = configureOpenTelemetrySdk(ConsoleExporterConfig())

// File (plain text and JSON formats supported)
val sdk = configureOpenTelemetrySdk(
    FileExporterConfig(
        filepath = myFile.absolutePathString(),
        append = true,
        format = OutputFormat.JSON, // default is `OutputFormat.PLAIN_TEXT`
    )
)
```

Once the SDK is configured, initialize tracing with TracingManager.setSdk(sdk).
| Property | Environment Variable | Required | Default Value |
|---|---|---|---|
| `langfuseUrl` | `LANGFUSE_URL` | false | `https://cloud.langfuse.com` |
| `langfusePublicKey` | `LANGFUSE_PUBLIC_KEY` | true | - |
| `langfuseSecretKey` | `LANGFUSE_SECRET_KEY` | true | - |
| `traceToConsole` | - | false | `false` |
| `exporterTimeoutSeconds` | - | false | `10` |
| `maxNumberOfSpanAttributes` | `MAX_NUMBER_OF_SPAN_ATTRIBUTES` | false | `256` |
| `maxSpanAttributeValueLength` | `MAX_SPAN_ATTRIBUTE_VALUE_LENGTH` | false | `Int.MAX_VALUE` |
| Property | Environment Variable | Required | Default Value |
|---|---|---|---|
| `weaveUrl` | `WEAVE_URL` | false | `https://trace.wandb.ai` |
| `weaveEntity` | `WEAVE_ENTITY` | true | - |
| `weaveProjectName` | `WEAVE_PROJECT_NAME` | true | - |
| `weaveApiKey` | `WEAVE_API_KEY` | true | - |
| `traceToConsole` | - | false | `false` |
| `exporterTimeoutSeconds` | - | false | `10` |
| `maxNumberOfSpanAttributes` | `MAX_NUMBER_OF_SPAN_ATTRIBUTES` | false | `256` |
| `maxSpanAttributeValueLength` | `MAX_SPAN_ATTRIBUTE_VALUE_LENGTH` | false | `Int.MAX_VALUE` |
Configuration for exporting OpenTelemetry traces to console only. Console Setup Example
Configuration for exporting OpenTelemetry traces to a file in either JSON or plain text format. File Setup Example
Both HTTP and gRPC configurations are available. Spans can be exported to any OTLP-compatible HTTP or gRPC collector (for example, Jaeger). Jaeger Setup Example
- plugin — contains the Kotlin compiler plugins for annotation-based tracing. The gradle-tracy-plugin module automatically selects the appropriate KCP version for your Kotlin compiler and applies it to your Gradle project.
- publishing — provides a Gradle plugin used for modules that need to be published. It resolves the tracy version and includes Kotlin DSL logic for publishing artifacts to Space Maven repositories.
- examples — contains runnable examples demonstrating how to use various tracy features and capabilities.
- tracing:
  - core — the foundational module providing tracing abstractions, annotations, OpenTelemetry integration, manual tracing, and the tracing integration for the OkHttp client.
  - openai — tracing integration for the OpenAI client.
  - gemini — tracing integration for the Gemini client.
  - anthropic — tracing integration for the Anthropic client.
  - ktor — tracing integration for the Ktor HTTP client.
  - test-utils — shared utilities for testing tracing functionality across modules.

There are a few known limitations of the library in terms of context propagation. To see complete examples of the recommended usage patterns, refer to the ContextPropagationExample.kt file.
Context propagation works automatically in structured coroutines (e.g., withContext, launch).
However, some concurrency models such as runBlocking and raw threads create new execution boundaries and require
manual propagation of the OpenTelemetry context.
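The boundary problem can be seen with plain Java: the current-span context is, conceptually, thread-local state, and thread-local state does not cross into a raw thread unless you capture it in the parent and restore it in the child. A generic illustration with no Tracy or OpenTelemetry APIs involved:

```java
public class ContextPropagationSketch {
    // Stand-in for OpenTelemetry's current-span context.
    static final ThreadLocal<String> CONTEXT = new ThreadLocal<>();

    static String runInRawThread(boolean propagate) {
        CONTEXT.set("trace-123");
        String captured = CONTEXT.get();      // capture in the parent thread
        String[] seen = new String[1];
        Thread t = new Thread(() -> {
            if (propagate) {
                CONTEXT.set(captured);        // restore in the child thread
            }
            seen[0] = CONTEXT.get();          // null without propagation
        });
        t.start();
        try {
            t.join();
        } catch (InterruptedException e) {
            throw new RuntimeException(e);
        }
        return seen[0];
    }
}
```

Without the restore step the child thread sees no context at all, which is exactly why spans created there would appear as detached, separate traces.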
runBlocking inside suspend functions: Use currentSpanContextElement(...) to ensure child spans remain linked to their parent. Otherwise, spans become detached and appear as separate traces.

```kotlin
@Trace
suspend fun handleRequestInCoroutine(requestId: String) {
    println("Running on thread: ${Thread.currentThread().name}")
    // runBlocking without context propagation would detach the trace
    runBlocking(currentSpanContextElement(currentCoroutineContext())) {
        println("Running inside nested coroutine on thread: ${Thread.currentThread().name}")
        processUserRequest(requestId)
    }
}
```

Raw threads (thread { ... }):

```kotlin
val context = currentSpanContext(currentCoroutineContext())
thread {
    context.makeCurrent().use { processUserRequest("REQ-303") }
}
```

Avoid using @Trace on local (nested) functions.
This is a known Kotlin limitation: references to local functions do NOT implement the KCallable
interface correctly. For more details, see the related issue: KT-64873.
This project uses Gradle composite builds, so running plain publish or publishToMavenLocal is NOT enough: included builds are not published automatically. Use this task instead:

```shell
./gradlew publishAllToMavenLocal
```

This task publishes the JARs of all modules, all versions of the Kotlin compiler plugin, and the Gradle plugin.
Tracy follows semantic versioning:
<major>.<minor>.<patch>
- `<major>`: stays 0 until the library exits beta.
- `<minor>`: increases for public releases (first public release: 0.1.0).
- `<patch>`: used for internal development iterations and fixes.

Read the Contributing Guidelines.
This project and the corresponding community are governed by the JetBrains Open Source and Community Code of Conduct. Please make sure you read it.
Tracy is licensed under the Apache 2.0 License.