LangSmith Semantic Conventions
Spring AI provides built-in instrumentation for models and workflows based on the Micrometer APIs. Arconia lets you configure your Spring AI application to export telemetry data following the LangSmith Semantic Conventions instead of the default Micrometer conventions, enabling rich trace visualization in the LangSmith platform.
NOTE: The LangSmith Semantic Conventions are still experimental and might change in the future.
Getting Started
Add the Arconia OpenTelemetry AI Semantic Conventions dependency to your project. The module auto-activates when present on the classpath.
Gradle:

```groovy
dependencies {
    implementation 'io.arconia:arconia-opentelemetry-ai-semantic-conventions'
}
```

Maven:

```xml
<dependency>
    <groupId>io.arconia</groupId>
    <artifactId>arconia-opentelemetry-ai-semantic-conventions</artifactId>
</dependency>
```
NOTE: If another Generative AI convention module is also active, Arconia fails at startup with an actionable error message. Disable the unwanted module via its configuration properties.
Set the convention flavor to langsmith:
arconia.observations.conventions.opentelemetry.ai.flavor=langsmith
Your application also needs OpenTelemetry configured for exporting traces. The recommended approach is to use the Arconia OpenTelemetry Spring Boot Starter:
Gradle:

```groovy
dependencies {
    implementation 'io.arconia:arconia-spring-boot-starter-opentelemetry'
}
```

Maven:

```xml
<dependency>
    <groupId>io.arconia</groupId>
    <artifactId>arconia-spring-boot-starter-opentelemetry</artifactId>
</dependency>
```
Configure the OTLP exporter to send traces to LangSmith, making sure your LangSmith API key is defined in a LANGSMITH_API_KEY environment variable:
arconia.otel.exporter.otlp.endpoint=https://eu.api.smith.langchain.com/otel
arconia.otel.exporter.otlp.headers=x-api-key=${LANGSMITH_API_KEY}
For US-hosted LangSmith, use https://api.smith.langchain.com/otel as the endpoint.
Alternatively, you can use the standard OpenTelemetry Environment Variables:
OTEL_EXPORTER_OTLP_ENDPOINT=https://eu.api.smith.langchain.com/otel
OTEL_EXPORTER_OTLP_HEADERS="x-api-key=<your-langsmith-api-key>"
Since LangSmith supports only traces, you might want to disable the export of logs and metrics:
arconia.otel.logs.exporter.type=none
arconia.otel.metrics.exporter.type=none
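
Putting the steps above together, a minimal application.properties for the EU region might look like this (all values are taken from the configuration shown earlier):

```properties
# Export AI observations using the LangSmith Semantic Conventions
arconia.observations.conventions.opentelemetry.ai.flavor=langsmith

# Send traces to LangSmith over OTLP (EU region; use
# https://api.smith.langchain.com/otel for US-hosted LangSmith)
arconia.otel.exporter.otlp.endpoint=https://eu.api.smith.langchain.com/otel
arconia.otel.exporter.otlp.headers=x-api-key=${LANGSMITH_API_KEY}

# LangSmith ingests only traces, so disable logs and metrics export
arconia.otel.logs.exporter.type=none
arconia.otel.metrics.exporter.type=none
```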
Features
The module provides semantic conventions for the following observation types:
- Chat model spans with full parameter tracking (temperature, max tokens, penalties, stop sequences, etc.).
- Embedding model spans.
- Tool execution spans with input/output content.
- Chat client spans with session tracking and input/output content.
- Token usage for inference operations.
- Input/output message content, captured as span attributes.
Content capturing is enabled by default, matching the behavior of the official LangSmith SDKs.
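
To illustrate what gets observed, here is a minimal sketch of a Spring AI call that produces these spans. The service name is hypothetical and not part of Arconia; it assumes a Spring AI starter on the classpath, which auto-configures the ChatClient.Builder bean.

```java
import org.springframework.ai.chat.client.ChatClient;
import org.springframework.stereotype.Service;

// Hypothetical service, for illustration only.
@Service
class AssistantService {

    private final ChatClient chatClient;

    AssistantService(ChatClient.Builder builder) {
        this.chatClient = builder.build();
    }

    String ask(String question) {
        // Spring AI instruments this call via Micrometer. With the langsmith
        // flavor active, the resulting chat client and chat model spans
        // (model parameters, token usage, input/output content) are exported
        // to LangSmith over OTLP.
        return chatClient.prompt()
                .user(question)
                .call()
                .content();
    }
}
```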
Configuration Properties
| Property | Default | Description |
|---|---|---|
|  |  | Whether to enable the OpenTelemetry AI Semantic Conventions module. |
| arconia.observations.conventions.opentelemetry.ai.flavor |  | The convention flavor to apply. Accepted values: |
|  |  | How to capture input and output message content. Accepted values: |
|  |  | Whether to include tool definitions in inference observations. |
|  |  | Whether to include tool content (arguments and result) in tool execution observations. |