OpenInference Semantic Conventions
Spring AI provides built-in instrumentation for models, vector stores, and workflows based on the Micrometer APIs. Arconia lets you configure your Spring AI application to export telemetry data following the OpenInference Semantic Conventions instead of the default Micrometer conventions, enabling rich trace visualization in platforms such as Arize Phoenix.
Arconia provides a Phoenix Dev Service you can use during development and testing. Check out the Phoenix Dev Service documentation for more details.
Note: The OpenInference Semantic Conventions are still experimental and might change in the future.
Getting Started
Add the Arconia OpenInference Semantic Conventions dependency to your project. The module auto-activates when present on the classpath.
Gradle:

```gradle
dependencies {
    implementation 'io.arconia:arconia-openinference-semantic-conventions'
}
```

Maven:

```xml
<dependency>
    <groupId>io.arconia</groupId>
    <artifactId>arconia-openinference-semantic-conventions</artifactId>
</dependency>
```
Your application also needs OpenTelemetry configured for exporting traces. The recommended approach is to use the Arconia OpenTelemetry Spring Boot Starter:
Gradle:

```gradle
dependencies {
    implementation 'io.arconia:arconia-spring-boot-starter-opentelemetry'
}
```

Maven:

```xml
<dependency>
    <groupId>io.arconia</groupId>
    <artifactId>arconia-spring-boot-starter-opentelemetry</artifactId>
</dependency>
```
If you use the Phoenix Dev Service during development or testing, Arconia will configure OpenTelemetry automatically to export traces to Phoenix based on the OpenInference Semantic Conventions.
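Outside the Dev Service, you can point the trace exporter at a Phoenix instance yourself. A minimal sketch using the standard Spring Boot OTLP tracing property (`management.otlp.tracing.endpoint`); the endpoint below assumes Phoenix's default OTLP/HTTP port, so adjust it for your deployment. Depending on how the Arconia starter wires the exporter, the standard `OTEL_EXPORTER_OTLP_ENDPOINT` environment variable is an alternative way to set the same endpoint:

```yaml
# application.yml (sketch): export traces to a locally running Phoenix instance.
# Assumes Phoenix's default OTLP/HTTP traces endpoint on port 6006.
management:
  otlp:
    tracing:
      endpoint: http://localhost:6006/v1/traces
```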
Features
The module provides semantic conventions for the following observation types:
- Chat model spans with full parameter tracking (e.g., temperature, max tokens, penalties, stop sequences).
- Embedding model spans with dimension tracking.
- Tool execution spans.
- Chat client observations, including conversation ID tracking via chat memory.
- Token usage metrics.
- Input/output message content, captured as span attributes or span events.
Content capturing is enabled by default, according to the OpenInference Semantic Conventions specification.
Configuration Properties
You can configure the OpenInference Semantic Conventions via configuration properties.
| Property | Default | Description |
|---|---|---|
| | - | Name of the project in the OpenInference backend to which telemetry data is sent. If not defined, the application name is used as the project name. |
| | | Maximum length of a base64-encoded image. |
| | | Whether to hide all LLM choices. |
| | | Whether to hide all embedding text. |
| | | Whether to hide all embedding vectors. |
| | | Whether to hide the LLM invocation parameters. |
| | | Whether to hide all inputs. |
| | | Whether to hide all images from the input messages. |
| | | Whether to hide all input messages. |
| | | Whether to hide all texts from the input messages. |
| | | Whether to hide all outputs. |
| | | Whether to hide all texts from the output messages. |
| | | Whether to hide all output messages. |
| | | Whether to hide all LLM prompts. |
Environment Variables
Arconia supports the OpenInference Environment Variable Specification for configuring the OpenInference semantic conventions. If both the OpenInference Environment Variables and the Arconia configuration properties are set, the OpenInference Environment Variables will take precedence.
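As a sketch, a few of the variables defined by the OpenInference Environment Variable Specification can be exported before starting the application (names per that specification; see it for the full list):

```shell
# Sketch: configure OpenInference content capturing via the environment,
# per the OpenInference Environment Variable Specification.
# These take precedence over the corresponding Arconia configuration properties.
export OPENINFERENCE_HIDE_INPUTS=true               # hide all inputs
export OPENINFERENCE_HIDE_OUTPUTS=true              # hide all outputs
export OPENINFERENCE_BASE64_IMAGE_MAX_LENGTH=32000  # cap base64-encoded image length

# Then start the application as usual (e.g., via your build tool or a packaged jar).
```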