OpenInference Semantic Conventions

Spring AI provides built-in instrumentation for models, vector stores, and workflows based on the Micrometer APIs. Arconia lets you configure your Spring AI applications to export telemetry data following the OpenInference Semantic Conventions instead of the default Micrometer conventions, enabling rich trace visualization in platforms such as Arize Phoenix.

Arconia provides a Phoenix Dev Service you can use during development and testing. Check out the Phoenix Dev Service documentation for more details.

The OpenInference Semantic Conventions are still experimental and might change in the future.

Getting Started

Add the Arconia OpenInference Semantic Conventions dependency to your project. The module auto-activates when present on the classpath.

  • Gradle

  • Maven

dependencies {
  implementation 'io.arconia:arconia-openinference-semantic-conventions'
}
<dependency>
    <groupId>io.arconia</groupId>
    <artifactId>arconia-openinference-semantic-conventions</artifactId>
</dependency>

Your application also needs OpenTelemetry configured for exporting traces. The recommended approach is to use the Arconia OpenTelemetry Spring Boot Starter:

  • Gradle

  • Maven

dependencies {
  implementation 'io.arconia:arconia-spring-boot-starter-opentelemetry'
}
<dependency>
    <groupId>io.arconia</groupId>
    <artifactId>arconia-spring-boot-starter-opentelemetry</artifactId>
</dependency>

If you use the Phoenix Dev Service during development or testing, Arconia will configure OpenTelemetry automatically to export traces to Phoenix based on the OpenInference Semantic Conventions.

Features

The module provides semantic conventions for the following observation types:

  • Chat model spans with full parameter tracking (e.g., temperature, max tokens, penalties, stop sequences).

  • Embedding model spans with dimension tracking.

  • Tool execution spans.

  • Chat client observations (including conversation ID tracking via chat memory).

  • Token usage metrics.

  • Input/output message content, captured as span attributes or span events.

Content capture is enabled by default, in accordance with the OpenInference Semantic Conventions specification.
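If your messages may contain sensitive data, you can turn content capture off selectively via the configuration properties described in the next section. As a sketch, an application.yml that hides raw inputs and outputs (property names taken from the table below; whether this fits your redaction needs depends on your data) could look like this:

```yaml
arconia:
  observations:
    conventions:
      openinference:
        # Redact all input content from exported spans.
        hide-inputs: true
        # Redact all output content from exported spans.
        hide-outputs: true
```

Token usage metrics and span structure are unaffected by these flags; only the message content is hidden.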

Configuration Properties

You can configure the OpenInference Semantic Conventions via configuration properties.

Table 1. OpenInference Configuration Properties

| Property | Default | Description |
|---|---|---|
| `arconia.observations.conventions.openinference.project-name` | - | Name of the project in the OpenInference backend to which telemetry data is sent. If not defined, the application name is used as the project name. |
| `arconia.observations.conventions.openinference.base64-image-max-length` | `32000` | Maximum length of a base64-encoded image. |
| `arconia.observations.conventions.openinference.hide-choices` | `false` | Whether to hide all LLM choices. |
| `arconia.observations.conventions.openinference.hide-embeddings-text` | `false` | Whether to hide all embedding text. |
| `arconia.observations.conventions.openinference.hide-embeddings-vectors` | `false` | Whether to hide all embedding vectors. |
| `arconia.observations.conventions.openinference.hide-llm-invocation-parameters` | `false` | Whether to hide the LLM invocation parameters. |
| `arconia.observations.conventions.openinference.hide-inputs` | `false` | Whether to hide all inputs. |
| `arconia.observations.conventions.openinference.hide-input-images` | `false` | Whether to hide all images from the input messages. |
| `arconia.observations.conventions.openinference.hide-input-messages` | `false` | Whether to hide all input messages. |
| `arconia.observations.conventions.openinference.hide-input-text` | `false` | Whether to hide all text from the input messages. |
| `arconia.observations.conventions.openinference.hide-outputs` | `false` | Whether to hide all outputs. |
| `arconia.observations.conventions.openinference.hide-output-messages` | `false` | Whether to hide all output messages. |
| `arconia.observations.conventions.openinference.hide-output-text` | `false` | Whether to hide all text from the output messages. |
| `arconia.observations.conventions.openinference.hide-prompts` | `false` | Whether to hide all LLM prompts. |
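For example, to group traces under a dedicated Phoenix project instead of the application name, you could set the `project-name` property in application.yml (the project name value below is illustrative):

```yaml
arconia:
  observations:
    conventions:
      openinference:
        # Traces will appear under this project in the OpenInference backend.
        project-name: my-spring-ai-app
```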

Environment Variables

Arconia supports the OpenInference Environment Variable Specification for configuring the OpenInference Semantic Conventions. If both an OpenInference environment variable and the corresponding Arconia configuration property are set, the environment variable takes precedence.
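As a sketch, assuming a POSIX shell, the environment variables below (names per the OpenInference Environment Variable Specification; check the specification for the full list) would override the equivalent `hide-inputs`, `hide-outputs`, and `base64-image-max-length` Arconia properties:

```shell
# These override the corresponding Arconia configuration properties.
export OPENINFERENCE_HIDE_INPUTS=true
export OPENINFERENCE_HIDE_OUTPUTS=true
export OPENINFERENCE_BASE64_IMAGE_MAX_LENGTH=32000
```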