OpenLLMetry Semantic Conventions

Spring AI provides built-in instrumentation for models, vector stores, and workflows based on the Micrometer APIs. Arconia lets you configure your Spring AI application to export telemetry data following the OpenLLMetry Semantic Conventions instead of the default Micrometer conventions, enabling rich trace visualization in platforms like Traceloop.

The OpenLLMetry Semantic Conventions are still experimental, and they might change in the future.

Getting Started

Add the Arconia OpenTelemetry AI Semantic Conventions dependency to your project. The module auto-activates when present on the classpath.

  • Gradle

  • Maven

dependencies {
  implementation 'io.arconia:arconia-opentelemetry-ai-semantic-conventions'
}
<dependency>
    <groupId>io.arconia</groupId>
    <artifactId>arconia-opentelemetry-ai-semantic-conventions</artifactId>
</dependency>

If another Generative AI convention module is also active, Arconia fails at startup with an actionable error message. Disable the unwanted module via its enabled property.

Set the convention flavor to openllmetry:

arconia.observations.conventions.opentelemetry.ai.flavor=openllmetry
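If you configure your application with YAML instead of properties files, the equivalent setting (a mechanical translation of the property key above) is:

```yaml
arconia:
  observations:
    conventions:
      opentelemetry:
        ai:
          flavor: openllmetry
```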

Your application also needs OpenTelemetry configured for exporting traces. The recommended approach is to use the Arconia OpenTelemetry Spring Boot Starter:

  • Gradle

  • Maven

dependencies {
  implementation 'io.arconia:arconia-spring-boot-starter-opentelemetry'
}
<dependency>
    <groupId>io.arconia</groupId>
    <artifactId>arconia-spring-boot-starter-opentelemetry</artifactId>
</dependency>

Traceloop

Configure the OTLP exporter to send traces to Traceloop, making sure your Traceloop API key is defined in a TRACELOOP_API_KEY environment variable:

arconia.otel.exporter.otlp.endpoint=https://api.traceloop.com
arconia.otel.exporter.otlp.headers=Authorization=Bearer ${TRACELOOP_API_KEY}

Alternatively, you can use the standard OpenTelemetry environment variables:

OTEL_EXPORTER_OTLP_ENDPOINT=https://api.traceloop.com
OTEL_EXPORTER_OTLP_HEADERS="Authorization=Bearer <your-traceloop-api-key>"

Since Traceloop supports only traces, you might want to disable the export of logs and metrics:

arconia.otel.logs.exporter.type=none
arconia.otel.metrics.exporter.type=none
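Putting the pieces together, a complete application.properties for exporting Spring AI traces to Traceloop combines the properties shown above (the API key placeholder resolves from the TRACELOOP_API_KEY environment variable at startup):

```properties
# Emit OpenLLMetry-flavored spans instead of the default conventions
arconia.observations.conventions.opentelemetry.ai.flavor=openllmetry

# Send traces to Traceloop; the API key is resolved from the environment
arconia.otel.exporter.otlp.endpoint=https://api.traceloop.com
arconia.otel.exporter.otlp.headers=Authorization=Bearer ${TRACELOOP_API_KEY}

# Traceloop accepts only traces, so skip exporting logs and metrics
arconia.otel.logs.exporter.type=none
arconia.otel.metrics.exporter.type=none
```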

Features

The module provides OpenLLMetry-compatible semantic conventions for all Spring AI observation types.

  • Chat model spans with full parameter tracking (temperature, max tokens, penalties, stop sequences, etc.).

  • Embedding model spans.

  • Tool execution spans with input/output content.

  • Chat client spans with session tracking and input/output content.

  • Token usage for inference operations.

  • Input/output message content, captured as span attributes.

Content capturing is enabled by default, matching the behavior of the official OpenLLMetry SDKs.

Configuration Properties

Table 1. OpenTelemetry AI Configuration Properties

  • arconia.observations.conventions.opentelemetry.ai.enabled
    Default: true. Whether to enable the OpenTelemetry AI Semantic Conventions module.

  • arconia.observations.conventions.opentelemetry.ai.flavor
    Default: opentelemetry. The convention flavor to apply. Accepted values: opentelemetry, openllmetry, langsmith.

  • arconia.observations.conventions.opentelemetry.ai.capture-content
    Default: span-attributes. How to capture input and output message content. Accepted values: none, span-attributes, span-events.

  • arconia.observations.conventions.opentelemetry.ai.include-tool-definitions
    Default: true. Whether to include tool definitions in inference observations.

  • arconia.observations.conventions.opentelemetry.ai.include-tool-call-content
    Default: true. Whether to include tool content (arguments and result) in tool execution observations.
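Since content capturing is on by default, one common adjustment is opting out when prompts or completions may contain sensitive data. Using the capture-content property from the table above, a sketch of a privacy-conscious configuration:

```properties
# Do not record prompt/completion content on spans (e.g. for privacy or compliance)
arconia.observations.conventions.opentelemetry.ai.capture-content=none

# Likewise, omit tool arguments and results from tool execution observations
arconia.observations.conventions.opentelemetry.ai.include-tool-call-content=false
```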