LangSmith Semantic Conventions

Spring AI provides built-in instrumentation for models and workflows based on the Micrometer APIs. Arconia lets you configure your Spring AI applications to export telemetry data using the LangSmith Semantic Conventions instead of the default Micrometer conventions, enabling rich trace visualization in the LangSmith platform.

The LangSmith Semantic Conventions are still experimental, and they might change in the future.

Getting Started

Add the Arconia LangSmith Semantic Conventions dependency to your project. The module auto-activates when present on the classpath.

Gradle

dependencies {
  implementation 'io.arconia:arconia-langsmith-semantic-conventions'
}

Maven

<dependency>
    <groupId>io.arconia</groupId>
    <artifactId>arconia-langsmith-semantic-conventions</artifactId>
</dependency>

Your application also needs OpenTelemetry configured for exporting traces. The recommended approach is to use the Arconia OpenTelemetry Spring Boot Starter:

Gradle

dependencies {
  implementation 'io.arconia:arconia-spring-boot-starter-opentelemetry'
}

Maven

<dependency>
    <groupId>io.arconia</groupId>
    <artifactId>arconia-spring-boot-starter-opentelemetry</artifactId>
</dependency>

Then, configure the OTLP exporter to send traces to LangSmith, making sure your LangSmith API key is defined in the LANGSMITH_API_KEY environment variable:

otel.exporter.otlp.endpoint=https://eu.api.smith.langchain.com/otel
otel.exporter.otlp.headers=x-api-key=${LANGSMITH_API_KEY}

For US-hosted LangSmith, use https://api.smith.langchain.com/otel as the endpoint.
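If you configure your application with YAML instead of properties, the same exporter settings can be expressed as follows. This is a sketch assuming the `otel.exporter.otlp.headers` property accepts the same `key=value` string shown above:

```yaml
otel:
  exporter:
    otlp:
      endpoint: https://eu.api.smith.langchain.com/otel
      headers: x-api-key=${LANGSMITH_API_KEY}
```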

Alternatively, you can use the standard OpenTelemetry environment variables:

OTEL_EXPORTER_OTLP_ENDPOINT=https://eu.api.smith.langchain.com/otel
OTEL_EXPORTER_OTLP_HEADERS="x-api-key=<your-langsmith-api-key>"

Features

The module provides semantic conventions for the following observation types:

  • Chat model spans with full parameter tracking (temperature, max tokens, penalties, stop sequences, etc.).

  • Embedding model spans.

  • Tool execution spans with input/output content.

  • Chat client spans with session tracking and input/output content.

  • Token usage for inference operations.

  • Input/output message content, captured as span events.

Content capturing is enabled by default, matching the behavior of the official LangSmith SDKs.

Configuration Properties

Inference

Table 1. LangSmith Inference Configuration Properties

Property: arconia.observations.conventions.langsmith.inference.include-content
Default: true
Description: Whether to include input and output message content in inference observations.

Property: arconia.observations.conventions.langsmith.inference.include-tool-definitions
Default: true
Description: Whether to include tool definitions in inference observations.

Tool Execution

Table 2. LangSmith Tool Execution Configuration Properties

Property: arconia.observations.conventions.langsmith.tool-execution.include-content
Default: true
Description: Whether to include tool content (arguments and result) in tool execution observations.
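
For example, if you prefer not to export potentially sensitive message and tool content to LangSmith, you can disable content capture with the properties above (shown here in application.properties form):

```properties
arconia.observations.conventions.langsmith.inference.include-content=false
arconia.observations.conventions.langsmith.tool-execution.include-content=false
```

With these settings, spans and token usage are still exported, but message content and tool arguments/results are omitted from the observations.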