  1. Sign up — Create an account at platform.respan.ai
  2. Create an API key — Generate one on the API keys page
  3. Add credits or a provider key — Add credits on the Credits page or connect your own provider key on the Integrations page
Add the Docs MCP to your AI coding tool to get help building with Respan. No API key needed.
{
  "mcpServers": {
    "respan-docs": {
      "url": "https://docs.respan.ai/mcp"
    }
  }
}

What is the Anthropic SDK?

The Anthropic SDK (anthropic) is Anthropic's official Python client for the Claude API. It supports chat completions, tool use, streaming, and structured outputs.

Setup

1. Install packages

pip install anthropic respan-ai openinference-instrumentation-anthropic python-dotenv
2. Set environment variables

export ANTHROPIC_API_KEY="YOUR_ANTHROPIC_API_KEY"
export RESPAN_API_KEY="YOUR_RESPAN_API_KEY"
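Alternatively, since the quickstart below calls load_dotenv(), the same variables can live in a .env file next to your script (a sketch; the variable names match the exports above):

```
ANTHROPIC_API_KEY=YOUR_ANTHROPIC_API_KEY
RESPAN_API_KEY=YOUR_RESPAN_API_KEY
```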
3. Initialize and run

import os
from dotenv import load_dotenv

load_dotenv()

import anthropic
from respan import Respan
from openinference.instrumentation.anthropic import AnthropicInstrumentor

# Initialize Respan with Anthropic instrumentation
respan = Respan(instrumentations=[AnthropicInstrumentor()])

client = anthropic.Anthropic()

message = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Say hello in three languages."}],
)
print(message.content[0].text)
respan.flush()
4. View your trace

Open the Traces page to see your auto-instrumented LLM spans.
This step applies to Tracing and Both setups. The Gateway-only setup does not produce traces.

Configuration

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| api_key | str \| None | None | Falls back to the RESPAN_API_KEY env var. |
| base_url | str \| None | None | Falls back to the RESPAN_BASE_URL env var. |
| instrumentations | list | [] | Plugin instrumentations to activate (e.g. AnthropicInstrumentor()). |
| is_auto_instrument | bool \| None | False | Auto-discover and activate all installed instrumentors via OpenTelemetry entry points. |
| customer_identifier | str \| None | None | Default customer identifier for all spans. |
| metadata | dict \| None | None | Default metadata attached to all spans. |
| environment | str \| None | None | Environment tag (e.g. "production"). |
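Several of these parameters can be combined at initialization. A minimal sketch, assuming the parameters behave as the table above describes (auto-discovery replaces an explicit instrumentations list):

```python
from respan import Respan

# Sketch: auto-discover every installed OpenInference instrumentor via
# OpenTelemetry entry points, and tag all spans with an environment and
# default metadata.
respan = Respan(
    is_auto_instrument=True,
    environment="production",
    metadata={"service": "claude-api"},
)
```

With is_auto_instrument=True, any instrumentor package you pip-install is picked up automatically, so you don't need to update the instrumentations list when you add a new provider SDK.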

Attributes

In Respan()

Set defaults at initialization — these apply to all spans.
from respan import Respan
from openinference.instrumentation.anthropic import AnthropicInstrumentor

respan = Respan(
    instrumentations=[AnthropicInstrumentor()],
    customer_identifier="user_123",
    metadata={"service": "claude-api", "version": "1.0.0"},
)

With propagate_attributes

Override per-request using a context manager.
import anthropic

from respan import Respan, workflow, propagate_attributes
from openinference.instrumentation.anthropic import AnthropicInstrumentor

respan = Respan(instrumentations=[AnthropicInstrumentor()])
client = anthropic.Anthropic()

@workflow(name="handle_request")
def handle_request(user_id: str, question: str):
    with propagate_attributes(
        customer_identifier=user_id,
        thread_identifier="conv_001",
        metadata={"plan": "pro"},
    ):
        message = client.messages.create(
            model="claude-sonnet-4-20250514",
            max_tokens=1024,
            messages=[{"role": "user", "content": question}],
        )
        print(message.content[0].text)

| Attribute | Type | Description |
| --- | --- | --- |
| customer_identifier | str | Identifies the end user in Respan analytics. |
| thread_identifier | str | Groups related messages into a conversation. |
| metadata | dict | Custom key-value pairs. Merged with default metadata. |

Decorators

Use @workflow and @task to create structured trace hierarchies.
import anthropic

from respan import Respan, workflow, task
from openinference.instrumentation.anthropic import AnthropicInstrumentor

respan = Respan(instrumentations=[AnthropicInstrumentor()])
client = anthropic.Anthropic()

@task(name="summarize")
def summarize(text: str) -> str:
    message = client.messages.create(
        model="claude-sonnet-4-20250514",
        max_tokens=1024,
        messages=[
            {"role": "user", "content": f"Summarize: {text}"},
        ],
    )
    return message.content[0].text

@workflow(name="content_pipeline")
def pipeline(topic: str):
    summary = summarize(topic)
    print(summary)

pipeline("The benefits of observability in LLM applications")
respan.flush()

Examples

Basic

message = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Explain quantum computing in one sentence."}],
)
print(message.content[0].text)

Streaming

with client.messages.stream(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Write a haiku about Python."}],
) as stream:
    for text in stream.text_stream:
        print(text, end="", flush=True)

Tool calls

import json

tools = [
    {
        "name": "get_weather",
        "description": "Get the weather for a city.",
        "input_schema": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    }
]

message = client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    tools=tools,
)

for block in message.content:
    if block.type == "tool_use":
        result = f"Sunny, 72F in {block.input['city']}"
        follow_up = client.messages.create(
            model="claude-sonnet-4-20250514",
            max_tokens=1024,
            messages=[
                {"role": "user", "content": "What's the weather in Paris?"},
                {"role": "assistant", "content": message.content},
                {"role": "user", "content": [
                    {"type": "tool_result", "tool_use_id": block.id, "content": result}
                ]},
            ],
            tools=tools,
        )
        print(follow_up.content[0].text)
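The loop above hard-codes a single tool. When an app exposes several tools, a small dispatch table keeps the loop generic. This is a sketch with hypothetical handler names, not part of the Anthropic SDK:

```python
# Hypothetical local handlers keyed by the tool names declared in `tools`.
def get_weather(city: str) -> str:
    # Stub handler; a real implementation would call a weather API.
    return f"Sunny, 72F in {city}"

HANDLERS = {"get_weather": get_weather}

def run_tool(name: str, tool_input: dict) -> str:
    """Execute a tool_use block's named tool against a local handler."""
    handler = HANDLERS.get(name)
    if handler is None:
        return f"Unknown tool: {name}"
    return handler(**tool_input)

print(run_tool("get_weather", {"city": "Paris"}))
# Sunny, 72F in Paris
```

In the loop, `result = run_tool(block.name, block.input)` would then replace the hard-coded string, and adding a tool only requires a new entry in HANDLERS plus its schema in `tools`.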

Gateway

Route all Anthropic calls through the Respan gateway:
import os

import anthropic

client = anthropic.Anthropic(
    api_key=os.getenv("RESPAN_API_KEY"),
    base_url="https://api.respan.ai/api/anthropic",
)
With the gateway, you can switch models across providers and use Respan prompt management features.