Prerequisites

  1. Sign up — Create an account at platform.respan.ai
  2. Create an API key — Generate one on the API keys page
  3. Add credits or a provider key — Add credits on the Credits page, or connect your own provider key on the Integrations page
Add the Docs MCP to your AI coding tool to get help building with Respan. No API key needed.
{
  "mcpServers": {
    "respan-docs": {
      "url": "https://respan.ai/docs/mcp"
    }
  }
}

What is Langfuse?

Langfuse is an open-source LLM observability platform. The Respan instrumentor patches Langfuse’s OTLP exporter to redirect traces to Respan — your existing @observe decorators and langfuse.trace() calls continue to work unchanged.
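Conceptually, redirecting an exporter works by monkey-patching its constructor before any instance is created, which is why ordering matters in the setup below. The sketch that follows illustrates the idea with a toy exporter class; the class names and URLs are made up for illustration and are not the actual internals of respan-instrumentation-langfuse.

```python
# Toy illustration of endpoint redirection via monkey-patching.
# OTLPExporter and both URLs are hypothetical stand-ins.

class OTLPExporter:
    """Stand-in for an OTLP span exporter."""
    def __init__(self, endpoint="https://cloud.langfuse.com/api/public/otel"):
        self.endpoint = endpoint

    def export(self, spans):
        return f"sent {len(spans)} span(s) to {self.endpoint}"

def instrument(exporter_cls, new_endpoint):
    """Redirect every future exporter instance to a new endpoint."""
    original_init = exporter_cls.__init__

    def patched_init(self, endpoint=None):
        # Ignore the caller-supplied endpoint and substitute our own.
        original_init(self, endpoint=new_endpoint)

    exporter_cls.__init__ = patched_init

# Patch BEFORE any exporter is constructed -- this mirrors why the
# instrumentor must run before importing langfuse.
instrument(OTLPExporter, "https://api.respan.example/otel")

exporter = OTLPExporter()
print(exporter.export(["span-a", "span-b"]))
# -> sent 2 span(s) to https://api.respan.example/otel
```

Because the patch replaces the constructor itself, any code that later instantiates the exporter (including library internals) picks up the new endpoint without being modified.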

Setup

1. Install packages

pip install langfuse respan-instrumentation-langfuse
2. Set environment variables

export RESPAN_API_KEY="YOUR_RESPAN_API_KEY"
export LANGFUSE_PUBLIC_KEY="YOUR_LANGFUSE_PUBLIC_KEY"
export LANGFUSE_SECRET_KEY="YOUR_LANGFUSE_SECRET_KEY"
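If you prefer to configure credentials in code (for example in a notebook), the same variables can be set with os.environ before instrumenting. The placeholder values below are illustrative:

```python
import os

# Set credentials before instrumenting; replace the placeholders
# with your real keys.
os.environ["RESPAN_API_KEY"] = "YOUR_RESPAN_API_KEY"
os.environ["LANGFUSE_PUBLIC_KEY"] = "YOUR_LANGFUSE_PUBLIC_KEY"
os.environ["LANGFUSE_SECRET_KEY"] = "YOUR_LANGFUSE_SECRET_KEY"
```

Avoid hardcoding real keys in committed code; load them from a secrets manager or a local .env file instead.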
3. Instrument before importing Langfuse

LangfuseInstrumentor().instrument() must be called before importing langfuse. The instrumentor patches the OTLP exporter at import time.
import os
from respan_instrumentation_langfuse import LangfuseInstrumentor

# Instrument BEFORE importing langfuse
LangfuseInstrumentor().instrument()

# Now import and use langfuse normally
from langfuse.decorators import observe
from openai import OpenAI

client = OpenAI()

@observe()
def generate_joke():
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": "Tell me a joke about AI"}],
    )
    return response.choices[0].message.content

@observe()
def joke_pipeline():
    joke = generate_joke()
    return joke

result = joke_pipeline()
print(result)
4. View your trace

Open the Traces page to see your Langfuse traces in Respan.
[Screenshot: Langfuse trace in Respan]

Configuration

See the Langfuse Instrumentor SDK reference for the full API.