Automatic Instrumentation
Auto-instrument popular libraries like OpenAI and Anthropic
Overview
The Respan Tracing SDK can automatically instrument popular LLM libraries, capturing all API calls without manual tracing code.
Supported Libraries
The SDK currently supports automatic instrumentation for the OpenAI and Anthropic client libraries. Additional libraries are planned (see Future Support).
Setup
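A minimal setup sketch. The import path @respan/tracing and the exact options accepted by initialize() are assumptions; substitute the actual Respan Tracing SDK package and configuration.

```typescript
// Hypothetical import path -- replace with the actual Respan Tracing SDK package.
import * as respan from "@respan/tracing";

// Initialize once, at process startup, before any LLM clients are constructed.
respan.initialize({
  apiKey: process.env.RESPAN_API_KEY, // option name is an assumption
});
```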
OpenAI Instrumentation
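A sketch of instrumenting the OpenAI library, assuming a top-level initialize() and an instrumentModules option as described below; the @respan/tracing import path and the openAI key name are illustrative.

```typescript
import OpenAI from "openai";
import * as respan from "@respan/tracing"; // hypothetical import path

// Pass the OpenAI *class* (not an instance) so the SDK can patch it.
respan.initialize({
  instrumentModules: { openAI: OpenAI }, // key name is illustrative
});

// Create the client only after initialize(); its calls are now traced.
const openai = new OpenAI();
const completion = await openai.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "Hello!" }],
});
```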
Anthropic Instrumentation
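The same pattern applies to Anthropic; again, the @respan/tracing import path and the anthropic key name are assumptions.

```typescript
import Anthropic from "@anthropic-ai/sdk";
import * as respan from "@respan/tracing"; // hypothetical import path

respan.initialize({
  instrumentModules: { anthropic: Anthropic }, // key name is illustrative
});

// The traced client is used exactly as usual.
const anthropic = new Anthropic();
const message = await anthropic.messages.create({
  model: "claude-sonnet-4-20250514",
  max_tokens: 256,
  messages: [{ role: "user", content: "Hello!" }],
});
```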
Multi-Provider Instrumentation
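Multiple providers can be instrumented in a single initialize() call by listing each class in instrumentModules (import path and key names assumed as above):

```typescript
import OpenAI from "openai";
import Anthropic from "@anthropic-ai/sdk";
import * as respan from "@respan/tracing"; // hypothetical import path

// Instrument several providers at once; each entry is the library class itself.
respan.initialize({
  instrumentModules: {
    openAI: OpenAI,
    anthropic: Anthropic,
  },
});
```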
What Gets Traced
OpenAI
- Chat Completions: openai.chat.completions.create()
- Streaming: openai.chat.completions.create({ stream: true })
- Embeddings: openai.embeddings.create()
- Images: openai.images.generate()
Captured data:
- Model name
- Messages/prompts
- Response content
- Token usage
- Latency
- Errors
Anthropic
- Messages: anthropic.messages.create()
- Streaming: anthropic.messages.create({ stream: true })
Captured data:
- Model name
- Messages
- Response content
- Token usage
- Latency
- Errors
Configuration Options
Disable Specific Instrumentation
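Assuming instrumentation is opt-in per module, a library is left uninstrumented simply by omitting its class from instrumentModules (import path assumed as above):

```typescript
import OpenAI from "openai";
import * as respan from "@respan/tracing"; // hypothetical import path

// Only OpenAI is patched; Anthropic calls are left uninstrumented.
respan.initialize({
  instrumentModules: { openAI: OpenAI },
});
```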
No Instrumentation
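To run with no auto-instrumentation at all, omit instrumentModules entirely; only spans you create manually are recorded (import path and apiKey option assumed as above):

```typescript
import * as respan from "@respan/tracing"; // hypothetical import path

// No instrumentModules: nothing is patched; only manual tracing is active.
respan.initialize({ apiKey: process.env.RESPAN_API_KEY });
```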
Manual Tracing with Auto-Instrumentation
You can combine auto-instrumentation with manual tracing:
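A sketch of the combined pattern: withWorkflow creates a parent span for your business logic, and the auto-instrumented OpenAI call inside becomes a child span. The withWorkflow signature shown here is an assumption, as are the import path and instrumentModules key name.

```typescript
import OpenAI from "openai";
import * as respan from "@respan/tracing"; // hypothetical import path

respan.initialize({ instrumentModules: { openAI: OpenAI } });
const openai = new OpenAI();

// Manual workflow span wrapping an auto-instrumented LLM call.
const answer = await respan.withWorkflow({ name: "qa-workflow" }, async () => {
  const completion = await openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [{ role: "user", content: "What is tracing?" }],
  });
  return completion.choices[0].message.content;
});
```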
Streaming Support
Auto-instrumentation works with streaming:
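A streaming sketch using the standard OpenAI async-iterator API; the instrumented client aggregates the chunks and finalizes the span (including token usage) when the stream completes. Setup names are assumed as in the earlier examples.

```typescript
import OpenAI from "openai";
import * as respan from "@respan/tracing"; // hypothetical import path

respan.initialize({ instrumentModules: { openAI: OpenAI } });
const openai = new OpenAI();

// Streaming calls are traced like non-streaming ones.
const stream = await openai.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "Tell me a story." }],
  stream: true,
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? "");
}
```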
Error Tracking
Auto-instrumentation captures errors:
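When an instrumented call throws, the span is still recorded with the error details before the exception propagates to your own handler. A sketch, with setup names assumed as above:

```typescript
import OpenAI from "openai";
import * as respan from "@respan/tracing"; // hypothetical import path

respan.initialize({ instrumentModules: { openAI: OpenAI } });
const openai = new OpenAI();

try {
  await openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [{ role: "user", content: "Hi" }],
  });
} catch (err) {
  // The failed call is recorded as an errored span; handle the exception as usual.
  console.error("LLM call failed:", err);
}
```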
Best Practices
- Always pass the library class (not an instance) to instrumentModules
- Initialize auto-instrumentation before creating SDK instances
- Combine auto-instrumentation with manual tracing for complete visibility
- Auto-instrumentation captures all SDK calls within traced contexts
- Use manual tracing for business logic around LLM calls
- Auto-instrumentation has minimal performance overhead
Troubleshooting
Instrumentation Not Working
Ensure you:
- Pass the class to instrumentModules (e.g., OpenAI, not openai)
- Call initialize() before creating SDK instances
- Wrap calls in withWorkflow, withTask, withAgent, or withTool
- Use the latest version of the Respan Tracing SDK
Debug Example
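The checklist above, expressed as code in the correct order; wrong ordering (creating a client before initialize()) is the most common cause of missing traces. The import path and the withTask signature are assumptions.

```typescript
import OpenAI from "openai";
import * as respan from "@respan/tracing"; // hypothetical import path

// 1. Initialize first, passing the class itself (OpenAI, not an instance).
respan.initialize({ instrumentModules: { openAI: OpenAI } });

// 2. Construct clients only after initialize().
const openai = new OpenAI();

// 3. Make calls inside a traced context so spans are recorded and exported.
await respan.withTask({ name: "debug-check" }, () =>
  openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [{ role: "user", content: "ping" }],
  })
);
```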
Future Support
Additional libraries will be supported in future versions. Check the documentation for updates.