# Set up Respan
- Sign up — Create an account at platform.respan.ai
- Create an API key — Generate one on the API keys page
- Add credits or a provider key — Add credits on the Credits page or connect your own provider key on the Integrations page
## What is Vercel AI?

The Vercel AI SDK is a TypeScript toolkit for building AI-powered applications with Next.js. This guide shows how to set up Respan with the Vercel AI SDK for both tracing and gateway routing.

## Setup
### Set environment variables

Add your Respan credentials and your provider key (OpenAI, Anthropic, or Google Gemini) to `.env.local`:
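A sketch of `.env.local` — the variable names below are illustrative (use the names shown on your Respan dashboard and the standard names for your provider SDK), and you only need the key for the provider(s) you use:

```bash
# Respan credentials (variable name is illustrative)
RESPAN_API_KEY=your-respan-api-key

# Provider keys — include whichever provider(s) you use
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
GOOGLE_GENERATIVE_AI_API_KEY=...
```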
### Set up OpenTelemetry instrumentation

Create `instrumentation.ts` in your project root (where `package.json` lives):
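A minimal sketch, assuming the `RespanExporter` described in the Configuration section below is exported from a Respan SDK package (the import path here is a placeholder) and registered with Next.js via `@vercel/otel`:

```typescript
// instrumentation.ts
import { registerOTel } from "@vercel/otel";
// Placeholder import path — use the actual Respan package name
import { RespanExporter } from "@respan/exporter";

export function register() {
  registerOTel({
    serviceName: "my-next-app",
    // Spans emitted by the AI SDK are exported to Respan
    traceExporter: new RespanExporter({
      apiKey: process.env.RESPAN_API_KEY!,
    }),
  });
}
```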
### Enable telemetry in your route

In your API route (e.g. `app/api/chat/route.ts`), enable telemetry on the AI SDK call:
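A sketch using the OpenAI provider (swap in `@ai-sdk/anthropic` or `@ai-sdk/google` as needed); `experimental_telemetry` is the AI SDK's built-in telemetry switch, and the model name here is only an example:

```typescript
// app/api/chat/route.ts
import { openai } from "@ai-sdk/openai";
import { streamText } from "ai";

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai("gpt-4o-mini"),
    messages,
    // Spans from this call flow through the exporter
    // registered in instrumentation.ts
    experimental_telemetry: { isEnabled: true },
  });

  return result.toDataStreamResponse();
}
```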
### Run and verify

Start your dev server and make some chat requests. Then open the Traces page to confirm requests are being traced.
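For example, assuming a default Next.js setup and the chat route at `/api/chat`:

```bash
npm run dev

# In another terminal, send a test request
curl http://localhost:3000/api/chat \
  -H "Content-Type: application/json" \
  -d '{"messages":[{"role":"user","content":"Hello"}]}'
```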
## Configuration

The `RespanExporter` constructor accepts:

| Parameter | Required | Description |
|---|---|---|
| `apiKey` | Yes | Your Respan API key. |
| `baseUrl` | No | Respan API base URL. Defaults to `https://api.respan.ai`. |
| `debug` | No | Enable debug logging. Defaults to `false`. |
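Putting the options together (the import path is a placeholder for the actual Respan package):

```typescript
import { RespanExporter } from "@respan/exporter"; // placeholder package name

const exporter = new RespanExporter({
  apiKey: process.env.RESPAN_API_KEY!, // required
  baseUrl: "https://api.respan.ai",    // optional, shown with its default
  debug: true,                         // optional, defaults to false
});
```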
## Attributes

Attach Respan-specific parameters to your traces via the `experimental_telemetry` option on any AI SDK call.

### Via metadata
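A sketch passing the supported attributes (listed below) through the AI SDK's telemetry metadata; how Respan maps these keys on its side is an assumption:

```typescript
import { openai } from "@ai-sdk/openai";
import { generateText } from "ai";

const { text } = await generateText({
  model: openai("gpt-4o-mini"),
  prompt: "Hello!",
  experimental_telemetry: {
    isEnabled: true,
    // Respan-specific attributes attached to this trace
    metadata: {
      customer_identifier: "user_123",
      thread_id: "thread_456",
    },
  },
});
```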
### Via header

Encode parameters as a base64 JSON header for full control.

### Supported attributes

| Attribute | Description |
|---|---|
| `customer_identifier` | Customer or user identifier |
| `thread_id` | Thread or conversation identifier |
| `metadata` | Custom key-value pairs attached to the trace |
| `prompt_unit_price` | Custom input token price |
| `completion_unit_price` | Custom output token price |
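The base64 header value from the "Via header" section above can be built like this; the header name in the comment is a placeholder (check your Respan dashboard for the actual one):

```typescript
// Encode Respan parameters as a base64 JSON string for use as a header value
function encodeRespanParams(params: Record<string, unknown>): string {
  return Buffer.from(JSON.stringify(params)).toString("base64");
}

const headerValue = encodeRespanParams({
  customer_identifier: "user_123",
  metadata: { plan: "pro" },
});

// Attach it to outgoing requests, e.g.:
// headers: { "X-Respan-Params": headerValue }  // header name is illustrative
```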
## Gateway

Route LLM calls through the Respan gateway for automatic logging, fallbacks, and cost optimization. Override the `baseURL` in your provider SDK to point at Respan.
### Compatibility

| SDK helper | Works via Respan? | Switch models? |
|---|---|---|
| `@ai-sdk/openai` | Yes | Yes |
| `@ai-sdk/anthropic` | Yes (Anthropic models only) | No |
| `@ai-sdk/google` | Yes | Yes |
### Gateway examples

Create the provider with the Respan gateway as its `baseURL`, whichever helper you use (OpenAI, Anthropic, or Google Gemini):
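A sketch of the override for each provider helper; the gateway URL below is an assumption — use the endpoint from your Respan dashboard:

```typescript
import { createOpenAI } from "@ai-sdk/openai";
import { createAnthropic } from "@ai-sdk/anthropic";
import { createGoogleGenerativeAI } from "@ai-sdk/google";

// Illustrative gateway URL — use the endpoint from your Respan dashboard
const RESPAN_GATEWAY_URL = "https://api.respan.ai/api";

const openai = createOpenAI({
  baseURL: RESPAN_GATEWAY_URL,
  apiKey: process.env.RESPAN_API_KEY,
});

const anthropic = createAnthropic({
  baseURL: RESPAN_GATEWAY_URL,
  apiKey: process.env.RESPAN_API_KEY,
});

const google = createGoogleGenerativeAI({
  baseURL: RESPAN_GATEWAY_URL,
  apiKey: process.env.RESPAN_API_KEY,
});
```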
### Passing Respan parameters via gateway

To attach Respan parameters (like `customer_identifier`) when using the gateway, encode them as a base64 header:
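A sketch combining the gateway override with a base64-encoded parameter header; both the gateway URL and the header name are placeholders:

```typescript
import { createOpenAI } from "@ai-sdk/openai";
import { generateText } from "ai";

const params = { customer_identifier: "user_123", thread_id: "thread_456" };

const openai = createOpenAI({
  baseURL: "https://api.respan.ai/api", // illustrative gateway URL
  apiKey: process.env.RESPAN_API_KEY,
  headers: {
    // Header name is illustrative — check Respan's docs for the actual one
    "X-Respan-Params": Buffer.from(JSON.stringify(params)).toString("base64"),
  },
});

const { text } = await generateText({
  model: openai("gpt-4o-mini"),
  prompt: "Hello!",
});
```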
## Observability

With this integration, Respan auto-captures:

- AI model calls — requests made via the Vercel AI SDK
- Token usage — input and output token counts
- Performance metrics — latency and throughput
- Errors — failed requests and error details
- Custom metadata — additional context attached via telemetry metadata/headers