Custom Provider
Set up a custom or self-hosted LLM provider with the Respan gateway.
Set up Respan
- Sign up — Create an account at platform.respan.ai
- Create an API key — Generate one on the API keys page
- Add credits or a provider key — Add credits on the Credits page or connect your own provider key on the Integrations page
Use AI
Add the Docs MCP to your AI coding tool to get help building with Respan. No API key needed.
This section is for Respan LLM gateway users.
Use Respan Gateway to route requests to your own self-hosted or custom LLM provider while keeping unified observability (logs, cost, latency, and reliability metrics) in Respan.
Prerequisites
- A Respan API key
- A running LLM endpoint that exposes an OpenAI-compatible API
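Before connecting your endpoint, it is worth confirming that it really speaks the OpenAI API; listing its models is a quick check. A minimal sketch using only the Python standard library — the base URL and key below are placeholders, not real values:

```python
import json
import urllib.request

def build_models_request(base_url: str, api_key: str) -> urllib.request.Request:
    """Build a GET <base_url>/models request for an OpenAI-compatible server."""
    return urllib.request.Request(
        f"{base_url.rstrip('/')}/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )

def list_models(base_url: str, api_key: str) -> list[str]:
    """Return the model IDs the server advertises via /models."""
    with urllib.request.urlopen(build_models_request(base_url, api_key)) as resp:
        return [m["id"] for m in json.load(resp)["data"]]

# Example against a placeholder endpoint (requires the server to be running):
# list_models("https://my-vllm.example.com/v1", "sk-local-example")
```

If the call returns a list of model IDs, the server exposes the OpenAI-style `/models` route and should work behind the gateway.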
Setup
Create a custom provider
Go to Settings > Integrations on platform.respan.ai and click Add Custom Provider.
Provide:
- Provider name — A display name (e.g. “My vLLM Server”)
- Base URL — The endpoint of your LLM server (e.g. https://my-vllm.example.com/v1)
- API key — The authentication key for your server (if required)
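Once the provider is saved, requests through the gateway can target it like any hosted model. A stdlib-only sketch of an OpenAI-style chat completion call — the gateway URL and model name here are illustrative assumptions, so check your dashboard for the exact values:

```python
import json
import urllib.request

def build_chat_request(gateway_url: str, respan_key: str, model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request addressed to the gateway."""
    payload = {"model": model, "messages": [{"role": "user", "content": prompt}]}
    return urllib.request.Request(
        f"{gateway_url.rstrip('/')}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {respan_key}",
            "Content-Type": "application/json",
        },
    )

def chat(gateway_url: str, respan_key: str, model: str, prompt: str) -> str:
    """Send the request and return the assistant's reply text."""
    with urllib.request.urlopen(build_chat_request(gateway_url, respan_key, model, prompt)) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

# Example (assumed gateway URL; the model name is whatever you configured):
# chat("https://api.respan.ai/v1", "<RESPAN_API_KEY>", "my-vllm-server/llama-3", "Hello!")
```

Because the custom provider is OpenAI-compatible, the same request shape works whether the gateway routes to a hosted provider or your own server, and logs, cost, and latency for these calls appear in Respan.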
API
You can also manage custom providers and models programmatically: