  1. Sign up — Create an account at platform.respan.ai
  2. Create an API key — Generate one on the API keys page
  3. Add credits or a provider key — Add credits on the Credits page or connect your own provider key on the Integrations page
Add the Docs MCP to your AI coding tool to get help building with Respan. No API key needed.
{
  "mcpServers": {
    "respan-docs": {
      "url": "https://respan.ai/docs/mcp"
    }
  }
}
This integration is for the Respan gateway.

What is LangChain?

LangChain is a framework for building applications with language models. Because Respan exposes an OpenAI-compatible API, you can route LangChain's ChatOpenAI through Respan with minimal code changes.

Quickstart

Step 1: Install LangChain

pip install langchain-openai

Step 2: Initialize LangChain with Respan

from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="https://api.respan.ai/api/",
    api_key="<Your Respan API Key>",
    model="gpt-3.5-turbo",
    streaming=True,
)

Step 3: Make Your First Request

response = llm.invoke("Hello, world!")
print(response.content)  # invoke returns an AIMessage; .content holds the text
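Since the example above sets streaming=True, you can also consume the response incrementally with llm.stream(...), which yields message chunks as they arrive. A minimal sketch of the consumption loop, using a stubbed chunk iterator in place of a live llm.stream call (the SimpleNamespace chunks are stand-ins, not real LangChain objects):

```python
from types import SimpleNamespace

def fake_stream():
    # Stand-in for llm.stream("Hello, world!"); each chunk carries a .content piece
    for piece in ["Hel", "lo, ", "wor", "ld!"]:
        yield SimpleNamespace(content=piece)

# Consume the stream the same way you would consume llm.stream(...)
full_text = ""
for chunk in fake_stream():
    full_text += chunk.content
    print(chunk.content, end="", flush=True)

print()  # final newline after the last chunk
```

With a real llm.stream call, the loop body is identical: concatenate or display chunk.content as each piece arrives.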

Switch models

# OpenAI
model = "gpt-4o"
# Anthropic
# model = "claude-3-5-sonnet-20241022"
# Google
# model = "gemini-1.5-pro"

llm = ChatOpenAI(
    base_url="https://api.respan.ai/api/",
    api_key="<Your Respan API Key>",
    model=model,
)
See the full model list for all available models.

Supported parameters

OpenAI parameters

All OpenAI chat-completion parameters are supported and can be passed directly in the LangChain configuration.
llm = ChatOpenAI(
    base_url="https://api.respan.ai/api/",
    api_key="<Your Respan API Key>",
    model="gpt-4o-mini",
    temperature=0.7,          # Control randomness
    max_tokens=1000,          # Limit response length
    streaming=True,           # Enable streaming
)

Respan parameters

Respan-specific parameters are passed through extra_body, which merges them into the request body sent to the gateway.
llm = ChatOpenAI(
    base_url="https://api.respan.ai/api/",
    api_key="<Your Respan API Key>",
    model="gpt-4o-mini",
    extra_body={
        "customer_identifier": "user_123",           # Track specific users
        "fallback_models": ["gpt-3.5-turbo"],       # Automatic fallbacks
        "metadata": {"session_id": "abc123"},        # Custom metadata
        "thread_identifier": "conversation_456",     # Group related messages
        "group_identifier": "team_alpha",           # Organize by groups
    }
)
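Conceptually, the OpenAI client merges extra_body fields into the JSON payload alongside the standard parameters. A rough sketch of that merge using plain dicts (not the actual client internals):

```python
import json

# Standard OpenAI-style parameters for the request
standard_params = {
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hi"}],
}

# Respan-specific fields supplied via extra_body
extra_body = {
    "customer_identifier": "user_123",
    "fallback_models": ["gpt-3.5-turbo"],
}

# The request body is the standard parameters plus the extra_body fields
payload = {**standard_params, **extra_body}
print(json.dumps(payload, indent=2))
```

This is why extra_body works for any gateway-specific field: the gateway sees the extra keys in the same request body as the standard ones.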

Advanced Usage

Using with Chains

from langchain.chains import ConversationChain
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="https://api.respan.ai/api/",
    api_key="<Your Respan API Key>",
    model="gpt-4o-mini",
)

# ConversationChain keeps the conversation history in an in-memory buffer by default
chain = ConversationChain(llm=llm)
response = chain.run("Tell me about artificial intelligence")
print(response)
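A conversation chain works by replaying the accumulated history to the model on each turn. The same loop written with plain message lists, where a stub reply function stands in for a live llm call against the gateway:

```python
def stub_reply(messages):
    # Stand-in for llm.invoke(messages); a real call would hit the Respan gateway
    return f"(reply to: {messages[-1]['content']})"

history = []

def converse(user_input):
    history.append({"role": "user", "content": user_input})
    reply = stub_reply(history)  # the full history rides along on every turn
    history.append({"role": "assistant", "content": reply})
    return reply

print(converse("Tell me about artificial intelligence"))
print(converse("Summarize that in one sentence"))
print(len(history))  # 4 messages: two turns of user + assistant
```

Because every turn resends the full history, long conversations grow the token count of each request; this is the behavior the chain's memory buffer automates for you.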

View your analytics

Access your Respan dashboard to see detailed analytics.

Next Steps

User Management

Track user behavior and patterns

Prompt Management

Manage and version your prompts