LangChain

Use LangChain with Respan
  1. Sign up — Create an account at platform.respan.ai
  2. Create an API key — Generate one on the API keys page
  3. Add credits or a provider key — Add credits on the Credits page or connect your own provider key on the Integrations page

Add the Docs MCP to your AI coding tool to get help building with Respan. No API key needed.

```json
{
  "mcpServers": {
    "respan-docs": {
      "url": "https://docs.respan.ai/mcp"
    }
  }
}
```
This integration is for the Respan gateway.

What is LangChain?

LangChain is a popular framework for building applications with language models. Because Respan exposes an OpenAI-compatible endpoint, you can integrate it with LangChain’s ChatOpenAI chat model with minimal code changes.

Quickstart

Step 1: Install LangChain

```shell
pip install langchain-openai
```

Step 2: Initialize LangChain with Respan

```python
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="https://api.respan.ai/api/",
    api_key="<Your Respan API Key>",
    model="gpt-3.5-turbo",
    streaming=True,
)
```

Step 3: Make Your First Request

```python
response = llm.invoke("Hello, world!")
print(response.content)  # invoke() returns an AIMessage; .content is the text
```

Switch models

```python
# Choose any model available through Respan
model = "gpt-4o"
# model = "claude-3-5-sonnet-20241022"
# model = "gemini-1.5-pro"

llm = ChatOpenAI(
    base_url="https://api.respan.ai/api/",
    api_key="<Your Respan API Key>",
    model=model,
)
```

See the full model list for all available models.

Supported parameters

OpenAI parameters

Respan supports all standard OpenAI parameters. Pass them directly in the ChatOpenAI configuration.

```python
llm = ChatOpenAI(
    base_url="https://api.respan.ai/api/",
    api_key="<Your Respan API Key>",
    model="gpt-4o-mini",
    temperature=0.7,  # Control randomness
    max_tokens=1000,  # Limit response length
    streaming=True,   # Enable streaming
)
```

Respan Parameters

Respan-specific parameters are passed through extra_body, which forwards them in the request body alongside the standard OpenAI fields.

```python
llm = ChatOpenAI(
    base_url="https://api.respan.ai/api/",
    api_key="<Your Respan API Key>",
    model="gpt-4o-mini",
    extra_body={
        "customer_identifier": "user_123",          # Track specific users
        "fallback_models": ["gpt-3.5-turbo"],       # Automatic fallbacks
        "metadata": {"session_id": "abc123"},       # Custom metadata
        "thread_identifier": "conversation_456",    # Group related messages
        "group_identifier": "team_alpha",           # Organize by groups
    },
)
```

Advanced Usage

Using with Chains

```python
from langchain.chains import ConversationChain
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    base_url="https://api.respan.ai/api/",
    api_key="<Your Respan API Key>",
    model="gpt-4o-mini",
)

chain = ConversationChain(llm=llm)
response = chain.run("Tell me about artificial intelligence")
print(response)
```

Next Steps