The Respan LangChain integration routes LLM calls through the Respan gateway using LangChain's ChatOpenAI class. Point the base URL and API key at the gateway; your existing chains, agents, and tools continue to work unchanged.
Access 250+ models through a single integration. Switch between GPT-4o, Claude 3.5 Sonnet, Gemini 1.5 Pro, and more by changing the model parameter. The gateway automatically logs all requests.
Use extra_body to pass Respan-specific parameters like customer_identifier for user tracking, fallback_models for automatic failover, and metadata for custom tagging across all your LangChain workflows.
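The gateway-specific fields travel as a plain dict. A sketch of assembling that payload, using the parameter names from this doc with hypothetical values (`respan_extra_body` is an illustrative helper, not a library function):

```python
def respan_extra_body(customer_id: str, fallbacks=None, metadata=None) -> dict:
    """Assemble the Respan-specific payload passed via extra_body."""
    body = {"customer_identifier": customer_id}  # per-user tracking
    if fallbacks:
        body["fallback_models"] = list(fallbacks)  # automatic failover order
    if metadata:
        body["metadata"] = dict(metadata)  # custom tagging
    return body

extra = respan_extra_body(
    "user-123",
    fallbacks=["claude-3-5-sonnet-20241022"],
    metadata={"session": "onboarding"},
)
# Pass as ChatOpenAI(..., model_kwargs={"extra_body": extra})
```

Omitted fields are simply left out of the request, so the same helper works for workflows that only need user tracking.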
Install langchain-openai and configure ChatOpenAI with the Respan gateway base URL (https://api.keywordsai.co/api/) and your Respan API key.
The gateway proxies requests to the underlying provider, logs everything, and returns responses in the standard format. All LangChain features - chains, agents, tools, streaming - work natively.
Pass Respan gateway parameters through model_kwargs or extra_body for user tracking, fallback routing, and custom metadata. Build complex multi-turn dialogue systems with full observability.
```python
from langchain_openai import ChatOpenAI

# Point ChatOpenAI at the Respan gateway instead of OpenAI directly.
llm = ChatOpenAI(
    base_url="https://api.keywordsai.co/api/",
    api_key="YOUR_RESPAN_API_KEY",
    model="gpt-4o",
    streaming=True,
    model_kwargs={
        "extra_body": {
            "customer_identifier": "user-123",  # per-user tracking
            "fallback_models": ["claude-3-5-sonnet-20241022"],  # automatic failover
        }
    },
)

response = llm.invoke("What is the meaning of life?")
```