  1. Sign up — Create an account at platform.respan.ai
  2. Create an API key — Generate one on the API keys page
  3. Add credits or a provider key — Add credits on the Credits page or connect your own provider key on the Integrations page

What is LangGraph?

LangGraph is a Python framework for building stateful, multi-step agent workflows as graphs. The Respan integration uses the respan-tracing SDK to capture your graph execution: each node and LLM call is recorded as a span, so you can see the full workflow in chronological order.
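To picture what "each node and LLM call is recorded as a span" means, here is an illustrative, stdlib-only sketch of nested spans (this is a toy recorder, not the Respan SDK): one span for the workflow, one for the node, one for the LLM call, appended as each finishes (innermost first).

```python
import time
from contextlib import contextmanager

spans = []  # (name, start, end) tuples, appended as each span completes

@contextmanager
def span(name):
    # Record one unit of work (a workflow, a node, or an LLM call).
    start = time.monotonic()
    try:
        yield
    finally:
        spans.append((name, start, time.monotonic()))

with span("workflow:chatbot_qa"):
    with span("node:chatbot"):
        with span("llm:invoke"):
            pass  # the model call would happen here

for name, start, end in spans:
    print(name)
```

Because inner spans close first, the list ends with the workflow span; a tracing backend reassembles the hierarchy from the start/end timestamps.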

Setup

1. Install packages

pip install langgraph langchain-anthropic respan-tracing
2. Set environment variables

.env
ANTHROPIC_API_KEY=your-anthropic-api-key
RESPAN_API_KEY=your-respan-api-key
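These variables can be exported in your shell or loaded from the .env file (python-dotenv's `load_dotenv()` is a common choice). A minimal stdlib sketch that checks both keys are present before running, so a missing key fails early rather than mid-run (the helper name is ours, not part of any SDK):

```python
import os

def missing_keys(required=("ANTHROPIC_API_KEY", "RESPAN_API_KEY")):
    """Return the names of any required environment variables that are unset."""
    return [k for k in required if not os.environ.get(k)]

# Warn up front instead of failing on the first API call.
for key in missing_keys():
    print(f"Missing {key}; set it in your shell or load it from .env")
```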
3. Initialize Respan tracing

from respan_tracing.main import RespanTelemetry
from respan_tracing.decorators import task, workflow

ktl = RespanTelemetry()
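The `@task` and `@workflow` decorators wrap a function so every call is recorded as a named span. As a hypothetical stdlib sketch of that mechanism (not the Respan implementation), a tracing decorator runs the wrapped function and records the span name on completion:

```python
import functools

recorded = []  # span names, in call-completion order

def task(name):
    """Toy stand-in for a tracing decorator: record one span per call."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            result = fn(*args, **kwargs)
            recorded.append(name)  # a real SDK would also capture timing and I/O
            return result
        return wrapper
    return decorator

@task(name="chatbot_response")
def respond(text):
    return text.upper()

print(respond("hello"))  # HELLO; the call also records a span
print(recorded)          # ['chatbot_response']
```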
4. Build your graph with traced nodes

from typing import Annotated
from typing_extensions import TypedDict
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages
from langchain_anthropic import ChatAnthropic

class State(TypedDict):
    messages: Annotated[list, add_messages]

llm = ChatAnthropic(model="claude-3-5-sonnet-20240620", temperature=0)

@task(name="chatbot_response")
def chatbot_respond(state: State):
    return {"messages": [llm.invoke(state["messages"])]}

graph_builder = StateGraph(State)
graph_builder.add_node("chatbot", chatbot_respond)
graph_builder.add_edge(START, "chatbot")
graph_builder.add_edge("chatbot", END)
graph = graph_builder.compile()
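To see the control flow of the graph above without an API key, here is a stubbed sketch: the LLM is replaced by a deterministic echo function, and the START → chatbot → END wiring becomes a single linear pass (real LangGraph merges each node's partial return into state via the `add_messages` reducer and supports branching and cycles; this toy merges manually).

```python
# Stub: stands in for ChatAnthropic so the flow runs offline.
def fake_llm(messages):
    return {"role": "assistant", "content": f"echo: {messages[-1]['content']}"}

def chatbot_node(state):
    # Append the model reply to the message list, like the traced node above.
    return {"messages": state["messages"] + [fake_llm(state["messages"])]}

def run_graph(state):
    # One linear edge: START -> chatbot -> END.
    for node in (chatbot_node,):
        state = node(state)
    return state

out = run_graph({"messages": [{"role": "user", "content": "hi"}]})
print(out["messages"][-1]["content"])  # echo: hi
```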
5. Run the graph inside a workflow

def stream_graph_updates(user_input: str):
    for event in graph.stream({"messages": [{"role": "user", "content": user_input}]}):
        for value in event.values():
            print("Assistant:", value["messages"][-1].content)

@workflow(name="chatbot_qa")
def chatbot_qa():
    while True:
        try:
            user_input = input("User: ")
            if user_input.lower() in ["quit", "exit", "q"]:
                print("Goodbye!")
                break
            stream_graph_updates(user_input)
        except (EOFError, KeyboardInterrupt):
            # Fall back to a canned question when interactive input is unavailable.
            user_input = "What do you know about LangGraph?"
            print("User: " + user_input)
            stream_graph_updates(user_input)
            break

if __name__ == "__main__":
    chatbot_qa()
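`graph.stream` yields one event dict per step, keyed by the node name that produced the update, which is why the loop above iterates `event.values()`. A toy generator with the same shape makes this concrete (note: real LangGraph yields message objects with a `.content` attribute; this sketch uses plain dicts):

```python
def fake_stream(user_input):
    # Mimic graph.stream's event shape: {node_name: updated_state_fragment}.
    yield {"chatbot": {"messages": [
        {"role": "user", "content": user_input},
        {"role": "assistant", "content": "ok"},
    ]}}

for event in fake_stream("hello"):
    for value in event.values():
        print("Assistant:", value["messages"][-1]["content"])  # Assistant: ok
```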
6. View your trace

Open the Traces page in the Respan dashboard.

Observability

With this integration, Respan auto-captures:
  • Graph execution — the full workflow as a trace
  • Node calls — each graph node as a span
  • LLM calls — model, input/output messages, token usage
  • Errors — failed nodes and error details
View traces on the Traces page.
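For a sense of what a captured LLM span contains, here is an illustrative shape only; the field names below are our assumptions, not the actual Respan trace schema.

```python
# Hypothetical shape of one captured LLM span (field names are assumptions).
llm_span = {
    "name": "llm:ChatAnthropic",
    "model": "claude-3-5-sonnet-20240620",
    "input": [{"role": "user", "content": "What do you know about LangGraph?"}],
    "output": {"role": "assistant", "content": "..."},
    "usage": {"input_tokens": 12, "output_tokens": 87},
}
print(llm_span["usage"]["input_tokens"] + llm_span["usage"]["output_tokens"])  # 99
```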