Langflow

Langflow is a visual framework by DataStax for building multi-agent and RAG applications. It provides a drag-and-drop interface for composing LLM pipelines with components for models, prompts, tools, and data sources. Because Langflow is built on LangChain, every flow execution is auto-traced by Respan, with full observability into component runs and LLM calls — and gateway routing through the OpenAI-compatible Respan endpoint.

Create an account at platform.respan.ai and grab an API key. For gateway use, also add credits or a provider API key.

Run npx @respan/cli setup to configure the integration with your coding agent.

Setup

Langflow is built on LangChain, so tracing uses the LangChain instrumentor.

1. Install packages

$ pip install respan-ai openinference-instrumentation-langchain langflow
2. Set environment variables

$ export OPENAI_API_KEY="YOUR_OPENAI_API_KEY"
$ export RESPAN_API_KEY="YOUR_RESPAN_API_KEY"

OPENAI_API_KEY is used for LLM requests. RESPAN_API_KEY is used to export traces to Respan.

3. Initialize and run

from langflow.load import run_flow_from_json
from respan import Respan
from openinference.instrumentation.langchain import LangChainInstrumentor

respan = Respan(instrumentations=[LangChainInstrumentor()])

result = run_flow_from_json(
    flow="path/to/your/flow.json",
    input_value="What is the meaning of life?",
)
print(result)
respan.flush()
4. View your trace

Open the Traces page to see your Langflow workflow with component-level operations, LLM calls, and pipeline execution.

Configuration

api_key (str | None, default None): Falls back to the RESPAN_API_KEY env var.
base_url (str | None, default None): Falls back to the RESPAN_BASE_URL env var.
instrumentations (list, default []): Plugin instrumentations to activate (e.g. LangChainInstrumentor()).
customer_identifier (str | None, default None): Default customer identifier for all spans.
metadata (dict | None, default None): Default metadata attached to all spans.
environment (str | None, default None): Environment tag (e.g. "production").
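The fallback behavior described above can be sketched in plain Python. This is an illustration of the documented precedence, not SDK source; resolve_api_key is a hypothetical helper:

```python
import os

def resolve_api_key(api_key=None):
    # Sketch of the documented fallback: an explicitly passed api_key
    # wins; otherwise the RESPAN_API_KEY environment variable is used.
    return api_key if api_key is not None else os.getenv("RESPAN_API_KEY")

os.environ["RESPAN_API_KEY"] = "env-key"
print(resolve_api_key())            # falls back to the env var: env-key
print(resolve_api_key("explicit"))  # explicit argument wins: explicit
```

The same precedence applies to base_url and RESPAN_BASE_URL.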

Attributes

In Respan()

Set defaults at initialization — these apply to all spans.

from respan import Respan
from openinference.instrumentation.langchain import LangChainInstrumentor

respan = Respan(
    instrumentations=[LangChainInstrumentor()],
    customer_identifier="user_123",
    metadata={"service": "langflow-api", "version": "1.0.0"},
)

With propagate_attributes

Override per-request using a context scope.

from langflow.load import run_flow_from_json
from respan import Respan, propagate_attributes
from openinference.instrumentation.langchain import LangChainInstrumentor

respan = Respan(instrumentations=[LangChainInstrumentor()])

def handle_request(user_id: str, question: str):
    with propagate_attributes(
        customer_identifier=user_id,
        thread_identifier="conv_abc_123",
        metadata={"plan": "pro"},
    ):
        result = run_flow_from_json(flow="flow.json", input_value=question)
        print(result)
customer_identifier (str): Identifies the end user in Respan analytics.
thread_identifier (str): Groups related messages into a conversation.
metadata (dict): Custom key-value pairs. Merged with default metadata.
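The merge rule for metadata (per-request values layered over the defaults set in Respan()) can be illustrated with plain dict semantics. This is a sketch of the documented behavior, not SDK code; the assumption that per-request keys win on conflict is mine:

```python
# Defaults set once in Respan(); per-request values from propagate_attributes.
default_metadata = {"service": "langflow-api", "version": "1.0.0"}
request_metadata = {"plan": "pro", "version": "2.0.0"}

# Per-request metadata is merged with the defaults; the per-request
# value is assumed to win when the same key appears in both.
merged = {**default_metadata, **request_metadata}
print(merged)
# {'service': 'langflow-api', 'version': '2.0.0', 'plan': 'pro'}
```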