# BeeAI

BeeAI is an open-source framework for building, deploying, and serving production-ready AI agents. It provides a structured approach to agent development with built-in memory, tools, and LLM integration. Respan gives you full observability over every agent run, tool call, and LLM generation — and gateway routing through the OpenAI-compatible Respan endpoint.

Create an account at platform.respan.ai and grab an API key. To use the gateway, also add credits or a provider key.

Run `npx @respan/cli setup` to configure Respan with your coding agent.

## Setup

### 1. Install packages

```shell
pip install respan-ai openinference-instrumentation-beeai beeai-framework
```
### 2. Set environment variables

```shell
export OPENAI_API_KEY="YOUR_OPENAI_API_KEY"
export RESPAN_API_KEY="YOUR_RESPAN_API_KEY"
```

`OPENAI_API_KEY` is used for LLM requests; `RESPAN_API_KEY` is used to export traces to Respan.
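The key-resolution order can be pictured as a small sketch: an explicitly passed key wins, otherwise the client reads the environment variable. This is an illustration of the documented fallback, not Respan's actual code:

```python
import os

# Hypothetical sketch of the documented fallback: an explicit
# api_key argument takes precedence; otherwise RESPAN_API_KEY is used.
def resolve_api_key(explicit_key=None):
    return explicit_key or os.environ.get("RESPAN_API_KEY")

os.environ["RESPAN_API_KEY"] = "sk-from-env"
print(resolve_api_key())               # falls back to the env var
print(resolve_api_key("sk-explicit"))  # explicit argument wins
```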

### 3. Initialize and run

```python
from beeai import Agent
from respan import Respan
from openinference.instrumentation.beeai import BeeAIInstrumentor

# Register the BeeAI instrumentation so every agent run is traced
respan = Respan(instrumentations=[BeeAIInstrumentor()])

agent = Agent(
    model="gpt-4.1-nano",
    instructions="You are a helpful research assistant.",
)

response = agent.run("Explain the benefits of AI observability in production.")
print(response)

# Ensure buffered spans are exported before the process exits
respan.flush()
```
### 4. View your trace

Open the Traces page to see your BeeAI agent trace with execution loops, tool calls, and LLM generations.

## Configuration

| Parameter | Type | Default | Description |
| --- | --- | --- | --- |
| `api_key` | `str \| None` | `None` | Falls back to the `RESPAN_API_KEY` env var. |
| `base_url` | `str \| None` | `None` | Falls back to the `RESPAN_BASE_URL` env var. |
| `instrumentations` | `list` | `[]` | Plugin instrumentations to activate (e.g. `BeeAIInstrumentor()`). |
| `customer_identifier` | `str \| None` | `None` | Default customer identifier for all spans. |
| `metadata` | `dict \| None` | `None` | Default metadata attached to all spans. |
| `environment` | `str \| None` | `None` | Environment tag (e.g. `"production"`). |
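Putting the parameters together, a fully configured client might look like the sketch below. The `api_key` and `base_url` values are illustrative placeholders, not real endpoints:

```python
from respan import Respan
from openinference.instrumentation.beeai import BeeAIInstrumentor

respan = Respan(
    api_key="YOUR_RESPAN_API_KEY",        # or rely on the RESPAN_API_KEY env var
    base_url="https://example.invalid",   # placeholder; falls back to RESPAN_BASE_URL
    instrumentations=[BeeAIInstrumentor()],
    customer_identifier="user_123",
    metadata={"service": "beeai-api"},
    environment="production",
)
```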

## Attributes

### In `Respan()`

Set defaults at initialization; these apply to all spans.

```python
from respan import Respan
from openinference.instrumentation.beeai import BeeAIInstrumentor

respan = Respan(
    instrumentations=[BeeAIInstrumentor()],
    customer_identifier="user_123",
    metadata={"service": "beeai-api", "version": "1.0.0"},
)
```

### With `propagate_attributes`

Override per-request using a context scope.

```python
from respan import Respan, propagate_attributes
from openinference.instrumentation.beeai import BeeAIInstrumentor

respan = Respan(instrumentations=[BeeAIInstrumentor()])

def handle_request(user_id: str, message: str):
    # `agent` is the BeeAI Agent created during setup (see step 3)
    with propagate_attributes(
        customer_identifier=user_id,
        thread_identifier="conv_abc_123",
        metadata={"plan": "pro"},
    ):
        print(agent.run(message))
```

| Attribute | Type | Description |
| --- | --- | --- |
| `customer_identifier` | `str` | Identifies the end user in Respan analytics. |
| `thread_identifier` | `str` | Groups related messages into a conversation. |
| `metadata` | `dict` | Custom key-value pairs. Merged with default metadata. |
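The metadata merge can be pictured as a shallow dictionary merge where per-request keys override the defaults set in `Respan()`. Last-write-wins precedence is an assumption here, though it is the typical behavior; this is an illustration, not Respan's actual code:

```python
# Hypothetical illustration of metadata merging; not Respan's internals.
default_metadata = {"service": "beeai-api", "plan": "free"}
request_metadata = {"plan": "pro"}

# Shallow merge: per-request values override defaults with the same key.
merged = {**default_metadata, **request_metadata}
print(merged)  # {'service': 'beeai-api', 'plan': 'pro'}
```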