AutoGen

Trace AutoGen multi-agent conversations with Respan.

  1. Sign up — Create an account at platform.respan.ai
  2. Create an API key — Generate one on the API keys page
  3. Add credits or a provider key — Add credits on the Credits page or connect your own provider key on the Integrations page

Add the Docs MCP to your AI coding tool to get help building with Respan. No API key needed.

{
  "mcpServers": {
    "respan-docs": {
      "url": "https://docs.respan.ai/mcp"
    }
  }
}

What is AutoGen?

AutoGen is Microsoft’s framework for building multi-agent conversational AI systems. It enables the creation of agents that can converse with each other to solve tasks, with support for human-in-the-loop patterns.
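To make the conversation pattern concrete, here is a framework-free sketch of the turn-taking loop that AutoGen automates: two agents alternate messages until a turn limit is reached. The `Agent` class and `run_chat` helper below are illustrative stand-ins, not AutoGen's API.

```python
# Illustrative only: a minimal turn-taking loop in plain Python,
# mimicking the conversation pattern AutoGen manages for you.
# The Agent class here is a stand-in, not AutoGen's API.

class Agent:
    def __init__(self, name, reply_fn):
        self.name = name
        self.reply_fn = reply_fn  # produces this agent's next message

    def reply(self, message):
        return self.reply_fn(message)

def run_chat(sender, receiver, opening_message, max_turns=3):
    """Alternate messages between two agents, recording the transcript."""
    transcript = [(sender.name, opening_message)]
    message = opening_message
    for _ in range(max_turns):
        message = receiver.reply(message)
        transcript.append((receiver.name, message))
        sender, receiver = receiver, sender  # swap roles each turn
    return transcript

assistant = Agent("assistant", lambda m: f"Answering: {m}")
user_proxy = Agent("user_proxy", lambda m: "Looks good, continue.")
history = run_chat(user_proxy, assistant, "Write a fibonacci function.", max_turns=2)
```

In real AutoGen, `initiate_chat` plays the role of `run_chat`, and each `reply` may involve an LLM call, code execution, or human input.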

Setup

1. Install packages

$ pip install respan-ai openinference-instrumentation-autogen-agentchat pyautogen
2. Set environment variables

$ export RESPAN_API_KEY="YOUR_RESPAN_API_KEY"
$ export OPENAI_API_KEY="YOUR_OPENAI_API_KEY"
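Missing credentials tend to surface later as confusing runtime errors, so a quick preflight check can fail fast. The variable names below come from the export commands above; the helper itself is just a suggested convenience.

```python
import os

# The two keys required by the setup steps above.
REQUIRED_VARS = ["RESPAN_API_KEY", "OPENAI_API_KEY"]

def missing_env(required=REQUIRED_VARS):
    """Return the names of any required variables that are unset or empty."""
    return [name for name in required if not os.getenv(name)]

missing = missing_env()
if missing:
    print("Missing environment variables:", ", ".join(missing))
```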
3. Initialize and run

import os
from dotenv import load_dotenv

load_dotenv()

from respan import Respan
from respan_instrumentation_openinference import OpenInferenceInstrumentor
from openinference_instrumentation_autogen_agentchat import AutoGenInstrumentor
import autogen

# Initialize Respan with AutoGen instrumentation
respan = Respan(
    instrumentations=[
        OpenInferenceInstrumentor(instrumentor=AutoGenInstrumentor())
    ]
)

# Configure the LLM
config_list = [{"model": "gpt-4o-mini", "api_key": os.getenv("OPENAI_API_KEY")}]

# Create an assistant agent
assistant = autogen.AssistantAgent(
    name="assistant",
    llm_config={"config_list": config_list},
    system_message="You are a helpful AI assistant.",
)

# Create a user proxy agent that auto-replies and can execute code
user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",
    max_consecutive_auto_reply=3,
    code_execution_config={"work_dir": "coding", "use_docker": False},
)

# Start the conversation
user_proxy.initiate_chat(
    assistant,
    message="Write a Python function to calculate the Fibonacci sequence.",
)

# Ensure all buffered spans are exported before the process exits
respan.flush()
4. View your trace

Open the Traces page to see your AutoGen conversation with agent messages, LLM calls, and code execution.

What gets traced

All AutoGen operations are auto-instrumented:

  • Multi-agent conversations and message passing
  • LLM calls with model, tokens, and input/output
  • Code generation and execution
  • Tool and function calls
  • Agent-to-agent delegation
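These operations typically arrive as a tree of nested spans: the conversation at the root, with per-agent messages, LLM calls, and code executions as children. The span names, attributes, and token counts below are a simplified, assumed illustration of that hierarchy, not Respan's exact schema.

```python
# Simplified, assumed illustration of how a traced AutoGen run nests.
# Span names and attributes are illustrative, not Respan's actual schema.
from dataclasses import dataclass, field

@dataclass
class Span:
    name: str
    attributes: dict = field(default_factory=dict)
    children: list = field(default_factory=list)

trace = Span("autogen.conversation", children=[
    Span("agent.user_proxy.message", {"content": "Write a fibonacci function."}),
    Span("agent.assistant.reply", children=[
        Span("llm.call", {"model": "gpt-4o-mini"}),
    ]),
    Span("code.execution", {"work_dir": "coding"}),
])

def count_spans(span):
    """Total spans in the tree, root included."""
    return 1 + sum(count_spans(child) for child in span.children)
```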

Traces appear in the Traces dashboard.

Learn more