Ingest traces from logs

Ingest a batch of spans to construct one or more traces. Use this endpoint to import historical data or to build traces programmatically when SDK instrumentation isn't feasible.

<Note>
If you're starting fresh, we recommend using an SDK/integration (e.g., the OpenAI Agents SDK) to capture traces automatically. This endpoint is best for bulk import and migration workflows.
</Note>

<Tip>
Example project: [GitHub link](https://github.com/respanai/respan-example-projects/tree/main/python/tracing/respan-tracing-sdk/logs-to-trace)
</Tip>

## Body

- `body` *array* **required**: Array of span log objects. Each object corresponds to a span within a trace. Spans with the same `trace_unique_id` are grouped into a single trace, and parent-child relationships are inferred via `span_parent_id`. The fields below align with the sample payload in the logs-to-trace example project.
- `trace_unique_id` *string* **required**: Unique identifier for the trace. All spans with this value are grouped together.
- `span_unique_id` *string* **required**: Unique identifier for the span.
- `span_parent_id` *string*: Parent span ID; omit or set to null for root spans.
- `span_name` *string*: Name of the span (e.g., "openai.chat", "workflow.start").
- `span_workflow_name` *string*: Name of the nearest parent workflow.
- `span_path` *string*: Nested path within the workflow (e.g., "joke_creation.store_joke").
- `start_time` *string*: RFC3339 UTC start timestamp.
- `timestamp` *string*: RFC3339 UTC end/event timestamp.
- `latency` *number*: Latency of the span operation, in seconds.
- `input` *string*: Raw input string, or JSON-serialized string, used by the span.
- `output` *string*: Raw output string, or JSON-serialized string, produced by the span.
- `model` *string*: Model used by the span (e.g., "gpt-3.5-turbo", "gpt-4o-mini").
- `encoding_format` *string*: Embedding encoding format for spans that generate embeddings (e.g., "float").
- `provider_id` *string*: LLM or service provider ID (e.g., "openai").
- `prompt_tokens` *integer*: Number of prompt tokens used (if applicable).
- `completion_tokens` *integer*: Number of completion tokens used (if applicable).
- `cost` *number*: Cost associated with the span (if applicable).
- `metadata` *object*: Custom attributes as a key-value object.
- `warnings` *string*: Warnings or notes captured during span execution.
- `disable_log` *boolean*: Set to true to disable logging for this span in the observability system.
- `disable_fallback` *boolean*: Disable fallback behavior for the span, if supported.
- `respan_params` *object*: Additional Respan parameters (e.g., has_webhook, environment).
- `temperature` *number*: Controls randomness for LLM spans; typical range 0.0–1.0.
- `presence_penalty` *number*: Presence penalty used in some LLM requests.
- `frequency_penalty` *number*: Frequency penalty used in some LLM requests.
- `max_tokens` *integer*: Maximum number of tokens requested for completion/embedding generation.
- `stream` *boolean*: Whether streaming was enabled for the span.
- `prompt_messages` *array*: Messages sent to the LLM (each with a role and content). Present for chat spans.
- `completion_message` *object*: Assistant message returned by the LLM (role/content). Present for chat spans.
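The grouping rules above (spans sharing a `trace_unique_id` form one trace; `span_parent_id` links children to their parents) can be checked client-side before sending a batch. The helper below is an illustrative sketch, not part of any Respan SDK:

```python
from collections import defaultdict

def group_spans(spans):
    """Group span log dicts into traces keyed by trace_unique_id."""
    traces = defaultdict(list)
    for span in spans:
        traces[span["trace_unique_id"]].append(span)
    return dict(traces)

# Minimal payload: two traces, one with a parent-child pair.
payload = [
    {"trace_unique_id": "t-1", "span_unique_id": "root", "span_parent_id": None},
    {"trace_unique_id": "t-1", "span_unique_id": "child", "span_parent_id": "root"},
    {"trace_unique_id": "t-2", "span_unique_id": "root-2", "span_parent_id": None},
]

traces = group_spans(payload)
# "t-1" holds two spans ("child" points at "root"); "t-2" holds one.
```

Validating parent references locally (every `span_parent_id` should name a `span_unique_id` in the same trace) avoids partially ingested traces.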
```python Python
import requests

# Replace YOUR_RESPAN_API_KEY with your Respan API key.
URL = "https://api.respan.ai/v1/traces/ingest"
headers = {
    "Authorization": f"Bearer {YOUR_RESPAN_API_KEY}",
    "Content-Type": "application/json",
}
payload = [
    {
        "trace_unique_id": "a-trace-id",
        "span_unique_id": "root-span-id",
        "span_name": "pirate_joke_plus_audience_reactions.workflow",
        "span_parent_id": None,
        "timestamp": "2025-09-08T07:46:19.041835Z",
        "start_time": "2025-09-08T07:46:14.007279Z",
        "span_workflow_name": "pirate_joke_plus_audience_reactions",
        "span_path": "",
        "provider_id": "",
        "model": "python",
        "input": "{\"args\": [], \"kwargs\": {}}",
        "output": "\"python\"",
        "encoding_format": "float",
        "latency": 5.034556,
        "respan_params": {"has_webhook": False, "environment": "prod"},
        "disable_log": False,
    },
    {
        "trace_unique_id": "a-trace-id",
        "span_unique_id": "child-span-id",
        "span_name": "openai.chat",
        "span_parent_id": "root-span-id",
        "timestamp": "2025-09-08T07:46:14.617987Z",
        "start_time": "2025-09-08T07:46:14.007452Z",
        "span_workflow_name": "pirate_joke_generator",
        "span_path": "joke_creation",
        "provider_id": "openai",
        "model": "gpt-3.5-turbo",
        "input": "[{\"role\": \"assistant\", \"content\": \"Why did the opentelemetry developer go broke?\\n\\n\"}, {\"role\": \"user\", \"content\": \"Tell me a joke about opentelemetry\"}]",
        "output": "{\"role\": \"assistant\", \"content\": \"Why did the opentelemetry developer go broke?\\n\\n\"}",
        "prompt_messages": [
            {"role": "assistant", "content": "Why did the opentelemetry developer go broke?\\n\\n"},
            {"role": "user", "content": "Tell me a joke about opentelemetry"},
        ],
        "completion_message": {"role": "assistant", "content": "Why did the opentelemetry developer go broke?\\n\\n"},
        "encoding_format": "float",
        "prompt_tokens": 15,
        "completion_tokens": 10,
        "cost": 2.25e-05,
        "latency": 0.610535,
        "respan_params": {"has_webhook": False, "environment": "prod"},
        "disable_log": False,
    },
]

resp = requests.post(URL, json=payload, headers=headers)
print(resp.status_code)
print(resp.text)
```

```typescript TypeScript
const URL = "https://api.respan.ai/v1/traces/ingest";
const headers = {
  Authorization: `Bearer ${YOUR_RESPAN_API_KEY}`,
  "Content-Type": "application/json",
};
const payload = [
  {
    trace_unique_id: "a-trace-id",
    span_unique_id: "root-span-id",
    span_name: "pirate_joke_plus_audience_reactions.workflow",
    span_parent_id: null,
    timestamp: "2025-09-08T07:46:19.041835Z",
    start_time: "2025-09-08T07:46:14.007279Z",
    span_workflow_name: "pirate_joke_plus_audience_reactions",
    span_path: "",
    provider_id: "",
    model: "python",
    input: "{\"args\": [], \"kwargs\": {}}",
    output: "\"python\"",
    encoding_format: "float",
    latency: 5.034556,
    respan_params: { has_webhook: false, environment: "prod" },
    disable_log: false,
  },
  {
    trace_unique_id: "a-trace-id",
    span_unique_id: "child-span-id",
    span_name: "openai.chat",
    span_parent_id: "root-span-id",
    timestamp: "2025-09-08T07:46:14.617987Z",
    start_time: "2025-09-08T07:46:14.007452Z",
    span_workflow_name: "pirate_joke_generator",
    span_path: "joke_creation",
    provider_id: "openai",
    model: "gpt-3.5-turbo",
    input: "[{\"role\": \"assistant\", \"content\": \"Why did the opentelemetry developer go broke?\\n\\n\"}, {\"role\": \"user\", \"content\": \"Tell me a joke about opentelemetry\"}]",
    output: "{\"role\": \"assistant\", \"content\": \"Why did the opentelemetry developer go broke?\\n\\n\"}",
    prompt_messages: [
      { role: "assistant", content: "Why did the opentelemetry developer go broke?\\n\\n" },
      { role: "user", content: "Tell me a joke about opentelemetry" },
    ],
    completion_message: { role: "assistant", content: "Why did the opentelemetry developer go broke?\\n\\n" },
    encoding_format: "float",
    prompt_tokens: 15,
    completion_tokens: 10,
    cost: 2.25e-5,
    latency: 0.610535,
    respan_params: { has_webhook: false, environment: "prod" },
    disable_log: false,
  },
];

fetch(URL, {
  method: "POST",
  headers,
  body: JSON.stringify(payload),
})
  .then((r) => r.json())
  .then((d) => console.log(d))
  .catch((e) => console.error(e));
```

```bash cURL
curl -X POST "https://api.respan.ai/v1/traces/ingest" \
  -H "Authorization: Bearer YOUR_RESPAN_API_KEY" \
  -H "Content-Type: application/json" \
  -d '[
    {
      "trace_unique_id": "a-trace-id",
      "span_unique_id": "root-span-id",
      "span_name": "pirate_joke_plus_audience_reactions.workflow",
      "span_parent_id": null,
      "timestamp": "2025-09-08T07:46:19.041835Z",
      "start_time": "2025-09-08T07:46:14.007279Z",
      "span_workflow_name": "pirate_joke_plus_audience_reactions",
      "span_path": "",
      "provider_id": "",
      "model": "python",
      "input": "{\"args\": [], \"kwargs\": {}}",
      "output": "\"python\"",
      "encoding_format": "float",
      "latency": 5.034556,
      "respan_params": { "has_webhook": false, "environment": "prod" },
      "disable_log": false
    },
    {
      "trace_unique_id": "a-trace-id",
      "span_unique_id": "child-span-id",
      "span_name": "openai.chat",
      "span_parent_id": "root-span-id",
      "timestamp": "2025-09-08T07:46:14.617987Z",
      "start_time": "2025-09-08T07:46:14.007452Z",
      "span_workflow_name": "pirate_joke_generator",
      "span_path": "joke_creation",
      "provider_id": "openai",
      "model": "gpt-3.5-turbo",
      "input": "[{\"role\": \"assistant\", \"content\": \"Why did the opentelemetry developer go broke?\\n\\n\"}, {\"role\": \"user\", \"content\": \"Tell me a joke about opentelemetry\"}]",
      "output": "{\"role\": \"assistant\", \"content\": \"Why did the opentelemetry developer go broke?\\n\\n\"}",
      "prompt_messages": [
        { "role": "assistant", "content": "Why did the opentelemetry developer go broke?\\n\\n" },
        { "role": "user", "content": "Tell me a joke about opentelemetry" }
      ],
      "completion_message": { "role": "assistant", "content": "Why did the opentelemetry developer go broke?\\n\\n" },
      "encoding_format": "float",
      "prompt_tokens": 15,
      "completion_tokens": 10,
      "cost": 0.0000225,
      "latency": 0.610535,
      "respan_params": { "has_webhook": false, "environment": "prod" },
      "disable_log": false
    }
  ]'
```

```json
{
  "status": "ok",
  "ingested_spans": 2,
  "created_traces": 1,
  "trace_ids": ["a-trace-id"],
  "errors": []
}
```

<Note>
Prerequisites for successful ingestion:

- Accurate timestamps for each span (`start_time` and/or `timestamp`) to preserve relative timing within a trace.
- Properly assigned trace and span IDs that reflect the correct parent-child relationships: `trace_unique_id` groups spans into a single trace, and `span_parent_id` links children to their parents.
</Note>
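One way to satisfy these prerequisites is to generate IDs with `uuid4` and derive `latency` from the same RFC3339 timestamps you send, so the two can never disagree. A sketch (the timestamps match the example payload; `rfc3339_utc` is a hypothetical helper, not an SDK function):

```python
import uuid
from datetime import datetime, timezone

def rfc3339_utc(dt):
    """Format an aware UTC datetime as RFC3339 with a trailing 'Z'."""
    return dt.isoformat().replace("+00:00", "Z")

start = datetime(2025, 9, 8, 7, 46, 14, 7279, tzinfo=timezone.utc)
end = datetime(2025, 9, 8, 7, 46, 19, 41835, tzinfo=timezone.utc)

span = {
    "trace_unique_id": str(uuid.uuid4()),
    "span_unique_id": str(uuid.uuid4()),
    "start_time": rfc3339_utc(start),       # "2025-09-08T07:46:14.007279Z"
    "timestamp": rfc3339_utc(end),          # "2025-09-08T07:46:19.041835Z"
    "latency": (end - start).total_seconds(),  # 5.034556, as in the example
}
```

Deriving `latency` this way keeps it consistent with the timestamps, which is what preserves relative timing when the trace is rendered.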

Authentication

- `Authorization` *string* **required**: Bearer authentication of the form `Bearer <token>`.

API key authentication. Get your API key from https://platform.respan.ai/platform/api-keys
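In practice, a client would read the key from the environment rather than hard-coding it. A minimal sketch; the variable name `RESPAN_API_KEY` and the fallback value are illustrative, not a documented convention:

```python
import os

# Illustrative: read the key from an environment variable, with a
# placeholder fallback so the snippet runs without configuration.
api_key = os.environ.get("RESPAN_API_KEY", "YOUR_RESPAN_API_KEY")

headers = {
    "Authorization": f"Bearer {api_key}",
    "Content-Type": "application/json",
}
```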

Request

This endpoint expects an array of span log objects (see Body above).

Response

Successful response for Ingest traces from logs
- `status` *string*
- `ingested_spans` *integer*
- `created_traces` *integer*
- `trace_ids` *list of strings*
- `errors` *list of strings*
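After decoding the response body, a caller can branch on these fields to detect partial failures. `summarize_ingest` below is a hypothetical helper, shown against the example response from the endpoint documentation above:

```python
def summarize_ingest(body):
    """Return a one-line summary of an ingest response body."""
    if body.get("errors"):
        return f"{len(body['errors'])} error(s): {body['errors']}"
    return (
        f"ingested {body['ingested_spans']} span(s) "
        f"into {body['created_traces']} trace(s)"
    )

resp_body = {
    "status": "ok",
    "ingested_spans": 2,
    "created_traces": 1,
    "trace_ids": ["a-trace-id"],
    "errors": [],
}
# summarize_ingest(resp_body) -> "ingested 2 span(s) into 1 trace(s)"
```

Checking `errors` even when the HTTP status is 200 guards against batches where only some spans were accepted.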

Errors

401
Unauthorized Error