

Overview

Roark supports OpenTelemetry (OTel) tracing. Send your OTel trace data to Roark to see what happens under the hood of your Voice AI agent, debug latency, inspect tool usage and external API calls, and correlate call execution with your backend traces.

Features

  • LLM traces per call — View LLM spans, tool calls, and model invocations directly on the call detail page. Each call’s Tracing tab shows the full trace tree for that conversation, so you can quickly pinpoint where latency or failures occurred.
  • Central trace explorer — See all traces in one place. Filter by time range, custom tags, or search by span name and attributes. Use this to spot patterns across calls, compare runs, and troubleshoot recurring issues.

Roark Traces view showing agent turns with STT, LLM, and TTS spans

Endpoint and Protocol

  • Protocol: OTLP over HTTPS only
  • Endpoint: Send traces to Roark’s OTel endpoint (see examples below)
  • OTel traces URL: https://api.roark.ai/v1/traces
  • Authorization: Required. Send Authorization: Bearer YOUR_ROARK_API_KEY in the request headers.
  • Role: Roark acts as an OTel Collector
All trace ingestion requests require authentication. Generate an API key in your Roark dashboard and use it in the Authorization: Bearer YOUR_ROARK_API_KEY header.
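Whatever SDK or platform you use, every export ultimately boils down to an authenticated OTLP/HTTP POST. A stdlib-only sketch of that request shape (the empty payload stands in for a serialized OTLP protobuf body, which your OTel SDK produces for you):

```python
import urllib.request

ROARK_TRACES_URL = "https://api.roark.ai/v1/traces"


def build_trace_request(api_key: str, otlp_payload: bytes) -> urllib.request.Request:
    """Build (but don't send) an OTLP/HTTP export request with Roark auth."""
    return urllib.request.Request(
        ROARK_TRACES_URL,
        data=otlp_payload,  # serialized ExportTraceServiceRequest protobuf
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/x-protobuf",  # OTLP/HTTP binary encoding
        },
        method="POST",
    )


req = build_trace_request("YOUR_ROARK_API_KEY", b"")
```

In practice you never build this request by hand, since the OTLP exporters below do it for you, but it makes the two hard requirements visible: the traces URL and the Bearer header.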

Setup Guide

1. Get an API key

Generate an API key in your Roark dashboard. Trace ingestion requires authentication via the Authorization: Bearer YOUR_ROARK_API_KEY header. Create an API key →

2. Choose your integration

Pick the platform you’re using and follow the corresponding setup section below:
  • LiveKit — instrument your LiveKit agent with OpenTelemetry
  • VAPI — traces sync automatically when calls are ingested
  • Custom integration — any other platform via OTLP HTTP

LiveKit

Instrument your LiveKit agent with OpenTelemetry and export traces to Roark. Configure your tracer with the livekit.room.id resource attribute — this is how Roark links your LiveKit room to its traces and shows them on the call detail page. See the LiveKit integration guide for the full webhook setup.

Install the required packages:
pip install opentelemetry-api opentelemetry-sdk opentelemetry-exporter-otlp-proto-http livekit-agents
Example: a simple LiveKit Agent entrypoint with OTel exporting to Roark:
import os
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.resources import Resource
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from livekit.agents import AgentServer, JobContext, cli
from livekit.agents.telemetry import set_tracer_provider


def setup_roark_tracer(ctx: JobContext):
    """Configure OTel to export traces to Roark with livekit.room.id for call correlation."""
    resource = Resource.create({
        "livekit.room.id": ctx.job.room.sid,  # Required: Roark uses this to link traces to the call
        "roark.skip": False,                   # Optional: set to True to filter out these traces
    })
    provider = TracerProvider(resource=resource)
    provider.add_span_processor(
        BatchSpanProcessor(
            OTLPSpanExporter(
                endpoint="https://api.roark.ai/v1/traces",
                headers={"Authorization": f"Bearer {os.environ['ROARK_API_KEY']}"},
            )
        )
    )
    set_tracer_provider(provider)

    # Ensure all pending traces are exported when the agent shuts down
    async def flush_traces():
        provider.force_flush()

    ctx.add_shutdown_callback(flush_traces)


server = AgentServer()


@server.rtc_session(agent_name="my-agent")
async def entrypoint(ctx: JobContext):
    # Initialize tracing early, before starting the agent session
    setup_roark_tracer(ctx)

    # Your agent logic here
    pass


if __name__ == "__main__":
    cli.run_app(server)
Resource attributes:
  • livekit.room.id (required): the LiveKit room SID (ctx.job.room.sid). Roark uses this to link traces to the corresponding call.
  • roark.skip (optional): set to true to tell Roark to filter out traces for this room. Useful for skipping test or internal calls.
If you’re also using the LiveKit webhook integration, you can set roark.skip in both room metadata and OTel resource attributes to skip both call processing and trace ingestion.
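A sketch of the room-metadata side of that flag (LiveKit room metadata is a free-form string; encoding it as a JSON object with a roark.skip key is an assumption about your webhook setup, not a documented schema):

```python
import json


def make_room_metadata(skip_roark: bool) -> str:
    # Encode the same flag carried by the OTel resource attribute into
    # the room's metadata string, so both call processing and trace
    # ingestion can be skipped together for test or internal rooms.
    return json.dumps({"roark.skip": skip_roark})


metadata = make_room_metadata(True)
```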

VAPI

Traces sync automatically. Whenever a call from a selected agent is synced to Roark, its OpenTelemetry traces are synced too. No extra OTLP exporter or instrumentation is needed on your side. See the VAPI integration guide for step-by-step setup.
Make sure Public Logs are enabled in your Vapi dashboard. Roark requires public log access to ingest trace data from Vapi calls.

Custom Integration

For any other platform, use the OTLP HTTP trace exporter and point it to Roark’s OTel endpoint. Include the required Authorization: Bearer YOUR_ROARK_API_KEY header and attach the required resource attributes.
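If you prefer not to configure the exporter in code, most OpenTelemetry SDKs also honor the standard OTLP environment variables. A sketch (the variable names come from the OTel specification; some SDKs require the space in the header value to be percent-encoded as %20):

```shell
# Standard OTLP exporter environment variables (OTel spec), an
# alternative to configuring the exporter programmatically.
export OTEL_EXPORTER_OTLP_TRACES_ENDPOINT="https://api.roark.ai/v1/traces"
export OTEL_EXPORTER_OTLP_TRACES_HEADERS="Authorization=Bearer YOUR_ROARK_API_KEY"
export OTEL_EXPORTER_OTLP_PROTOCOL="http/protobuf"
```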

Correlating traces to a call or chat

If you submit calls or chats to Roark via the Customer API, tag your traces with roark.external_id so Roark can link each conversation to its trace automatically.
  1. When creating a call or chat via POST /call or POST /chat, supply the externalId field with a stable identifier from your own system (session ID, conversation ID, etc.). It must be unique within the project.
  2. On your OpenTelemetry traces, set roark.external_id to the same value — either as a resource attribute (propagates to every span in the service) or as a span attribute on the root span.
Roark looks up the matching trace in ClickHouse after the call/chat is created and backfills the trace ID, so the conversation appears in its Tracing tab automatically.
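The two steps above reduce to using one identifier in two places. A stdlib-only sketch (the single-field payload is illustrative, not the full POST /call schema):

```python
import uuid

session_id = f"session-{uuid.uuid4()}"  # a stable ID from your own system

# Step 1: include it as externalId when creating the call or chat
# (alongside the rest of the POST /call body).
call_payload = {"externalId": session_id}

# Step 2: set the same value as an OTel resource (or root-span) attribute.
resource_attributes = {"roark.external_id": session_id}
```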
npm install @opentelemetry/sdk-trace-node @opentelemetry/sdk-trace-base @opentelemetry/exporter-trace-otlp-http @opentelemetry/resources
import { NodeTracerProvider } from "@opentelemetry/sdk-trace-node";
import { BatchSpanProcessor } from "@opentelemetry/sdk-trace-base";
import { OTLPTraceExporter } from "@opentelemetry/exporter-trace-otlp-http";
import { Resource } from "@opentelemetry/resources";

const exporter = new OTLPTraceExporter({
  url: "https://api.roark.ai/v1/traces",
  headers: { Authorization: `Bearer ${process.env.ROARK_API_KEY}` },
});

// `roark.external_id` is the correlation key linking this trace to the
// call/chat you create with the same `externalId` via the Customer API.
const resource = new Resource({
  "roark.external_id": yourSessionId,
  "roark.project.tag.env": process.env.ROARK_ENV ?? "production",
});

const provider = new NodeTracerProvider({ resource });
provider.addSpanProcessor(new BatchSpanProcessor(exporter));
provider.register();
Resource attributes:
  • roark.external_id (recommended): a stable identifier from your own system (e.g. session/conversation ID). Must match the externalId field you submit on the corresponding call or chat via the Customer API, and must be unique within a project.
  • roark.project.tag.{key} (optional): custom project tags for filtering and grouping in the trace explorer.
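One convenient way to produce the roark.project.tag.{key} attributes is to flatten a plain dict of tags into resource attributes; a small sketch (the tag names env and region are examples, not required keys):

```python
def project_tag_attributes(tags: dict[str, str]) -> dict[str, str]:
    """Flatten custom tags into roark.project.tag.{key} resource attributes."""
    return {f"roark.project.tag.{key}": value for key, value in tags.items()}


attrs = project_tag_attributes({"env": "production", "region": "eu-west-1"})
```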

Once your integration is set up, you’re ready to send traces to Roark. You’ll be able to view them on the Traces page and directly within each call’s detail page.

What’s Next

  • Live monitoring — view and debug active calls
  • Metrics & reports — define and analyze call metrics
  • Dashboards — build observability views
  • API reference — explore Roark API endpoints

Related: LiveKit integration · VAPI integration · Integrations overview