A minimal Next.js chat app instrumented with OpenTelemetry, sending traces to Pydantic Logfire.
Companion repo for the blog post "Vercel AI SDK and Logfire: what happens when everyone speaks OpenTelemetry".
- `@vercel/otel` for OTel instrumentation in Next.js
- Vercel AI SDK's `experimental_telemetry` for LLM call tracing
- Streaming chat with `streamText` and `useChat`
- Standard OTLP export to Logfire (works with any OTLP-compatible backend)
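All of the OTel wiring lives in `src/instrumentation.ts`. A minimal sketch of what that setup looks like with `@vercel/otel` (the service name here is an assumption; check the file itself):

```typescript
// src/instrumentation.ts — sketch; the service name is an assumption
import { registerOTel } from '@vercel/otel';

export function register() {
  // Registers a tracer provider and OTLP exporter configured via
  // the standard OTEL_EXPORTER_OTLP_* environment variables.
  registerOTel({ serviceName: 'nextjs-logfire-chat' });
}
```

Next.js calls `register()` once at server startup, so every request handler (including the chat route) is traced without further changes.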
1. Install dependencies:

   ```bash
   npm install
   ```

2. Copy `.env.local.example` to `.env.local` and fill in your keys:

   ```bash
   cp .env.local.example .env.local
   ```

   You'll need:
   - An OpenAI API key (`OPENAI_API_KEY`)
   - A Logfire write token

3. Run the dev server:

   ```bash
   npm run dev
   ```

   Open http://localhost:3000 and send a message. Traces will appear in your Logfire project.
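The OTLP export is configured entirely through standard OpenTelemetry environment variables. A sketch of what `.env.local` might contain — the exact Logfire endpoint and header format are assumptions, so defer to `.env.local.example` and the Logfire docs:

```
OPENAI_API_KEY=sk-...

# Standard OTLP env vars picked up by @vercel/otel.
# Endpoint URL and header format below are assumptions — check the Logfire docs.
OTEL_EXPORTER_OTLP_ENDPOINT=https://logfire-api.pydantic.dev
OTEL_EXPORTER_OTLP_HEADERS=Authorization=<your-logfire-write-token>
```

Because only standard `OTEL_EXPORTER_OTLP_*` variables are involved, pointing the same app at any other OTLP-compatible backend is just a matter of swapping these two values.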
```
src/
  instrumentation.ts     # OTel setup (3 lines)
  app/
    page.tsx             # Chat UI (useChat hook)
    api/chat/route.ts    # Streaming endpoint with experimental_telemetry
    layout.tsx           # Root layout
```
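For orientation, a sketch of what the streaming endpoint in `api/chat/route.ts` might look like. The model id and `functionId` are assumptions, and the response helper name varies across AI SDK versions:

```typescript
// src/app/api/chat/route.ts — sketch, not the repo's exact code
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai('gpt-4o-mini'), // assumed model id
    messages,
    // This flag is what emits the LLM-call spans that @vercel/otel exports
    experimental_telemetry: { isEnabled: true, functionId: 'chat' },
  });

  // Helper name differs between AI SDK major versions
  // (e.g. toDataStreamResponse in v4)
  return result.toDataStreamResponse();
}
```

With `isEnabled: true`, each LLM call becomes a span carrying model, token-usage, and prompt attributes, nested under the request span created by the instrumentation.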