pydantic/logfire-vercel-ai-demo
# Vercel AI SDK + Logfire Demo

A minimal Next.js chat app instrumented with OpenTelemetry, sending traces to Pydantic Logfire.

Companion repo for the blog post *Vercel AI SDK and Logfire: what happens when everyone speaks OpenTelemetry*.

## What this demonstrates

- `@vercel/otel` for OTel instrumentation in Next.js
- The Vercel AI SDK's `experimental_telemetry` option for LLM call tracing
- Streaming chat with `streamText` and `useChat`
- Standard OTLP export to Logfire (works with any OTLP-compatible backend)
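
A rough sketch of what a streaming endpoint with telemetry enabled looks like (the model choice, `functionId` label, and response helper here are illustrative assumptions; `src/app/api/chat/route.ts` in this repo is the real code):

```typescript
// Sketch of a route handler in the style of app/api/chat/route.ts
import { openai } from '@ai-sdk/openai';
import { streamText } from 'ai';

export async function POST(req: Request) {
  const { messages } = await req.json();

  const result = streamText({
    model: openai('gpt-4o-mini'), // illustrative model choice
    messages,
    experimental_telemetry: {
      isEnabled: true,    // emit OTel spans for this LLM call
      functionId: 'chat', // hypothetical label attached to the spans
    },
  });

  // Stream tokens back in the format the useChat hook consumes
  return result.toDataStreamResponse();
}
```

With `isEnabled: true`, each call produces spans carrying model, token usage, and timing attributes, which the OTLP exporter then ships to Logfire alongside the rest of the app's traces.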

## Setup

1. Install dependencies:

   ```bash
   npm install
   ```

2. Copy `.env.local.example` to `.env.local` and fill in your keys:

   ```bash
   cp .env.local.example .env.local
   ```

   You'll need:

   - An OpenAI API key (`OPENAI_API_KEY`)
   - A Logfire write token

3. Run the dev server:

   ```bash
   npm run dev
   ```

Open http://localhost:3000 and send a message. Traces will appear in your Logfire project.
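
For orientation, `.env.local` plausibly takes a shape like the following — everything besides `OPENAI_API_KEY` is an assumption based on the standard OTLP exporter environment variables, and the checked-in `.env.local.example` is authoritative:

```shell
# .env.local — placeholder values only
OPENAI_API_KEY=sk-your-key-here

# Standard OTLP exporter variables; the exact endpoint and header
# format Logfire expects may differ — check .env.local.example
OTEL_EXPORTER_OTLP_ENDPOINT=https://logfire-api.pydantic.dev
OTEL_EXPORTER_OTLP_HEADERS=Authorization=your-logfire-write-token
```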

## Project structure

```
src/
  instrumentation.ts      # OTel setup (3 lines)
  app/
    page.tsx              # Chat UI (useChat hook)
    api/chat/route.ts     # Streaming endpoint with experimental_telemetry
    layout.tsx            # Root layout
```

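The three-line OTel setup that the tree comment refers to is plausibly just a `registerOTel` call from `@vercel/otel` (the service name below is an illustrative assumption, not necessarily what the repo uses):

```typescript
// Sketch of src/instrumentation.ts — Next.js calls register() at server startup
import { registerOTel } from '@vercel/otel';

export function register() {
  registerOTel({ serviceName: 'logfire-vercel-ai-demo' }); // service name is illustrative
}
```

`registerOTel` wires up a tracer provider and an OTLP exporter configured from the standard `OTEL_*` environment variables, which is why no Logfire-specific SDK is needed on the app side.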