Opik TypeScript and JavaScript SDK integration with Vercel AI SDK
```bash
npm install opik-vercel
```

Seamlessly integrate Opik observability with your Vercel AI SDK applications to trace, monitor, and debug your AI workflows.
- 🔍 Comprehensive Tracing: Automatically trace AI SDK calls and completions
- 📊 Hierarchical Visualization: View your AI execution as a structured trace with parent-child relationships
- 📝 Detailed Metadata Capture: Record model names, prompts, completions, token usage, and custom metadata
- 🚨 Error Handling: Capture and visualize errors in your AI API interactions
- 🏷️ Custom Tagging: Add custom tags to organize and filter your traces
- 🔄 Streaming Support: Full support for streamed completions and chat responses
For Node.js applications:

```bash
npm install opik-vercel ai @ai-sdk/openai @opentelemetry/sdk-node @opentelemetry/auto-instrumentations-node
```

For Next.js applications:

```bash
npm install opik-vercel @vercel/otel @opentelemetry/api-logs @opentelemetry/instrumentation @opentelemetry/sdk-logs
```
- Node.js ≥ 18
- Vercel AI SDK (ai ≥ 3.0.0)
- Opik SDK (automatically installed as a peer dependency)
- OpenTelemetry packages (see installation commands above)
Set your environment variables:
```bash
OPIK_API_KEY="<your-api-key>"
OPIK_URL_OVERRIDE="https://www.comet.com/opik/api" # Cloud version
OPIK_PROJECT_NAME="<your-project-name>"
OPIK_WORKSPACE="<your-workspace>"
OPENAI_API_KEY="<your-openai-api-key>"
```
```typescript
import { openai } from "@ai-sdk/openai";
import { generateText } from "ai";
import { NodeSDK } from "@opentelemetry/sdk-node";
import { getNodeAutoInstrumentations } from "@opentelemetry/auto-instrumentations-node";
import { OpikExporter } from "opik-vercel";

const sdk = new NodeSDK({
  traceExporter: new OpikExporter(),
  instrumentations: [getNodeAutoInstrumentations()],
});

sdk.start();

async function main() {
  const result = await generateText({
    model: openai("gpt-4o"),
    maxTokens: 50,
    prompt: "What is love?",
    experimental_telemetry: OpikExporter.getSettings({
      name: "opik-nodejs-example",
    }),
  });

  console.log(result.text);
  await sdk.shutdown(); // Flushes the trace to Opik
}

main().catch(console.error);
```
For Next.js applications, use the framework's built-in OpenTelemetry support:
```typescript
// instrumentation.ts
import { registerOTel } from "@vercel/otel";
import { OpikExporter } from "opik-vercel";

export function register() {
  registerOTel({
    serviceName: "opik-vercel-ai-nextjs-example",
    traceExporter: new OpikExporter(),
  });
}
```
Then use the AI SDK with telemetry enabled:
```typescript
import { openai } from "@ai-sdk/openai";
import { generateText } from "ai";

const result = await generateText({
  model: openai("gpt-4o"),
  prompt: "What is love?",
  experimental_telemetry: { isEnabled: true },
});
```
You can add custom tags and metadata to all traces generated by the OpikExporter:
```typescript
const exporter = new OpikExporter({
  // Optional: add custom tags to all traces
  tags: ["production", "gpt-4o"],

  // Optional: add custom metadata to all traces
  metadata: {
    environment: "production",
    version: "1.0.0",
    team: "ai-team",
  },

  // Optional: associate traces with a conversation thread
  threadId: "conversation-123",
});
```
Tags are useful for filtering and grouping traces, while metadata adds additional context for debugging and analysis. The `threadId` parameter is useful for tracking multi-turn conversations or grouping related AI interactions.
Use `OpikExporter.getSettings()` to configure telemetry for individual AI SDK calls:
```typescript
const result = await generateText({
  model: openai("gpt-4o"),
  prompt: "Tell a joke",
  experimental_telemetry: OpikExporter.getSettings({
    name: "custom-trace-name",
    // Optional: set threadId per request (overrides the exporter-level threadId)
    metadata: {
      threadId: "conversation-456",
    },
  }),
});
```
Or use the basic telemetry settings:
```typescript
const result = await generateText({
  model: openai("gpt-4o"),
  prompt: "Tell a joke",
  experimental_telemetry: { isEnabled: true },
});
```
To view your traces:
1. Sign in to your Comet account
2. Navigate to the Opik section
3. Select your project to view all traces
4. Click on a specific trace to see the detailed execution flow
To enable more verbose logging for troubleshooting:
```bash
OPIK_LOG_LEVEL=DEBUG
```
- Opik Vercel AI SDK Integration Guide
- Opik Documentation
- Vercel AI SDK Documentation
- Opik TypeScript SDK
Apache 2.0