# Vercel AI SDK - Observability & Analytics
Telemetry is an experimental feature of the AI SDK and might change in the future.
The Vercel AI SDK is the TypeScript toolkit designed to help developers build AI-powered applications with React, Next.js, Vue, Svelte, Node.js, and more.
The SDK supports tracing via OpenTelemetry. With the `LangfuseExporter` you can collect these traces in Langfuse.
*Example trace in Langfuse*
## Get Started
You need to be on `"ai": "^3.3.0"` or later to use the telemetry feature, as it was added recently. If you run into any issues, update to the latest version; this feature is under active development.
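If in doubt, update the package (a sketch assuming npm; any package manager works):

```bash
npm install ai@latest
```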
### Enable Telemetry
While telemetry is experimental (see the AI SDK telemetry docs), you can enable it by setting `experimental_telemetry` on each request that you want to trace.
```ts
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

const result = await generateText({
  model: openai("gpt-4-turbo"),
  prompt: "Write a short story about a cat.",
  experimental_telemetry: { isEnabled: true },
});
```
### Collect Traces With `LangfuseExporter`
To collect the traces in Langfuse, you need to add the `LangfuseExporter` to your application. You can set the Langfuse credentials via environment variables or pass them directly to the `LangfuseExporter` constructor. Create a project in the Langfuse dashboard to get your `secretKey` and `publicKey`.
```bash
LANGFUSE_SECRET_KEY="sk-lf-..."
LANGFUSE_PUBLIC_KEY="pk-lf-..."
LANGFUSE_BASEURL="https://cloud.langfuse.com" # 🇪🇺 EU region
# LANGFUSE_BASEURL="https://us.cloud.langfuse.com" # 🇺🇸 US region
```
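Alternatively, you can pass the credentials directly to the constructor. A minimal sketch, assuming the option names mirror the environment variables (as in the Langfuse JS SDK):

```ts
import { LangfuseExporter } from "langfuse-vercel";

// Credentials passed directly instead of via environment variables.
const exporter = new LangfuseExporter({
  secretKey: "sk-lf-...",
  publicKey: "pk-lf-...",
  baseUrl: "https://cloud.langfuse.com", // 🇪🇺 EU region
});
```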
Now you need to register this exporter via the OpenTelemetry SDK. Next.js has experimental support for OpenTelemetry instrumentation on the framework level; learn more about it in the Next.js OpenTelemetry guide.
Install dependencies:

```bash
npm install @vercel/otel langfuse-vercel @opentelemetry/api-logs @opentelemetry/instrumentation @opentelemetry/sdk-logs
```
Enable the `instrumentationHook` in your `next.config.js`:
```js
/** @type {import('next').NextConfig} */
const nextConfig = {
  experimental: {
    instrumentationHook: true,
  },
};

module.exports = nextConfig;
```
Add the `LangfuseExporter` to your instrumentation file (`instrumentation.ts` at the project root, or inside `src/` if you use one):
```ts
// instrumentation.ts
import { registerOTel } from "@vercel/otel";
import { LangfuseExporter } from "langfuse-vercel";

export function register() {
  registerOTel({
    serviceName: "langfuse-vercel-ai-nextjs-example",
    traceExporter: new LangfuseExporter(),
  });
}
```
Done! All traces that contain AI SDK spans are automatically captured in Langfuse.
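For reference, a traced call from a Next.js route handler could look like the sketch below; the path `app/api/story/route.ts` is a hypothetical example, not part of the template:

```ts
// app/api/story/route.ts (hypothetical example)
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

export async function GET() {
  const { text } = await generateText({
    model: openai("gpt-4-turbo"),
    prompt: "Write a short story about a cat.",
    // Spans from this call are exported by the registered LangfuseExporter.
    experimental_telemetry: { isEnabled: true },
  });
  return Response.json({ text });
}
```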
## Example Application
We created a sample repository (langfuse/langfuse-vercel-ai-nextjs-example) based on the next-openai template to showcase the integration of Langfuse with Next.js and the Vercel AI SDK.
## Customization
### Disable Tracking of Input/Output
By default, the exporter captures the input and output of each request. You can disable this behavior by setting the `recordInputs` and `recordOutputs` options to `false`.
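In the AI SDK, these flags live on `experimental_telemetry` and apply per call; a minimal sketch:

```ts
const result = await generateText({
  model: openai("gpt-4-turbo"),
  prompt: "Write a short story about a cat.",
  experimental_telemetry: {
    isEnabled: true,
    recordInputs: false, // do not record the prompt
    recordOutputs: false, // do not record the completion
  },
});
```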
### Pass Custom Attributes
All `metadata` fields are automatically captured by the exporter. You can also pass custom trace attributes, for example to track users or sessions.
```ts
const result = await generateText({
  model: openai("gpt-4-turbo"),
  prompt: "Write a short story about a cat.",
  experimental_telemetry: {
    isEnabled: true,
    functionId: "my-awesome-function", // Trace name
    metadata: {
      langfuseTraceId: "trace-123", // Langfuse trace
      tags: ["story", "cat"], // Custom tags
      userId: "user-123", // Langfuse user
      sessionId: "session-456", // Langfuse session
      foo: "bar", // Any custom attribute recorded in metadata
    },
  },
});
```
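A common use of `langfuseTraceId` is grouping several calls into one trace by reusing the same ID; a sketch, assuming the exporter maps equal IDs to a single Langfuse trace:

```ts
const traceId = "trace-123"; // shared across related calls

const outline = await generateText({
  model: openai("gpt-4-turbo"),
  prompt: "Outline a short story about a cat.",
  experimental_telemetry: {
    isEnabled: true,
    metadata: { langfuseTraceId: traceId },
  },
});

const story = await generateText({
  model: openai("gpt-4-turbo"),
  prompt: `Write the story based on this outline:\n\n${outline.text}`,
  experimental_telemetry: {
    isEnabled: true,
    metadata: { langfuseTraceId: traceId },
  },
});
```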
## Debugging
Enable the `debug` option to see the logs of the exporter:
```ts
new LangfuseExporter({ debug: true });
```
## Troubleshooting
- If you deploy on Vercel, note that Vercel's OpenTelemetry Collector is only available on Pro and Enterprise plans (see the Vercel docs).
- You need to be on `"ai": "^3.3.0"` or later to use the telemetry feature, as it was added recently. If you run into any issues, update to the latest version; this feature is under active development.
## Learn more
See the telemetry documentation of the Vercel AI SDK for more information.