Description
Tracer Version(s)
5.63.0, latest
Node.js Version(s)
22.20.0
Bug Report
#5858 adds automatic tracing for Vercel's ai-sdk, but I'm not seeing LLM traces being captured. Other traces are captured just fine, so the dd-trace setup itself is working.
Reproduction Code
NextJS endpoint which uses AI SDK:
import { openai } from '@ai-sdk/openai';
import {
  streamText,
  type UIMessage,
  convertToModelMessages,
} from 'ai';
import { tracer } from 'dd-trace';

tracer.init({
  service: 'chat-ui',
  logInjection: true,
  profiling: true,
  plugins: true,
  env: 'stage',
  version: '1.0.0',
  runtimeMetrics: true,
  hostname: '<hostname for agent>',
  llmobs: {
    mlApp: 'chat-app',
  },
});

export async function POST(req: Request) {
  const { messages }: { messages: UIMessage[] } = await req.json();

  const result = streamText({
    model: openai('gpt-5'),
    messages: convertToModelMessages(messages),
    // enabling the following has no effect either:
    // experimental_telemetry: { isEnabled: true },
  });

  return result.toUIMessageStreamResponse();
}

I've also tried initializing dd-trace in an instrumentation.ts file, as described by Next.js as the way to enable OpenTelemetry:
export async function register() {
  if (process.env.NEXT_RUNTIME === 'nodejs') {
    const { tracer } = await import('dd-trace');

    tracer.init({
      service: 'chat-ui',
      logInjection: true,
      profiling: true,
      plugins: true,
      env: 'stage',
      version: '1.0.0',
      runtimeMetrics: true,
      hostname: '<hostname for agent>',
      llmobs: {
        mlApp: 'chat-app',
      },
    });

    tracer.use('next');
  }
}

Error Logs
No response
Tracer Config
{
  service: 'chat-ui',
  logInjection: true,
  profiling: true,
  plugins: true,
  env: 'stage',
  version: '1.0.0',
  runtimeMetrics: true,
  hostname: '<hostname for agent>',
  llmobs: {
    mlApp: 'chat-app',
  },
}
Operating System
Darwin mbp-GR2MG2LHD7 24.6.0 Darwin Kernel Version 24.6.0
Bundling
Next.js
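
As an addendum, one thing I considered (an assumption on my part, not something I've confirmed): import ordering. ES module imports are hoisted, so in the route handler above `@ai-sdk/openai` and `ai` are evaluated before `tracer.init()` ever runs, which could mean the ai-sdk instrumentation never gets a chance to patch the module. A minimal sketch of forcing init to run first, using a hypothetical side-effect module `tracing.ts` (the file name and split are my own, not from any Datadog docs):

```typescript
// tracing.ts — hypothetical module whose only job is to initialize dd-trace
// before any instrumented package is loaded
import tracer from 'dd-trace';

tracer.init({
  service: 'chat-ui',
  llmobs: {
    mlApp: 'chat-app',
  },
});

export default tracer;

// app/api/chat/route.ts — importing tracing first means its top-level
// tracer.init() call executes before the 'ai' package is evaluated
// import './tracing';
// import { streamText } from 'ai';
```

I haven't verified whether this changes anything for the ai-sdk plugin specifically; it's just the pattern I'd expect to matter given that dd-trace generally needs to initialize before the modules it instruments are loaded.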