# AI
Blyp can emit structured `ai_trace` records for AI workloads without forcing you into one SDK.
Supported entrypoints:
- `@blyp/core/ai/better-agent` for Better Agent app-level plugins and manual run trackers
- `@blyp/core/ai/vercel` for Vercel AI SDK middleware and model wrappers
- `@blyp/core/ai/openai` for direct OpenAI SDK instrumentation
- `@blyp/core/ai/anthropic` for direct Anthropic SDK instrumentation
- `@blyp/core/ai/fetch` for low-level transport tracing around `fetch`
## What gets emitted
Each traced AI call produces one normalized `ai_trace` record with:
- `provider`
- `model`
- `operation`
- token usage when the provider returns it
- `finishReason`
- timing data
- `msToFirstChunk` for streamed responses
- best-effort tool-call records and trace events
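As a rough illustration of the fields above, here is a sketch of what a normalized record might look like. The type and field names are inferred from this list, not Blyp's actual published types:

```typescript
// Hypothetical shape of a normalized ai_trace record, inferred from the
// field list above — names and optionality are illustrative assumptions.
interface AiTrace {
  provider: string;            // e.g. "openai", "anthropic"
  model: string;               // e.g. "gpt-4o-mini"
  operation: string;           // e.g. "chat.completions.create"
  usage?: { inputTokens: number; outputTokens: number }; // when the provider returns it
  finishReason?: string;       // e.g. "stop", "tool_calls"
  durationMs: number;          // total call latency
  msToFirstChunk?: number;     // streamed responses only
  toolCalls?: { name: string; args: unknown }[]; // best-effort
}

// Example record as it might appear for a streamed chat call.
const trace: AiTrace = {
  provider: "openai",
  model: "gpt-4o-mini",
  operation: "chat.completions.create",
  usage: { inputTokens: 120, outputTokens: 48 },
  finishReason: "stop",
  durationMs: 840,
  msToFirstChunk: 210,
};
```

Because every entrypoint normalizes to one shape, queries over provider, model, latency, or token usage work the same regardless of which SDK produced the call.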
The goal is to make AI calls queryable with the same Blyp primitives used for request logs, structured logs, connectors, and database records.
## Request-scoped correlation
When an active request-scoped logger exists, AI traces inherit it automatically. That means the same request can carry:
- the server request log
- AI trace records
- structured logs
- connector-forwarded logs
- database-persisted rows
See Request Tracing for the `x-blyp-trace-id` lifecycle and how browser logs join the same trace.
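The inheritance mechanism can be sketched with Node's `AsyncLocalStorage`, which is the standard way to carry request context implicitly. This is a minimal illustration of the idea, not Blyp's internal implementation; `requestContext` and `recordAiTrace` are hypothetical names:

```typescript
import { AsyncLocalStorage } from "node:async_hooks";

// Hypothetical request-scoped context — in Blyp this would be managed by
// the request logger, keyed by the incoming x-blyp-trace-id header.
const requestContext = new AsyncLocalStorage<{ traceId: string }>();

// Hypothetical trace recorder: it reads the active context, so an AI
// trace made anywhere inside the request inherits the trace id with no
// explicit plumbing through function arguments.
function recordAiTrace(fields: { provider: string; model: string }) {
  const ctx = requestContext.getStore(); // undefined outside a request
  return { ...fields, traceId: ctx?.traceId ?? null };
}

// Simulate one incoming request carrying a trace id.
const trace = requestContext.run({ traceId: "req-abc123" }, () =>
  recordAiTrace({ provider: "openai", model: "gpt-4o-mini" }),
);
```

Anything recorded inside the `run()` callback shares the same `traceId`, which is how AI traces, structured logs, and database rows line up under one request.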
## Covered SDK paths
| Entry | Covered calls |
|---|---|
| `@blyp/core/ai/better-agent` | Better Agent run-level tracing through app plugins or manual tracker wiring |
| `@blyp/core/ai/vercel` | Vercel AI SDK `generateText()` and `streamText()` |
| `@blyp/core/ai/openai` | OpenAI `responses.create()` and `chat.completions.create()` |
| `@blyp/core/ai/anthropic` | Anthropic `messages.create()` |
| `@blyp/core/ai/fetch` | Generic transport tracing for status, latency, request IDs, and optional JSON body inspection |
OpenRouter is supported through the OpenAI-compatible path by pointing the OpenAI client at the OpenRouter base URL.
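Since OpenRouter speaks the OpenAI wire protocol, only the client's base URL changes. A minimal sketch of the client options (the `traceOpenAI` wrapper name is a hypothetical stand-in for whatever `@blyp/core/ai/openai` exports):

```typescript
// OpenRouter's OpenAI-compatible endpoint — the instrumented client is
// configured exactly like a normal OpenAI client, just repointed.
const openRouterOptions = {
  baseURL: "https://openrouter.ai/api/v1",
  apiKey: "YOUR_OPENROUTER_KEY", // placeholder, not a real key
};

// const client = traceOpenAI(new OpenAI(openRouterOptions)); // hypothetical wrapper
```

Traces for calls made this way flow through the same `@blyp/core/ai/openai` path, with `provider` reflecting whatever the instrumentation reports for the endpoint.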