AI and LLM Observability

Agents, LLMs, vector stores, custom logic—visibility can’t stop at the model call.

Get the context you need to debug failures, optimize performance, and keep AI features reliable.

works with

Vercel AI
OpenAI Agents
Node.js
Next.js
SvelteKit
Nuxt
Astro
Remix
SolidStart
Express
Fastify
NestJS
Hapi
Koa
Connect
Hono
Bun
AWS Lambda
Azure Functions
Google Cloud Functions
Electron

Tolerated by 4 million developers

Anthropic
Cursor
GitHub
Vercel
Microsoft
Coveo
Factory.ai

Full LLM Observability

See everything on one pane of glass

Track all agent runs, error rates, LLM calls, tokens used, and tool executions. Monitor traffic patterns and duration metrics across your AI-powered features.

See Docs

Track Model Costs & Tokens

Monitor spending across models.

Compare costs across different models. See token usage breakdown by model, track input vs output tokens, and identify expensive operations.

See Docs

Monitor tool execution

Track agent tool calls and errors.

See which tools your agents call, their error rates, average duration, and P95 latency. Identify slow or failing tool executions before they impact users.

Learn About Tracing

Deep Trace Analysis

Debug with full context.

Dive into individual requests with full prompt and response context. See AI spans with agent invocations, tool executions, token counts, costs, and timing.

Learn More

Getting started with Sentry is simple

We support every technology (except the ones we don't).
Get started with just a few lines of code.

Install sentry-sdk from PyPI:

pip install "sentry-sdk"

Add OpenAIAgentsIntegration() to your integrations list:

import sentry_sdk
from sentry_sdk.integrations.openai_agents import OpenAIAgentsIntegration

sentry_sdk.init(
    # Configure your DSN
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",
    # Add data like inputs and responses to/from LLMs and tools;
    # see https://docs.sentry.io/platforms/python/data-management/data-collected/ for more info
    send_default_pii=True,
    integrations=[
        OpenAIAgentsIntegration(),
    ],
)

That's it. Check out our documentation to ensure you have the latest instructions.

"Sentry played a significant role in helping us develop [Claude] Sonnet"
Since adopting Sentry, Anthropic has seen:
10-15%

increase in developer productivity

600+

engineers rely on Sentry to ship code

20-30%

faster incident resolution

read more

Fix what's broken with LLM Observability

Get started with the only LLM observability platform that gives developers tools to fix application problems without compromising on velocity.