AI and LLM Observability

Agents, LLMs, vector stores, custom logic—visibility can’t stop at the model call.

Get the context you need to debug failures, optimize performance, and keep AI features reliable.

Tolerated by 4 million developers

  • Anthropic
  • Cursor
  • GitHub
  • Vercel
  • Microsoft
  • Bolt
  • Factory AI
  • Cognition
  • Pinecone
  • ElevenLabs
  • Glean
  • Harvey
  • Mistral
  • Replit

Full LLM Observability

See everything on one pane of glass

Track all agent runs, error rates, LLM calls, tokens used, and tool executions. Monitor traffic patterns and duration metrics across your AI-powered features.

See Docs

Track Model Costs & Tokens

Monitor spending across models.

Compare costs across different models. See token usage breakdown by model, track input vs output tokens, and identify expensive operations.

See Docs

Monitor tool execution

Track agent tool calls and errors.

See which tools your agents call, their error rates, average duration, and P95 latency. Identify slow or failing tool executions before they impact users.

Learn About Tracing

Deep Trace Analysis

Debug with full context.

Dive into individual requests with full prompt and response context. See AI spans with agent invocations, tool executions, token counts, costs, and timing.

Learn More

Getting started with Sentry is simple

We support every technology (except the ones we don't).
Get started with just a few lines of code.

Install sentry-sdk from PyPI:

pip install "sentry-sdk"

Add OpenAIAgentsIntegration() to your integrations list:

import sentry_sdk
from sentry_sdk.integrations.openai_agents import OpenAIAgentsIntegration

sentry_sdk.init(
    # Configure your DSN
    dsn="https://examplePublicKey@o0.ingest.sentry.io/0",
    # Add data like inputs and responses to/from LLMs and tools;
    # see https://docs.sentry.io/platforms/python/data-management/data-collected/ for more info
    send_default_pii=True,
    integrations=[
        OpenAIAgentsIntegration(),
    ],
)
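With the integration enabled, any agent you run through the OpenAI Agents SDK is traced automatically. A minimal sketch, assuming the openai-agents package is installed and an OPENAI_API_KEY is set; the agent name and get_weather tool here are hypothetical examples, not part of the SDK:

```python
from agents import Agent, Runner, function_tool

@function_tool
def get_weather(city: str) -> str:
    """Hypothetical tool: return a canned weather report for a city."""
    return f"It is sunny in {city}."

agent = Agent(
    name="Weather Agent",
    instructions="Answer weather questions using the get_weather tool.",
    tools=[get_weather],
)

# Each run is captured as an agent invocation span, with child spans
# for every model call and get_weather execution, including token counts.
result = Runner.run_sync(agent, "What's the weather in Paris?")
print(result.final_output)
```
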

In JavaScript, the vercelAIIntegration adds instrumentation for Vercel's ai SDK, capturing spans through the SDK's built-in telemetry. Get started with the following snippet:

Sentry.init({
  // Configure your DSN
  dsn: 'https://<key>@sentry.io/<project>',
  tracesSampleRate: 1.0,
  integrations: [
    Sentry.vercelAIIntegration({
      recordInputs: true,
      recordOutputs: true,
    }),
  ],
});

To correctly capture spans, pass the experimental_telemetry object with isEnabled: true to every generateText, generateObject, and streamText function call.

const result = await generateText({
  model: openai("gpt-4o"),
  experimental_telemetry: {
    isEnabled: true,
  },
});
"Sentry played a significant role in helping us develop [Claude] Sonnet"
Since adopting Sentry, Anthropic has seen:
10-15%

increase in developer productivity

600+

engineers rely on Sentry to ship code

20-30%

faster incident resolution

Read More

Fix what's broken with LLM Observability

Get started with the only LLM observability platform that gives developers tools to fix application problems without compromising on velocity.