Debug faster with OpenTelemetry data in Sentry

Resolve code issues faster by adding your OpenTelemetry traces and logs to Sentry. Connect with the community on Discord.

How it works

Leverage the OpenTelemetry instrumentation you're already using and plug it into Sentry via the SDK. Get to the root of a problem with detailed distributed traces that provide a complete end-to-end view of the request path leading up to an error, and let Sentry automatically categorize performance issues based on your stack.

Sentry performance issue detection with OpenTelemetry

Getting Started is Simple

Install the Sentry Python SDK with OpenTelemetry:

pip install --upgrade 'sentry-sdk[opentelemetry-otlp]'

Configure your application:

import sentry_sdk
from sentry_sdk.integrations.otlp import OTLPIntegration

sentry_sdk.init(
    dsn="__DSN__",
    # Add data like request headers and IP for users, if applicable;
    # see https://docs.sentry.io/platforms/python/data-management/data-collected/ for more info
    send_default_pii=True,
    integrations=[
        OTLPIntegration(),
    ],
)
Install the Sentry Node SDK:

npm install @sentry/node

Create an instrument.js file:

const Sentry = require("@sentry/node");

// Make sure to call this before requiring any other modules!
Sentry.init({
  dsn: "__DSN__",

  // Set tracesSampleRate to 1.0 to capture 100%
  // of transactions for tracing.
  // We recommend adjusting this value in production
  tracesSampleRate: 1.0,
});

Require the instrument file before other modules:

// Require this first!
require("./instrument");

// Now require other modules
const http = require("http");

// Your application code goes here
Install the Sentry Go SDK with its OpenTelemetry packages:

go get github.com/getsentry/sentry-go \
       github.com/getsentry/sentry-go/otel \
       github.com/getsentry/sentry-go/otel/otlp

Create a trace exporter and register Sentry's error-linking integration:

import (
	"context"
	"go.opentelemetry.io/otel"
	sdktrace "go.opentelemetry.io/otel/sdk/trace"

	"github.com/getsentry/sentry-go"
	sentryotel "github.com/getsentry/sentry-go/otel"
	sentryotlp "github.com/getsentry/sentry-go/otel/otlp"
	// ...
)

sentry.Init(sentry.ClientOptions{
	Dsn:              "___PUBLIC_DSN___",
	EnableTracing:    true,
	TracesSampleRate: 1.0,
	Debug:            true,
	Integrations: func(integrations []sentry.Integration) []sentry.Integration {
		return append(integrations, sentryotel.NewErrorLinkingIntegration())
	},
})

ctx := context.Background()
exporter, err := sentryotlp.NewTraceExporter(ctx, "___PUBLIC_DSN___")
if err != nil {
	panic(err)
}

tp := sdktrace.NewTracerProvider(
	sdktrace.WithBatcher(exporter),
)
otel.SetTracerProvider(tp)

Download the Sentry OpenTelemetry Java agent from GitHub Releases.

Run with the Java agent:

java -javaagent:sentry-opentelemetry-agent-<version>-all.jar \
     -Dotel.resource.attributes=service.name=<your-service-name> \
     -jar your-application.jar

Configure sentry.properties:

dsn=__DSN__
traces-sample-rate=1.0

For Ruby, add the Sentry and OpenTelemetry gems to your Gemfile:

gem "sentry-ruby"
gem "sentry-rails"
gem "sentry-opentelemetry"

gem "opentelemetry-sdk"
gem "opentelemetry-instrumentation-all"

Configure Sentry with OTel:

require 'sentry-ruby'
require 'sentry-opentelemetry'

Sentry.init do |config|
  config.dsn = '__DSN__'
  config.traces_sample_rate = 1.0
  config.instrumenter = :otel
end

# Set up OpenTelemetry
OpenTelemetry::SDK.configure do |c|
  c.use_all
  c.add_span_processor(
    Sentry::OpenTelemetry::SpanProcessor.new
  )
end
Install the Sentry and OpenTelemetry packages for .NET:

dotnet add package Sentry
dotnet add package Sentry.OpenTelemetry
dotnet add package OpenTelemetry
dotnet add package OpenTelemetry.Instrumentation.AspNetCore

Configure Sentry with OTel:

using Sentry;
using OpenTelemetry;
using OpenTelemetry.Trace;

var builder = WebApplication.CreateBuilder(args);

// Configure Sentry
builder.WebHost.UseSentry(options =>
{
    options.Dsn = "__DSN__";
    options.TracesSampleRate = 1.0;
});

// Configure OpenTelemetry
builder.Services.AddOpenTelemetry()
    .WithTracing(tracerProviderBuilder =>
    {
        tracerProviderBuilder
            .AddAspNetCoreInstrumentation()
            .AddSentry();
    });

var app = builder.Build();
app.Run();

More than 150K Organizations Trust Sentry with Their Application Monitoring

What you can do with Sentry and OpenTelemetry

See the full story behind every error

  • View the full sequence of events leading up to each error—including SQL queries, network requests, and debug logs
  • Search through structured logs with automatic trace correlation—no more grepping through files
Learn more about Issue Details

Trace issues across your entire stack

  • Get a unified view from frontend to backend in a single trace
  • Find root causes faster by following slow-loading pages all the way back to poor-performing API calls, and surface any related errors
Learn more about Trace View

Monitor traces and get alerted on important changes

  • Get notified on what matters to you, like when endpoint latency exceeds acceptable thresholds or regresses from baseline
  • Filter down to the most important traces based on attributes like endpoint, method, or status code
  • Review insights for backend performance and errors, and create custom dashboards
Learn more about Alerts

"Getting started with Sentry and OpenTelemetry was fast and easy. We chose Sentry because we can understand why and where something is slow, fix it quickly, and get ahead of user complaints."

Dominik Sandjaja
Senior Software Engineer, bex technologies GmbH

FAQs

The collector approach ultimately removes the Sentry SDKs from the equation, which means you won't get the benefits of advanced error tracking or any of the other features Sentry SDKs provide, including newer product features like profiling and crons. So we decided to treat OTel the same way we treat any other framework: the SDKs plug in and capture all the relevant data to send to Sentry, where we can couple it with every other feature while still giving you all the insights from your OTel data.

The parent-most (root) span of the OTel span tree becomes the transaction, and the remaining spans in the tree (its descendants) become the spans inside that transaction.
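This mapping can be illustrated with a hypothetical pure-Python sketch (the class and field names are made up for illustration; they are not the SDK's actual internals):

```python
from dataclasses import dataclass, field


@dataclass
class OtelSpan:
    """A minimal stand-in for an OTel span with child spans."""
    name: str
    children: list = field(default_factory=list)


def to_sentry_event(root: OtelSpan) -> dict:
    """The root span names the transaction; all descendants become its spans."""
    def flatten(span):
        out = []
        for child in span.children:
            out.append(child.name)
            out.extend(flatten(child))
        return out

    return {"transaction": root.name, "spans": flatten(root)}


tree = OtelSpan("GET /checkout", [
    OtelSpan("db.query", [OtelSpan("db.connect")]),
    OtelSpan("http.client"),
])
event = to_sentry_event(tree)
# event["transaction"] == "GET /checkout"
# event["spans"] == ["db.query", "db.connect", "http.client"]
```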

Yes. Our goal is a flawless conversion of OTel data into Sentry. There are some things we don't do, like automatically setting tags, but if you see something amiss, reach out to support or create an issue for the SDK team on one of our repos, and we will make sure it is routed to the right place.

A trace represents the record of the entire operation you want to measure or track, like a page load, an instance of a user completing some action in your application, or a cron job in your backend. When a trace includes work in multiple services, such as those listed above, it's called a distributed trace, because the trace is distributed across those services.
Each trace consists of a tree of spans: named, timed operations that each represent a part of the application workflow. The data captured for each span provides granular insights into specific tasks, like API requests or database queries. Learn more about distributed tracing.
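The idea of a distributed trace can be sketched in a few lines of plain Python: every span, regardless of which service emitted it, shares one trace ID and points at its parent span. (This is a conceptual illustration; the field names are simplified stand-ins for the real trace context.)

```python
import uuid
from dataclasses import dataclass
from typing import Optional


@dataclass
class Span:
    """A simplified span: shared trace_id, unique span_id, link to parent."""
    trace_id: str
    span_id: str
    parent_id: Optional[str]
    name: str


# One trace ID ties the whole operation together across services.
trace_id = uuid.uuid4().hex

frontend = Span(trace_id, "a1", None, "pageload /checkout")            # browser
backend = Span(trace_id, "b2", frontend.span_id, "POST /api/orders")   # API service
db = Span(trace_id, "c3", backend.span_id, "INSERT orders")            # database call

# All three spans belong to the same (distributed) trace.
assert frontend.trace_id == backend.trace_id == db.trace_id
```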

Looking for an example suite of microservices already instrumented with Sentry and OTel SDKs? We've got you covered. Check out our demo repo on GitHub. It's a fork of the official example provided by OpenTelemetry that we use for our own testing purposes.