Note: OpenAI .NET SDK instrumentation is in development and is not complete. See the Available sources and meters section below for the list of covered operations.
The OpenAI .NET library is instrumented with distributed tracing and metrics using the .NET tracing and metrics APIs and supports OpenTelemetry.

The OpenAI .NET instrumentation follows the OpenTelemetry Semantic Conventions for Generative AI systems.
The instrumentation is experimental: the volume and semantics of the telemetry items may change.
To enable the instrumentation:

1. Set the instrumentation feature flag using one of the following options:

   - set the `OPENAI_EXPERIMENTAL_ENABLE_OPEN_TELEMETRY` environment variable to `"true"`, or
   - set the `OpenAI.Experimental.EnableOpenTelemetry` context switch to `true` in your application code at startup, before any OpenAI clients are initialized. For example:

     ```csharp
     AppContext.SetSwitch("OpenAI.Experimental.EnableOpenTelemetry", true);
     ```
2. Enable OpenAI telemetry:

   ```csharp
   builder.Services.AddOpenTelemetry()
       .WithTracing(b =>
       {
           b.AddSource("OpenAI.*")
            ...
            .AddOtlpExporter();
       })
       .WithMetrics(b =>
       {
           b.AddMeter("OpenAI.*")
            ...
            .AddOtlpExporter();
       });
   ```
   Distributed tracing is enabled with `AddSource("OpenAI.*")`, which tells OpenTelemetry to listen to all ActivitySources with names starting with `OpenAI.`. Similarly, metrics are configured with `AddMeter("OpenAI.*")`, which enables all OpenAI-related Meters.
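If your application does not use the .NET generic host, you can configure the same sources and meters directly with the OpenTelemetry SDK. The following is a minimal sketch, not part of the OpenAI SDK itself: it assumes the `OpenTelemetry` and `OpenTelemetry.Exporter.Console` NuGet packages and uses a console exporter in place of OTLP.

```csharp
// Minimal sketch for a console app (assumptions: OpenTelemetry and
// OpenTelemetry.Exporter.Console packages; console exporter instead of OTLP).
using OpenTelemetry;
using OpenTelemetry.Metrics;
using OpenTelemetry.Trace;

// Step 1: opt in before any OpenAI client is created.
AppContext.SetSwitch("OpenAI.Experimental.EnableOpenTelemetry", true);

// Step 2: listen to all OpenAI ActivitySources and Meters.
using TracerProvider tracerProvider = Sdk.CreateTracerProviderBuilder()
    .AddSource("OpenAI.*")
    .AddConsoleExporter()
    .Build();

using MeterProvider meterProvider = Sdk.CreateMeterProviderBuilder()
    .AddMeter("OpenAI.*")
    .AddConsoleExporter()
    .Build();

// ... create OpenAI clients and make calls here; telemetry is flushed when the
// providers are disposed.
```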
Consider enabling HTTP client instrumentation to see all HTTP client calls made by your application, including those made by the OpenAI SDK. Check out the OpenTelemetry documentation for more details.
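For example, with the `OpenTelemetry.Instrumentation.Http` package (an assumed extra dependency, not something the OpenAI SDK brings in), the tracing setup above could be extended roughly like this:

```csharp
// Sketch: assumes the OpenTelemetry.Instrumentation.Http package is referenced.
builder.Services.AddOpenTelemetry()
    .WithTracing(b =>
    {
        b.AddSource("OpenAI.*")
         .AddHttpClientInstrumentation() // spans for outgoing HTTP calls, including the SDK's
         .AddOtlpExporter();
    });
```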
The following sources and meters are available:
- `OpenAI.ChatClient` - records traces and metrics for `ChatClient` operations (except streaming and protocol methods, which are not instrumented yet)
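For illustration, a non-streaming call such as the hypothetical snippet below (the model name and API-key lookup are placeholders) is the kind of `ChatClient` operation that gets recorded under the `OpenAI.ChatClient` source and meter:

```csharp
// Hypothetical usage sketch; model name and API key source are placeholders.
using OpenAI.Chat;

ChatClient client = new(
    model: "gpt-4o-mini",
    apiKey: Environment.GetEnvironmentVariable("OPENAI_API_KEY"));

// A non-streaming completion call; with the feature flag and a listener enabled,
// it is recorded by the OpenAI.ChatClient ActivitySource and Meter.
ChatCompletion completion = client.CompleteChat(
    new UserChatMessage("Say 'this is a test.'"));

Console.WriteLine(completion.Content[0].Text);
```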