
.Net: Bug: Telemetry missing when using ChatCompletionAgent.InvokeStreamingAsync with AzureOpenAIChatCompletion #12986

@andrewabest

Description


Describe the bug
When invoking a chat completion agent via streaming, using a kernel configured with Azure OpenAI, the following trace telemetry is missing.

For chat completion traces:

  1. The finish reason is always N/A
  2. Token usage is not emitted

Metrics also appear not to be emitted for streaming invocations; a detection sketch follows below.
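As a rough way to spot the gap programmatically, here is a minimal sketch of a span processor that flags affected activities. It assumes Semantic Kernel tags spans per the OpenTelemetry GenAI semantic conventions (gen_ai.*); the exact attribute names may differ between versions.

using System.Diagnostics;
using OpenTelemetry;

// Sketch: flag Semantic Kernel activities that end without the attributes
// described above. The gen_ai.* names are an assumption based on the
// OpenTelemetry GenAI semantic conventions.
public sealed class MissingTelemetryDetector : BaseProcessor<Activity>
{
	public override void OnEnd(Activity activity)
	{
		if (!activity.Source.Name.StartsWith("Microsoft.SemanticKernel"))
			return;

		if (activity.GetTagItem("gen_ai.response.finish_reasons") is null)
			Console.WriteLine($"[missing] finish reason on '{activity.DisplayName}'");

		if (activity.GetTagItem("gen_ai.usage.output_tokens") is null)
			Console.WriteLine($"[missing] token usage on '{activity.DisplayName}'");
	}
}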

To Reproduce
Here is a small sample app that produces the telemetry for inspection:

// Namespace imports (in LINQPad, add these via the query's namespace imports):
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Agents;
using Microsoft.SemanticKernel.ChatCompletion;
using OpenTelemetry;
using OpenTelemetry.Logs;
using OpenTelemetry.Metrics;
using OpenTelemetry.Resources;
using OpenTelemetry.Trace;

async Task Main()
{
	var resourceBuilder = ResourceBuilder
		.CreateDefault()
		.AddService("TelemetryConsoleQuickstart");

	// Enable model diagnostics with sensitive data.
	AppContext.SetSwitch("Microsoft.SemanticKernel.Experimental.GenAI.EnableOTelDiagnosticsSensitive", true);
	
	using var traceProvider = Sdk.CreateTracerProviderBuilder()
	    .SetResourceBuilder(resourceBuilder)
	    .AddSource("Microsoft.SemanticKernel*")
	    .AddConsoleExporter()
	    .Build();
	
	using var meterProvider = Sdk.CreateMeterProviderBuilder()
	    .SetResourceBuilder(resourceBuilder)
	    .AddMeter("Microsoft.SemanticKernel*")
	    .AddConsoleExporter()
	    .Build();
	
	using var loggerFactory = LoggerFactory.Create(builder =>
	{
	    // Add OpenTelemetry as a logging provider
	    builder.AddOpenTelemetry(options =>
	    {
	        options.SetResourceBuilder(resourceBuilder);
	        options.AddConsoleExporter();
	        // Format log messages. This defaults to false.
	        options.IncludeFormattedMessage = true;
	        options.IncludeScopes = true;
	    });
		builder.SetMinimumLevel(LogLevel.Information);
	});

	IKernelBuilder builder = Kernel.CreateBuilder();
	builder.Services.AddSingleton(loggerFactory);
	
	var azureOpenAIEndpoint = "your-azure-openai-endpoint";
	var azureOpenAIApiKey = "your-azure-openai-api-key";

	builder.AddAzureOpenAIChatCompletion(
		deploymentName: "o4-mini",
		endpoint: azureOpenAIEndpoint,
		apiKey: azureOpenAIApiKey
	);

	Kernel kernel = builder.Build();
	
	await ExecuteChatViaAgent(kernel, loggerFactory);
	
	Console.WriteLine("==============================================================================================");
	Console.WriteLine("Traces below here do not contain finish reason or token usage. They also do not contribute to metrics.");
	Console.WriteLine("");
	
	await ExecuteChatViaAgentStreaming(kernel, loggerFactory);
}

async Task ExecuteChatViaAgent(Kernel kernel, ILoggerFactory loggerFactory)
{
	ChatCompletionAgent agent =
		new()
		{
			Kernel = kernel,
			LoggerFactory = loggerFactory,
		};

	Microsoft.SemanticKernel.ChatMessageContent message = new(AuthorRole.User, "Why is the sky blue? Answer in one sentence.");

	var responses = agent.InvokeAsync(message);

	string answer = "";
	await foreach (var item in responses)
	{
		answer += item.Message.Content;
	}

	Console.WriteLine(answer);
}

async Task ExecuteChatViaAgentStreaming(Kernel kernel, ILoggerFactory loggerFactory)
{
	ChatCompletionAgent agent =
		new()
		{
			Kernel = kernel,
			LoggerFactory = loggerFactory,
		};

	Microsoft.SemanticKernel.ChatMessageContent message = new(AuthorRole.User, "Why is the sky blue? Answer in one sentence.");

	var stream = agent.InvokeStreamingAsync(message);

	string answer = "";
	await foreach (var item in stream)
	{
		answer += item.Message.Content;
	}

	Console.WriteLine(answer);
}
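If chunk-level data is needed in the meantime, the streaming chunks themselves may carry it. The following workaround sketch assumes the Azure OpenAI connector exposes "FinishReason" and "Usage" keys in chunk metadata; those key names are an assumption and should be verified against your connector version.

// Workaround sketch: read telemetry-relevant values from streaming chunk
// metadata directly. The "FinishReason" and "Usage" keys are assumptions
// about the Azure OpenAI connector's metadata shape.
await foreach (var item in agent.InvokeStreamingAsync(message))
{
	if (item.Message.Metadata is { } metadata)
	{
		if (metadata.TryGetValue("FinishReason", out var finishReason) && finishReason is not null)
			Console.WriteLine($"Finish reason: {finishReason}");

		if (metadata.TryGetValue("Usage", out var usage) && usage is not null)
			Console.WriteLine($"Usage: {usage}");
	}
}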

Requires the following NuGet packages:

  • Microsoft.SemanticKernel
  • Microsoft.SemanticKernel.Agents.Core
  • Microsoft.SemanticKernel.Connectors.AzureOpenAI
  • OpenTelemetry.Exporter.Console

You can comment out the non-streaming invocation to confirm that the streaming invocation alone contributes no metrics.
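To wire the detector sketched in the bug description into the repro, register it with the tracer provider in Main; AddProcessor is standard OpenTelemetry SDK API:

using var traceProvider = Sdk.CreateTracerProviderBuilder()
	.SetResourceBuilder(resourceBuilder)
	.AddSource("Microsoft.SemanticKernel*")
	.AddProcessor(new MissingTelemetryDetector()) // sketch from above
	.AddConsoleExporter()
	.Build();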

Expected behavior
I expect all of the telemetry available in non-streaming invocations to also be available in streaming invocations, including token usage and finish reasons.

Platform

  • Language: C#
  • Source: NuGet package 1.62.0
  • AI model: tested using o4-mini
  • IDE: reproduced in LINQPad
  • OS: reproduced on macOS

Labels

  • .NET (Issue or Pull requests regarding .NET code)
  • bug (Something isn't working)
  • stale (Issue is stale because it has been open for a while and has no activity)
