
Handle FunctionCallOutputResponseItem in streaming response conversion#7307

Merged
stephentoub merged 5 commits into main from copilot/fix-streaming-response-conversion on Feb 19, 2026

Conversation

Contributor

Copilot AI commented Feb 16, 2026

  • Add FunctionCallOutputResponseItem case to streaming conversion with lastRole ??= ChatRole.Assistant
  • Make test realistic: includes function_call before function_call_output
  • Validate all relevant properties on yielded items (Role, CallId, Name, Result, RawRepresentation)
  • Validate ToChatResponse() produces single message with FunctionCallContent, FunctionResultContent, and TextContent
  • Validate usage details
  • All 86 OpenAIResponseClientTests pass across net8.0, net9.0, net10.0
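The shape of the streaming fix described above can be sketched roughly as follows. This is not the merged code — the switch layout and the OpenAI SDK member names (Item, CallId, FunctionOutput) are assumptions based on the PR description:

```csharp
// Sketch: inside FromOpenAIStreamingResponseUpdatesAsync's switch over
// streaming updates, handle the function_call_output item explicitly.
case StreamingResponseOutputItemDoneUpdate itemDone when
    itemDone.Item is FunctionCallOutputResponseItem callOutput:
    // Mirror the non-streaming ToChatMessage behavior: surface the tool
    // output as FunctionResultContent rather than a generic AIContent.
    lastRole ??= ChatRole.Assistant;
    yield return new ChatResponseUpdate(lastRole, new AIContent[]
    {
        new FunctionResultContent(callOutput.CallId, callOutput.FunctionOutput)
        {
            RawRepresentation = callOutput,
        },
    });
    break;
```

The `lastRole ??= ChatRole.Assistant` line matches the first bullet above: the role defaults to assistant only when no role has been seen yet in the stream.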
Original prompt

This section details the original issue to resolve.

<issue_title>[MEAI]: OpenAIResponsesChatClient.GetStreamingResponseAsync does not handle FunctionCallOutputResponseItem</issue_title>
<issue_description>### Description

In streaming mode, at least with the Responses client, FunctionCallOutputResponseItem is not converted into FunctionResultContent, whereas the non-streaming method converts it correctly.

There is a difference between OpenAIResponsesChatClient.GetStreamingResponseAsync and OpenAIResponsesChatClient.GetResponseAsync when converting OpenAI types to MEAI types.

In non-streaming mode, ToChatMessage correctly converts FunctionCallOutputResponseItem to FunctionResultContent.

But the conversion is absent in streaming mode: FromOpenAIStreamingResponseUpdatesAsync does not handle FunctionCallOutputResponseItem, so the result falls back to a generic AIContent.

That means the two snippets below produce different results when there is a tool call output.

var res1 = await chatClient
        .GetStreamingResponseAsync(messages)
        .ToChatResponseAsync();

gives:

[screenshot: the tool output surfaces as a generic AIContent]
var res2 = await chatClient.GetResponseAsync(messages);

gives:

[screenshot: the tool output surfaces as a FunctionResultContent]
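The divergence is also easy to observe by inspecting the streamed updates directly — a sketch assuming the chatClient and messages from the repro above:

```csharp
await foreach (var update in chatClient.GetStreamingResponseAsync(messages))
{
    foreach (var content in update.Contents)
    {
        // Before the fix, the function_call_output item surfaced here as a
        // plain AIContent; after the fix it is a FunctionResultContent.
        Console.WriteLine(content.GetType().Name);
    }
}
```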

Reproduction Steps

Here is a minimal repro: https://github.com/vsantele/dotnet-extensions-bug-repro-7306

I discovered this bug while using the Microsoft Agent Framework with the OpenAI Hosting layer. But I don't think MAF is related to this issue.

Expected behavior

If there is a response.output_item.done event with a function_call_output item in a streamed response, a FunctionResultContent is present in the response from an IChatClient.

Actual behavior

This event is not converted to a FunctionResultContent but to a generic AIContent.

Regression?

I don't know

Known Workarounds

No response

Configuration

dotnet: 10.0.103
Microsoft.Extensions.AI.Abstractions: 10.3.0
Microsoft.Extensions.AI.OpenAI: 10.3.0

OS: Windows 11

Other information

No response</issue_description>


Add a case for FunctionCallOutputResponseItem in the
StreamingResponseOutputItemDoneUpdate switch in
FromOpenAIStreamingResponseUpdatesAsync, matching the non-streaming
ToChatMessage behavior. This ensures streaming responses correctly
produce FunctionResultContent instead of falling through to the
default generic AIContent case.

Fixes #7306

Co-authored-by: stephentoub <[email protected]>
Copilot AI changed the title from "[WIP] Fix streaming response handling for FunctionCallOutputResponseItem" to "Handle FunctionCallOutputResponseItem in streaming response conversion" Feb 16, 2026
Copilot AI requested a review from stephentoub February 16, 2026 00:39
@github-actions github-actions bot added the area-ai Microsoft.Extensions.AI libraries label Feb 16, 2026
@stephentoub
Member

@copilot, if the server sends the FunctionCallContent and FunctionResultContent, is it possible for the chat client to mark the call content as informational only so that the client doesn't try to invoke it?

Contributor Author

Copilot AI commented Feb 16, 2026


Yes, FunctionCallContent has an InformationalOnly property for exactly this purpose. However, neither path (streaming nor non-streaming) currently sets it — the non-streaming ToChatMessage at lines 198-201 also creates FunctionCallContent with the default InformationalOnly = false. So this would be a separate change that applies to both paths. Would you like me to address that here as well, or should it be tracked separately?
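Marking the call as informational might look like the following — a sketch of a possible follow-up, not code from this PR; the constructor shape and the callItem member names are assumptions:

```csharp
// Sketch: when the history already carries the matching function_call_output,
// flag the converted call so client-side function invocation skips it.
var callContent = new FunctionCallContent(callItem.CallId, callItem.FunctionName, arguments)
{
    InformationalOnly = true, // property named in the discussion above
};
```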

@stephentoub
Member

Will handle it in a followup

@stephentoub stephentoub marked this pull request as ready for review February 16, 2026 23:18
@stephentoub stephentoub requested a review from a team as a code owner February 16, 2026 23:18
Copilot AI review requested due to automatic review settings February 16, 2026 23:18
@stephentoub stephentoub enabled auto-merge (squash) February 16, 2026 23:19
@stephentoub stephentoub requested a review from jozkee February 16, 2026 23:19
Contributor

Copilot AI left a comment


Pull request overview

Aligns the OpenAI Responses streaming conversion path with the existing non-streaming conversion by correctly mapping function_call_output items to FunctionResultContent, preventing consumers from seeing inconsistent content types between GetStreamingResponseAsync and GetResponseAsync.

Changes:

  • Add handling for FunctionCallOutputResponseItem in FromOpenAIStreamingResponseUpdatesAsync so response.output_item.done yields FunctionResultContent.
  • Add a unit test that streams a function_call_output item and asserts a FunctionResultContent is produced.

Reviewed changes

Copilot reviewed 2 out of 2 changed files in this pull request and generated 1 comment.

File — Description
src/Libraries/Microsoft.Extensions.AI.OpenAI/OpenAIResponsesChatClient.cs — Adds the streaming conversion case for FunctionCallOutputResponseItem to yield FunctionResultContent.
test/Libraries/Microsoft.Extensions.AI.OpenAI.Tests/OpenAIResponseClientTests.cs — Adds a streaming test validating that function_call_output becomes FunctionResultContent.

…put, validate ToChatResponse, use null coalescing for lastRole

Co-authored-by: stephentoub <[email protected]>
auto-merge was automatically disabled February 16, 2026 23:37

Head branch was pushed to by a user without write access

Copilot AI requested a review from stephentoub February 16, 2026 23:40
@stephentoub stephentoub enabled auto-merge (squash) February 19, 2026 20:50
@stephentoub stephentoub merged commit ac9052d into main Feb 19, 2026
6 checks passed
@stephentoub stephentoub deleted the copilot/fix-streaming-response-conversion branch February 19, 2026 22:00

Labels

area-ai Microsoft.Extensions.AI libraries

Projects

None yet

Development

Successfully merging this pull request may close these issues.

[MEAI]: OpenAIResponsesChatClient.GetStreamingResponseAsync does not handle FunctionCallOutputResponseItem

4 participants