
[BUG] Responses: Can't propagate a reasoning item from previous interaction's output into next interaction's input (serialization of 'id') #779

@trrwilson

Description


Describe the bug

This is borderline a feature request: don't serialize the effectively OAS-read-only id (and status) when a ResponseItem is used as an input model.

First, REST API description

Items provided by the output of a service response include a service-assigned id and status; these are typically just ignored if they're round-tripped back to the service as input, but there's at least one painful exception in the form of reasoning items:

curl "https://api.openai.com/v1/responses" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-5-mini",
    "input":[
      {"type":"message","role":"user","content":"hello, world!"},
      {
        "id": "rs_089e8b550e4db875016900178c0b04819c98aed44cccd5465d",
        "type": "reasoning",
        "summary": []
      },
      {
        "id": "msg_089e8b550e4db875016900178d4ad0819cbb6b8b3287b6d9e7",
        "type": "message",
        "status": "completed",
        "content": [
          {
            "type": "output_text",
            "annotations": [],
            "logprobs": [],
            "text": "Hello! How can I help you today?"
          }
        ],
        "role": "assistant"
      },
      {"type":"message","role":"user","content":"try that again, but dramatically"}
    ],
    "store":false
  }'

The above, in which the output from the first turn is provided as input to the second, yields an error:

{
  "error": {
    "message": "Item with id 'rs_089e8b550e4db875016900178c0b04819c98aed44cccd5465d' not found. Items are not persisted when `store` is set to false. Try again with `store` set to true, or remove this item from your input.",
    "type": "invalid_request_error",
    "param": "input",
    "code": null
  }
}

This error disappears if id is removed from the reasoning item in the input (even while leaving the id on the message item, which behaves differently from "store":true...), indicating that there are at least some circumstances where we can't safely round-trip the ID value and assume it'll be ignored.
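For reference, this is the reasoning item the service accepts, reconstructed from the request above with only the id field removed (everything else in the request stays unchanged):

```json
{
  "type": "reasoning",
  "summary": []
}
```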

Now the .NET part

This validation makes multi-turn Responses use intensely problematic whenever reasoning items are involved and store is false -- it effectively blocks convenient round-tripping of output items. Here's a complete, standalone repro aligned with the REST call above:

// GUIDANCE: Instructions to run this code: https://aka.ms/oai/net/start
#:package OpenAI@2.*
#:property PublishAot=false
#:property NoWarn=OPENAI001

using OpenAI.Responses;

string apiKey = Environment.GetEnvironmentVariable("OPENAI_API_KEY")!;
OpenAIResponseClient client = new("gpt-5-mini", apiKey);

// We'll use store=false to demonstrate
ResponseCreationOptions options = new() { StoredOutputEnabled = false };

// First turn
List<ResponseItem> inputItems = [ResponseItem.CreateUserMessageItem("Hello, world!")];
OpenAIResponse response = await client.CreateResponseAsync(inputItems, options);

// Propagate the output items into the next turn, usually nice and elegant...
inputItems.AddRange(response.OutputItems);

// Add the next user input
inputItems.Add(ResponseItem.CreateUserMessageItem("Say that again, but dramatically"));

// Second turn... uh oh
response = await client.CreateResponseAsync(inputItems, options);

// Unhandled exception. System.ClientModel.ClientResultException: HTTP 404 (invalid_request_error: )
// Parameter: input

// Item with id 'rs_049fefcf835fe106016900197607748194b6c739b92a6940c5' not found. Items are not persisted when `store` is set to false. Try again with `store` set to true, or remove this item from your input.
//    at OpenAI.ClientPipelineExtensions.ProcessMessageAsync(ClientPipeline pipeline, PipelineMessage message, RequestOptions options)

Console.WriteLine(response.GetOutputText());

Suggested fix: don't serialize id (and maybe status)

Scrubbing clearly works, at least for the targeted case: replacing the simple .AddRange() call with the following eliminates the exception:

using System.ClientModel.Primitives;
using System.Text.Json.Nodes;

static ResponseItem GetScrubbedItem(ResponseItem originalItem)
{
    // Serialize the item, drop the service-assigned properties, then re-read it.
    JsonObject itemNode = JsonNode.Parse(ModelReaderWriter.Write(originalItem))!.AsObject();
    foreach (string removedKey in new[] { "id", "status" })
    {
        itemNode.Remove(removedKey);
    }
    return ModelReaderWriter.Read<ResponseItem>(BinaryData.FromString(itemNode.ToJsonString()))!;
}

// Propagate the output items into the next turn, usually nice and elegant...
foreach (ResponseItem outputItem in response.OutputItems)
{
    inputItems.Add(GetScrubbedItem(outputItem));
}

That's clearly horrible, though, and we'd be much better off if the library just didn't serialize the effectively "read-only" (per OpenAPI definition thereof) properties of the type.
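To make the scrub step itself concrete independent of the .NET serialization machinery, here's a minimal language-agnostic sketch in Python using only the stdlib. The helper name scrub_item is illustrative, not part of any SDK; it performs the same transformation as GetScrubbedItem above on the reasoning item from the REST example:

```python
import json

# Keys the service assigns on output. Per the issue, round-tripping the
# reasoning item's "id" back as input with store=false triggers a 404.
READONLY_KEYS = ("id", "status")

def scrub_item(item: dict) -> dict:
    """Return a copy of an output item with service-assigned keys removed."""
    return {k: v for k, v in item.items() if k not in READONLY_KEYS}

reasoning_item = json.loads("""
{
  "id": "rs_089e8b550e4db875016900178c0b04819c98aed44cccd5465d",
  "type": "reasoning",
  "summary": []
}
""")

print(json.dumps(scrub_item(reasoning_item)))
```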

Steps to reproduce

See description for integrated code snippets, including standalone repro. Sorry I didn't factor for the boxes.

Code snippets

OS

Windows (but platform-independent)

.NET version

10.0.100-preview.6.25358.103 (but version-independent)

Library version

2.5.0

Labels

- area: responses — This item is related to Responses
- feature-request — Category: A new feature or enhancement to an existing feature is being requested.
- needs-investigation — This issue needs investigation to determine a path forward.
