Describe the bug
When using reasoning with OpenAI's models and running with store=False, runs with reasoning fail because the SDK does not request reasoning.encrypted_content via the include parameter of the Responses API (https://platform.openai.com/docs/api-reference/responses/create#responses-create-include).
When you try to use it, you get this error:
Error getting response: Error code: 404 - {'error': {'message': "Item with id 'rs_68582a79cb90819a9167d188b812af9c0a03470b8e30fdc6' not found. Items are not persisted when `store` is set to false. Try again with `store` set to true, or remove this item from your input.", 'type': 'invalid_request_error', 'param': 'input', 'code': None}}. (request_id: req_b7bdc79b7db5caf9bea5f55ac43b02de)
There is no way to override this behavior; it is hard-coded in the Responses API wrapper used by the SDK.
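For context, the underlying Responses API already supports this: requesting include=["reasoning.encrypted_content"] alongside store=False returns the reasoning items with their encrypted content. A minimal sketch using the OpenAI Python client directly (model and prompt are illustrative, not from the SDK):

from openai import OpenAI

client = OpenAI()

# With store=False nothing is persisted server-side, so the encrypted
# reasoning content has to be requested explicitly via `include`.
response = client.responses.create(
    model="o3",
    input="What is 17 * 23?",
    reasoning={"effort": "low", "summary": "detailed"},
    include=["reasoning.encrypted_content"],
    store=False,
)

for item in response.output:
    if item.type == "reasoning":
        # This is the field the SDK would need to round-trip between turns.
        print(item.encrypted_content is not None)

The Agents SDK never adds this include value, so on the next model call it references the reasoning item only by id, which appears to be what triggers the 404 above.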
Debug information
- Agents SDK version: v0.0.18
Repro steps
Create an Agent that uses reasoning, with the following model settings:
from agents import Agent, ModelSettings
from openai.types.shared import Reasoning

agent = Agent(
    name="example_agent",
    tools=[some_tools...],
    model="o3",
    model_settings=ModelSettings(
        store=False,
        reasoning=Reasoning(effort="low", summary="detailed"),
    ),
    instructions="""some prompt""",
)
Expected behavior
Expecting reasoning.encrypted_content to be requested (via the include parameter) when store=False is set, and expecting the SDK to send the encrypted reasoning items back on subsequent steps so the conversation can continue with the reasoning content.
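As an illustration of that expected flow against the raw Responses API (the prompts, model name, and the exclude_none dump are assumptions of this sketch, not SDK behavior):

from openai import OpenAI

client = OpenAI()

def ask(history):
    # Every turn runs stateless: store=False plus an explicit request
    # for the encrypted reasoning content.
    return client.responses.create(
        model="o3",
        input=history,
        reasoning={"effort": "low", "summary": "detailed"},
        include=["reasoning.encrypted_content"],
        store=False,
    )

history = [{"role": "user", "content": "What is 17 * 23?"}]
first = ask(history)

# Replay the previous output items, including the encrypted reasoning
# item, so the model can pick up its own reasoning on the next turn.
history += [item.model_dump(exclude_none=True) for item in first.output]
history.append({"role": "user", "content": "Now double it."})

second = ask(history)
print(second.output_text)

This is roughly what the SDK would need to do internally between agent steps when store=False is configured.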