Open
Description
Please read this first
- Have you read the docs? (Agents SDK docs) Yes
- Have you searched for related issues? Others may have faced similar issues. Yes
Describe the bug
When running the sample agent code using LitellmModel, the following validation error occurs at runtime:
```
pydantic_core._pydantic_core.ValidationError: 1 validation error for InputTokensDetails
cached_tokens
  Input should be a valid integer [type=int_type, input_value=None, input_type=NoneType]
    For further information visit https://errors.pydantic.dev/2.11/v/int_type
```
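For reference, the failure can be reproduced in isolation with a plain pydantic model whose field is typed as `int`; this is a minimal sketch that only mirrors the shape of InputTokensDetails, not the SDK class itself:

```python
from pydantic import BaseModel, ValidationError


# Stand-in with the same int-typed field; NOT the SDK's InputTokensDetails.
class InputTokensDetailsSketch(BaseModel):
    cached_tokens: int


try:
    # litellm can report cached_tokens as None, which fails int validation.
    InputTokensDetailsSketch(cached_tokens=None)
except ValidationError as exc:
    print(exc)  # Input should be a valid integer ... input_value=None
```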
Debug information
- Agents SDK version: (e.g. v0.0.16)
- Python version: (e.g. Python 3.12.1)
Repro steps
- Use the following minimal code:
```python
from __future__ import annotations

import asyncio

from agents import Agent, Runner, function_tool
from agents.extensions.models.litellm_model import LitellmModel


@function_tool
def get_weather(city: str):
    print(f"[debug] getting weather for {city}")
    return f"The weather in {city} is sunny."


async def main(model: str, api_key: str):
    agent = Agent(
        name="Assistant",
        instructions="You only respond in haikus.",
        model=LitellmModel(model=model, api_key=api_key),
        tools=[get_weather],
    )

    result = await Runner.run(agent, "What's the weather in Tokyo?")
    print(result.final_output)


if __name__ == "__main__":
    import argparse

    parser = argparse.ArgumentParser()
    parser.add_argument("--model", type=str, required=False)
    parser.add_argument("--api-key", type=str, required=False)
    args = parser.parse_args()

    model = args.model or input("Enter a model name for Litellm: ")
    api_key = args.api_key or input("Enter an API key for Litellm: ")

    asyncio.run(main(model, api_key))
```
- Run the script and provide valid inputs for model and API key.
- Observe the traceback error.
Screenshots
Expected behavior
```
[debug] getting weather for Tokyo
The weather in Tokyo is sunny.
```
Activity
DavidN22 commented on May 27, 2025
In the litellm_model.py library there's a line of code that sets cached_tokens on InputTokensDetails; you can try changing it so that a None value falls back to an int, since InputTokensDetails takes an int, not None, and litellm is returning None there. This is probably a bug honestly, or maybe an older version of litellm. I did run into this as well.
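A minimal sketch of that point (illustrative names, not the actual library code): `getattr`'s default only applies when the attribute is missing, not when it exists with the value None, so a plain `getattr(..., "cached_tokens", 0)` can still hand None to InputTokensDetails.

```python
from types import SimpleNamespace

# litellm can return a prompt_tokens_details object whose cached_tokens is None.
prompt_tokens_details = SimpleNamespace(cached_tokens=None)

# The default only kicks in when the attribute is missing, so this is still None:
print(getattr(prompt_tokens_details, "cached_tokens", 0))        # None

# An explicit fallback coerces None to 0 before it reaches the pydantic model:
print(getattr(prompt_tokens_details, "cached_tokens", 0) or 0)   # 0
```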
DanielHashmi commented on May 27, 2025
In the source code of the OpenAI Agents SDK, in the litellm_model.py file, this line is already as you said; go check: https://github.com/openai/openai-agents-python/blob/main/src/agents/extensions/models/litellm_model.py
Look for this:
```python
input_tokens_details=InputTokensDetails(
    cached_tokens=getattr(
        response_usage.prompt_tokens_details, "cached_tokens", 0
    )
    or 0
),
```
The only difference is that it's not type casting it to an int.
They have actually fixed it, but the latest changes are not being installed, maybe because the PyPI version hasn't been updated yet!
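If you want to check whether your installed copy already contains that `or 0` fallback, you can inspect the installed module directly; a small sketch (it just prints the module path and whether the fallback text appears in the source):

```python
import inspect

from agents.extensions.models import litellm_model

source = inspect.getsource(litellm_model)
print(litellm_model.__file__)                          # where the installed copy lives
print("cached_tokens" in source and "or 0" in source)  # True if the fallback is present
```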