`ValidationError` from `InputTokensDetails` when using `LitellmModel` with `None` cached_tokens #760

Open
@DanielHashmi

Description

Please read this first

  • Have you read the docs? (Agents SDK docs) — Yes
  • Have you searched for related issues? Others may have faced similar issues. — Yes

Describe the bug

When running the sample agent code using LitellmModel, the following validation error occurs during runtime:

pydantic_core._pydantic_core.ValidationError: 1 validation error for InputTokensDetails
cached_tokens
  Input should be a valid integer [type=int_type, input_value=None, input_type=NoneType]
    For further information visit https://errors.pydantic.dev/2.11/v/int_type
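The failure comes from pydantic v2's strict `int` validation: a field annotated as `int` rejects `None` outright. A minimal sketch of the same class of error, using a hypothetical model that mirrors the `cached_tokens: int` field of `InputTokensDetails`:

```python
from pydantic import BaseModel, ValidationError


# Hypothetical stand-in for InputTokensDetails' int field.
class TokenDetails(BaseModel):
    cached_tokens: int


try:
    # Passing None where an int is required raises the same
    # "Input should be a valid integer" error as in the traceback.
    TokenDetails(cached_tokens=None)
except ValidationError as e:
    print("validation failed:", e.errors()[0]["type"])  # int_type
```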

Debug information

  • Agents SDK version: (e.g. v0.0.16)
  • Python version (e.g. Python 3.12.1)

Repro steps

  1. Use the following minimal code:
from __future__ import annotations
import asyncio
from agents import Agent, Runner, function_tool
from agents.extensions.models.litellm_model import LitellmModel

@function_tool
def get_weather(city: str):
    print(f"[debug] getting weather for {city}")
    return f"The weather in {city} is sunny."

async def main(model: str, api_key: str):
    agent = Agent(
        name="Assistant",
        instructions="You only respond in haikus.",
        model=LitellmModel(model=model, api_key=api_key),
        tools=[get_weather],
    )

    result = await Runner.run(agent, "What's the weather in Tokyo?")
    print(result.final_output)

if __name__ == "__main__":
    import argparse
    parser = argparse.ArgumentParser()
    parser.add_argument("--model", type=str, required=False)
    parser.add_argument("--api-key", type=str, required=False)
    args = parser.parse_args()

    model = args.model or input("Enter a model name for Litellm: ")
    api_key = args.api_key or input("Enter an API key for Litellm: ")

    asyncio.run(main(model, api_key))
  2. Run the script and provide valid inputs for model and API key.
  3. Observe the traceback error.

Screenshots

(screenshot of the ValidationError traceback)

Expected behavior

[debug] getting weather for Tokyo
The weather in Tokyo is sunny.

Activity

DavidN22

DavidN22 commented on May 27, 2025

@DavidN22

In the litellm_model.py file there's a line of code like:

cached_tokens=getattr(response_usage.prompt_tokens_details, "cached_tokens", 0)

You can try changing it to:

cached_tokens=int(getattr(response_usage.prompt_tokens_details, "cached_tokens", 0) or 0)

The field requires an int, but the attribute can be None, and InputTokensDetails takes an int, not None. This is probably a bug honestly, or maybe an older version of litellm; I ran into this as well.
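The distinction matters because `getattr`'s default only applies when the attribute is *missing*, not when it exists with a value of `None`. A minimal sketch, using `SimpleNamespace` as a hypothetical stand-in for litellm's `prompt_tokens_details` object:

```python
from types import SimpleNamespace

# Hypothetical stand-in for litellm's prompt_tokens_details:
# the attribute exists, but its value is None.
details = SimpleNamespace(cached_tokens=None)

# The default 0 is ignored because the attribute is present,
# so this still returns None and later fails pydantic's int check.
broken = getattr(details, "cached_tokens", 0)
print(broken)  # None

# Appending `or 0` coerces None (a falsy value) to 0 before validation.
fixed = getattr(details, "cached_tokens", 0) or 0
print(fixed)  # 0
```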

DanielHashmi

DanielHashmi commented on May 27, 2025

@DanielHashmi
Author

In the source code of the OpenAI Agents SDK, in the litellm_model.py file, this line is already as you said; go check: https://github.com/openai/openai-agents-python/blob/main/src/agents/extensions/models/litellm_model.py

Look for this:

input_tokens_details=InputTokensDetails(cached_tokens=getattr(response_usage.prompt_tokens_details, "cached_tokens", 0) or 0),

The only difference is that it doesn't cast the value to an int.

They have actually fixed it, but the latest changes are not what gets installed, maybe because the PyPI release hasn't been updated yet.


Metadata

Assignees: no one assigned
Labels: bug (Something isn't working)
Milestone: none
Participants: @DavidN22, @DanielHashmi
