
LiteLLM with Gemini token issue · Issue #734 · openai/openai-agents-python

Closed

@handrew (Contributor)

Description

Using Gemini with LiteLLM, I get the following error:

  File "/Users/handrew/env/lib/python3.11/site-packages/agents/extensions/models/litellm_model.py", line 111, in get_response
    input_tokens_details=InputTokensDetails(
                         ^^^^^^^^^^^^^^^^^^^
  File "/Users/handrew/env/lib/python3.11/site-packages/pydantic/main.py", line 253, in __init__
    validated_self = self.__pydantic_validator__.validate_python(data, self_instance=self)
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
pydantic_core._pydantic_core.ValidationError: 1 validation error for InputTokensDetails
cached_tokens
  Input should be a valid integer [type=int_type, input_value=None, input_type=NoneType]

The issue is that getattr returns None when the attribute exists but its value is literally None, so getattr's default argument never applies. The fix is simply to append "or 0" to the lookup.
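A minimal sketch of the failure mode and the fix. The Usage class below is a stand-in for LiteLLM's response usage object, not the real class; it only reproduces the relevant shape (an attribute that is present but set to None):

```python
class Usage:
    """Hypothetical stand-in for a LiteLLM usage object."""
    cached_tokens = None  # attribute exists, but its value is None


usage = Usage()

# getattr's default only kicks in when the attribute is *missing*.
# Since cached_tokens is present (just None), the default 0 is ignored:
broken = getattr(usage, "cached_tokens", 0)
assert broken is None  # this None is what Pydantic rejects as int

# Appending "or 0" coerces the None case to 0 before validation:
fixed = getattr(usage, "cached_tokens", 0) or 0
assert fixed == 0
```

Passing `fixed` instead of `broken` into InputTokensDetails would satisfy Pydantic's int validator.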

I will write a PR.


Metadata

Labels: bug (Something isn't working)
Participants: @handrew, @rm-openai
