
litellm proxy with openai-agents #696

Not planned

Description



openai-agents version = 0.0.14

Error

66.04 ERROR: Cannot install litellm>=1.67.4.post1, litellm[proxy]==1.67.4.post1, litellm[proxy]==1.67.5, litellm[proxy]==1.67.6, litellm[proxy]==1.68.0, litellm[proxy]==1.68.1, litellm[proxy]==1.68.2, litellm[proxy]==1.69.0, litellm[proxy]==1.69.1, litellm[proxy]==1.69.2 and openai-agents because these package versions have conflicting dependencies.
66.04
66.04 The conflict is caused by:
66.04     openai-agents 0.0.14 depends on mcp<2 and >=1.6.0; python_version >= "3.10"
66.04     litellm[proxy] 1.69.2 depends on mcp==1.5.0; python_version >= "3.10" and extra == "proxy"
66.04     litellm[proxy] 1.69.1 depends on mcp==1.5.0; python_version >= "3.10" and extra == "proxy"
66.04     litellm[proxy] 1.69.0 depends on mcp==1.5.0; python_version >= "3.10" and extra == "proxy"
66.04     litellm[proxy] 1.68.2 depends on mcp==1.5.0; python_version >= "3.10" and extra == "proxy"
66.04     litellm[proxy] 1.68.1 depends on mcp==1.5.0; python_version >= "3.10" and extra == "proxy"
66.04     litellm[proxy] 1.68.0 depends on mcp==1.5.0; python_version >= "3.10" and extra == "proxy"
66.04     litellm[proxy] 1.67.6 depends on mcp==1.5.0; python_version >= "3.10" and extra == "proxy"
66.04     The user requested litellm>=1.67.4.post1
66.04     litellm[proxy] 1.67.5 depends on litellm 1.67.5 (from https://files.pythonhosted.org/packages/99/dc/e4db6b72347446893cab8599b0a043b8883e3380ce5d86a17d4e71aaffbd/litellm-1.67.5-py3-none-any.whl (from https://pypi.org/simple/litellm/) (requires-python:!=2.7.*,!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*,!=3.7.*,>=3.8))
66.04     litellm[proxy] 1.67.4.post1 depends on mcp==1.5.0; python_version >= "3.10" and extra == "proxy"
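
For illustration only (not from the original report): the conflict can be verified directly with the `packaging` library, since openai-agents 0.0.14 pins `mcp>=1.6.0,<2` while every litellm[proxy] release listed above pins `mcp==1.5.0`, so no single mcp version can satisfy both.

# Sketch: confirm that the two mcp pins have an empty intersection.
# Requires `pip install packaging`; the candidate versions are examples.
from packaging.specifiers import SpecifierSet
from packaging.version import Version

agents_pin = SpecifierSet(">=1.6.0,<2")  # openai-agents 0.0.14 -> mcp
proxy_pin = SpecifierSet("==1.5.0")      # litellm[proxy] releases above -> mcp

candidates = [Version(v) for v in ("1.5.0", "1.6.0", "1.8.0", "1.9.0")]
both = [str(v) for v in candidates if v in agents_pin and v in proxy_pin]
print(both)  # [] -- the intersection is empty, so pip's resolver must fail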

Current versions

openai-agents[litellm]>=0.0.14
litellm[proxy]==1.67.4.post1

Are there any fixes?
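
One workaround worth noting (my own suggestion, not something proposed in the thread): keep litellm[proxy] in its own environment or container and let openai-agents talk to it over the proxy's OpenAI-compatible endpoint, so the two packages never need to be co-installed. A minimal sketch, assuming a proxy already running at http://localhost:4000 with a hypothetical model name and virtual key:

# Sketch of pointing openai-agents at a separately deployed litellm proxy.
# Assumes: proxy at http://localhost:4000 (litellm's default port), a model
# named "gpt-4o" configured on the proxy, and LITELLM_API_KEY set in the
# environment -- all of these are illustrative placeholders.
import os

from openai import AsyncOpenAI
from agents import Agent, OpenAIChatCompletionsModel, Runner

proxy_client = AsyncOpenAI(
    base_url="http://localhost:4000",       # the proxy's OpenAI-compatible API
    api_key=os.environ["LITELLM_API_KEY"],  # proxy virtual key
)

agent = Agent(
    name="Assistant",
    instructions="You are a helpful assistant.",
    model=OpenAIChatCompletionsModel(model="gpt-4o", openai_client=proxy_client),
)

result = Runner.run_sync(agent, "Say hello.")
print(result.final_output)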

Activity

sumit-lightringai (Author) commented on May 15, 2025

openai-agents[litellm]>=0.0.7 works with litellm[proxy] (or vice versa), but that version is not production quality.

rm-openai (Collaborator) commented on May 15, 2025

@sumit-lightringai My first instinct is to see if we can update the mcp version in the litellm proxy. Thoughts? mcp 1.8.0 adds the new MCP streamable HTTP server.

github-actions commented on May 23, 2025

This issue is stale because it has been open for 7 days with no activity.

github-actions commented on May 26, 2025

This issue was closed because it has been inactive for 3 days since being marked as stale.


Metadata

Assignees: No one assigned
Labels: question (Question about using the SDK), stale
Type: No type
Projects: No projects
Milestone: No milestone
Relationships: None yet
Development: No branches or pull requests
Participants: @rm-openai, @sumit-lightringai
