
Is it possible to use both local tools and mcp server tools? #797

Closed
@NCCYUNSONG

Description


In my agent settings I configured both local tools and MCP server tools, and set model_settings=ModelSettings(temperature=TEMPERATURE, tool_choice="required"), but tracing showed that only the local tools were used. Is it possible to use both local tools and MCP server tools?

Activity

rm-openai (Collaborator) commented on Jun 2, 2025

Yes, it is. tool_choice="required" tells the model that it must call some tool, but leaves the choice of which tool up to the model. You can strengthen your prompt to guide the model, or use an input that clearly requires the MCP tool.

Also note that the MCP server tool is called as part of the model turn, so it will show up inside the call to the Responses API in the tracing UI (i.e. the row that says POST /v1/responses).
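A minimal sketch of an agent combining both kinds of tools, assuming the openai-agents-python API (Agent, Runner, function_tool, MCPServerStdio). The filesystem MCP server command and the get_weather tool are illustrative placeholders; substitute your own server and tools. Running this requires the agents package and a configured OpenAI API key.

```python
import asyncio

from agents import Agent, ModelSettings, Runner, function_tool
from agents.mcp import MCPServerStdio


@function_tool
def get_weather(city: str) -> str:
    """Return a canned weather report for the given city (local tool)."""
    return f"It is always sunny in {city}."


async def main() -> None:
    # Example MCP server launched over stdio; swap in your own command.
    async with MCPServerStdio(
        params={
            "command": "npx",
            "args": ["-y", "@modelcontextprotocol/server-filesystem", "."],
        }
    ) as fs_server:
        agent = Agent(
            name="Assistant",
            instructions=(
                "Use get_weather for weather questions. "
                "Use the filesystem tools to read or list files."
            ),
            tools=[get_weather],        # local function tools
            mcp_servers=[fs_server],    # tools exported by the MCP server
            model_settings=ModelSettings(tool_choice="required"),
        )
        # An input that clearly requires an MCP tool nudges the model
        # toward it; tool_choice="required" alone does not pick the tool.
        result = await Runner.run(agent, "List the files in this directory.")
        print(result.final_output)


if __name__ == "__main__":
    asyncio.run(main())
```

With tool_choice="required" the model is forced to call a tool on each turn, but whether it picks get_weather or an MCP filesystem tool still depends on the prompt and the input.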

devtalker commented on Jun 9, 2025

You need to check whether the request sent to the LLM API includes all of your tools. If it does, there's no issue: which tool gets selected depends on your prompt and the LLM's decision-making, not on the Agents SDK itself.

NCCYUNSONG (Author) commented on Jun 9, 2025

Thanks a lot. I found the reason: the prompt was not strong enough to make the model use the MCP tools.


Labels: question (Question about using the SDK)
Issue #797 · openai/openai-agents-python