Commit 8381e50 · authored Jun 27, 2024

Python: Introduce the new function calling abstraction, FunctionChoiceBehavior (microsoft#6910)
### Motivation and Context

The current `FunctionCallBehavior` has allowed us to use auto function calling with OpenAI-type models. As we proceed to support more AI connectors that differ from the OpenAI models, we need to be able to handle functions for all models that support function calling.

### Description

In this PR:

- We introduce a new function calling abstraction called `FunctionChoiceBehavior`, which has three types: Auto, Required, and None.
- The `function_choice_behavior` can be configured in yaml and JSON prompts, along with fully qualified function names (e.g., `plugin1.function1`), `maximum_auto_invoke_attempts`, or `auto_invoke_kernel_functions`. A new concept example shows how to do this for both yaml and JSON prompts.
- If fully qualified names are specified in the config file, they take precedence over any filters specified at a later point.
- To make sure this isn't a breaking change, we still handle the previous `FunctionCallBehavior`, but we map it to `FunctionChoiceBehavior` so that decisions are made on the new abstraction. Each time `FunctionCallBehavior` is updated, `FunctionChoiceBehavior` will be updated, too.
- The `_process_tool_call()` method in `open_ai_chat_completion_base` needs to keep the argument name `function_call_behavior`, as we cannot introduce a breaking change there.
- All concept samples have been converted to use `FunctionChoiceBehavior`.
- New unit tests have been added for `FunctionChoiceBehavior`, and the existing `FunctionCallBehavior` tests still run to make sure we haven't broken backwards compatibility.
- Added `deprecated` typing decorators to classes/methods to alert users that it is best to transition to `FunctionChoiceBehavior`, even though `FunctionCallBehavior` is still supported.
- The `FunctionCallingStepwisePlanner` was updated to use the new `FunctionChoiceBehavior`.
- Closes microsoft#6496, microsoft#6458
- Addresses microsoft#6626

### Contribution Checklist

- [x] The code builds clean without any errors or warnings
- [x] The PR follows the [SK Contribution Guidelines](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md) and the [pre-submission formatting script](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md#development-scripts) raises no violations
- [x] All unit tests pass, and I have added new tests where possible
- [x] I didn't break anyone 😄
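The precedence rule described above (fully qualified names in the prompt config win over filters applied at a later point) can be sketched as a toy model. This is an illustration only, not the semantic-kernel implementation: `resolve_functions` and its parameters are hypothetical names invented for this sketch.

```python
def resolve_functions(available, qualified_names=None, filters=None):
    """Toy model: pick which functions are advertised to the model.

    `available` holds fully qualified names like "plugin.function".
    Names pinned in the yaml/JSON prompt config (`qualified_names`)
    take precedence; filters are only consulted when none are pinned.
    """
    if qualified_names:
        return [f for f in available if f in qualified_names]
    filters = filters or {}
    selected = list(available)
    if "included_plugins" in filters:
        selected = [f for f in selected if f.split(".")[0] in filters["included_plugins"]]
    if "excluded_plugins" in filters:
        selected = [f for f in selected if f.split(".")[0] not in filters["excluded_plugins"]]
    return selected


available = ["math.Add", "time.date", "ChatBot.Chat"]
# The config pinned "math.Add", so a later filter excluding "math" is ignored.
print(resolve_functions(available, qualified_names=["math.Add"],
                        filters={"excluded_plugins": ["math"]}))  # ['math.Add']
```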
1 parent 4f945a3 commit 8381e50

File tree

41 files changed: +1875 -345 lines

 

python/samples/concepts/auto_function_calling/azure_python_code_interpreter_function_calling.py (+2 -3)

```diff
@@ -7,7 +7,7 @@
 from azure.core.exceptions import ClientAuthenticationError
 from azure.identity import DefaultAzureCredential
 
-from semantic_kernel.connectors.ai.function_call_behavior import FunctionCallBehavior
+from semantic_kernel.connectors.ai.function_choice_behavior import FunctionChoiceBehavior
 from semantic_kernel.connectors.ai.open_ai.prompt_execution_settings.azure_chat_prompt_execution_settings import (
     AzureChatPromptExecutionSettings,
 )
@@ -69,8 +69,7 @@ async def auth_callback() -> str:
 
 req_settings = AzureChatPromptExecutionSettings(service_id=service_id, tool_choice="auto")
 
-filter = {"excluded_plugins": ["ChatBot"]}
-req_settings.function_call_behavior = FunctionCallBehavior.EnableFunctions(auto_invoke=True, filters=filter)
+req_settings.function_choice_behavior = FunctionChoiceBehavior.Auto(filters={"excluded_plugins": ["ChatBot"]})
 
 arguments = KernelArguments(settings=req_settings)
 
```
python/samples/concepts/auto_function_calling/chat_gpt_api_function_calling.py (+36 -16)

```diff
@@ -6,7 +6,6 @@
 from typing import TYPE_CHECKING
 
 from semantic_kernel import Kernel
-from semantic_kernel.connectors.ai.function_call_behavior import FunctionCallBehavior
 from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion, OpenAIChatPromptExecutionSettings
 from semantic_kernel.contents import ChatHistory
 from semantic_kernel.contents.chat_message_content import ChatMessageContent
@@ -32,16 +31,17 @@
 you will return a full answer to me as soon as possible.
 """
 
+# This concept example shows how to handle both streaming and non-streaming responses
+# To toggle the behavior, set the following flag accordingly:
+stream = True
+
 kernel = Kernel()
 
 # Note: the underlying gpt-35/gpt-4 model version needs to be at least version 0613 to support tools.
 kernel.add_service(OpenAIChatCompletion(service_id="chat"))
 
 plugins_directory = os.path.join(__file__, "../../../../../prompt_template_samples/")
 # adding plugins to the kernel
-# the joke plugin in the FunPlugins is a semantic plugin and has the function calling disabled.
-# kernel.import_plugin_from_prompt_directory("chat", plugins_directory, "FunPlugin")
-# the math plugin is a core plugin and has the function calling enabled.
 kernel.add_plugin(MathPlugin(), plugin_name="math")
 kernel.add_plugin(TimePlugin(), plugin_name="time")
 
@@ -50,11 +50,29 @@
     plugin_name="ChatBot",
     function_name="Chat",
 )
-# enabling or disabling function calling is done by setting the function_call parameter for the completion.
-# when the function_call parameter is set to "auto" the model will decide which function to use, if any.
-# if you only want to use a specific function, set the name of that function in this parameter,
-# the format for that is 'PluginName-FunctionName', (i.e. 'math-Add').
-# if the model or api version does not support this you will get an error.
+
+# Enabling or disabling function calling is done by setting the `function_choice_behavior` attribute for the
+# prompt execution settings. When the function_call parameter is set to "auto" the model will decide which
+# function to use, if any.
+#
+# There are two ways to define the `function_choice_behavior` parameter:
+# 1. Using the type string as `"auto"`, `"required"`, or `"none"`. For example:
+#    configure `function_choice_behavior="auto"` parameter directly in the execution settings.
+# 2. Using the FunctionChoiceBehavior class. For example:
+#    `function_choice_behavior=FunctionChoiceBehavior.Auto()`.
+# Both of these configure the `auto` tool_choice and all of the available plugins/functions
+# registered on the kernel. If you want to limit the available plugins/functions, you must
+# configure the `filters` dictionary attribute for each type of function choice behavior.
+# For example:
+#
+# from semantic_kernel.connectors.ai.function_choice_behavior import FunctionChoiceBehavior
+
+# function_choice_behavior = FunctionChoiceBehavior.Auto(
+#     filters={"included_functions": ["time-date", "time-time", "math-Add"]}
+# )
+#
+# The filters attribute allows you to specify either: `included_functions`, `excluded_functions`,
+# `included_plugins`, or `excluded_plugins`.
 
 # Note: the number of responses for auto invoking tool calls is limited to 1.
 # If configured to be greater than one, this value will be overridden to 1.
@@ -63,9 +81,7 @@
     max_tokens=2000,
     temperature=0.7,
     top_p=0.8,
-    function_call_behavior=FunctionCallBehavior.EnableFunctions(
-        auto_invoke=True, filters={"included_plugins": ["math", "time"]}
-    ),
+    function_choice_behavior="auto",
 )
 
 history = ChatHistory()
@@ -93,7 +109,10 @@ def print_tool_calls(message: ChatMessageContent) -> None:
             f"tool_call {i} arguments: {function_arguments}"
         )
         formatted_tool_calls.append(formatted_str)
-    print("Tool calls:\n" + "\n\n".join(formatted_tool_calls))
+    if len(formatted_tool_calls) > 0:
+        print("Tool calls:\n" + "\n\n".join(formatted_tool_calls))
+    else:
+        print("The model used its own knowledge and didn't return any tool calls.")
 
 
 async def handle_streaming(
@@ -110,7 +129,7 @@ async def handle_streaming(
     print("Mosscap:> ", end="")
     streamed_chunks: list[StreamingChatMessageContent] = []
     async for message in response:
-        if not execution_settings.function_call_behavior.auto_invoke_kernel_functions and isinstance(
+        if not execution_settings.function_choice_behavior.auto_invoke_kernel_functions and isinstance(
             message[0], StreamingChatMessageContent
         ):
             streamed_chunks.append(message[0])
@@ -119,6 +138,8 @@
 
     if streamed_chunks:
         streaming_chat_message = reduce(lambda first, second: first + second, streamed_chunks)
+        if hasattr(streaming_chat_message, "content"):
+            print(streaming_chat_message.content)
         print("Auto tool calls is disabled, printing returned tool calls...")
         print_tool_calls(streaming_chat_message)
 
@@ -141,7 +162,6 @@ async def chat() -> bool:
     arguments["user_input"] = user_input
     arguments["chat_history"] = history
 
-    stream = True
     if stream:
         await handle_streaming(kernel, chat_function, arguments=arguments)
     else:
@@ -151,7 +171,7 @@
         # ChatMessageContent with information about the tool calls, which need to be sent
         # back to the model to get the final response.
         function_calls = [item for item in result.value[-1].items if isinstance(item, FunctionCallContent)]
-        if not execution_settings.function_call_behavior.auto_invoke_kernel_functions and len(function_calls) > 0:
+        if not execution_settings.function_choice_behavior.auto_invoke_kernel_functions and len(function_calls) > 0:
             print_tool_calls(result.value[0])
     return True
```
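The streaming handler in the diff above merges accumulated chunks with `functools.reduce` over the `+` operator that `StreamingChatMessageContent` overloads. The same fold can be shown standalone using plain strings, which support `+` the same way (the chunk values here are made up for illustration):

```python
from functools import reduce

# Stand-in for the chunks collected from the async response loop in
# handle_streaming; plain strings concatenate with `+` just like the
# StreamingChatMessageContent objects do in the sample.
streamed_chunks = ["The ", "quick ", "brown ", "fox."]

if streamed_chunks:
    # Fold the chunks left-to-right into one merged message.
    merged = reduce(lambda first, second: first + second, streamed_chunks)
    print(merged)  # The quick brown fox.
```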
