{
 "cells": [
  {
   "attachments": {},
   "cell_type": "markdown",
   "id": "692e361b",
   "metadata": {},
   "source": [
    "# How to run a prompt plugins from file\n",
    "\n",
    "Now that you're familiar with Kernel basics, let's see how the kernel allows you to run Prompt Plugins and Prompt Functions stored on disk.\n",
    "\n",
    "A Prompt Plugin is a collection of Semantic Functions, where each function is defined with natural language that can be provided with a text file.\n",
    "\n",
    "Refer to our [glossary](https://github.com/microsoft/semantic-kernel/blob/main/docs/GLOSSARY.md) for an in-depth guide to the terms.\n",
    "\n",
    "The repository includes some examples under the [samples](https://github.com/microsoft/semantic-kernel/tree/main/samples) folder.\n",
    "\n",
    "For instance, [this](../../../prompt_template_samples/FunPlugin/Joke/skprompt.txt) is the **Joke function** part of the **FunPlugin plugin**:\n"
   ]
  },
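  {
   "attachments": {},
   "cell_type": "markdown",
   "id": "1d9c4f2a",
   "metadata": {},
   "source": [
    "On disk, a plugin is simply a folder containing one sub-folder per function. As a sketch of the layout (the directory and file names below come from the samples folder; the tree itself is only illustrative):\n",
    "\n",
    "```\n",
    "prompt_template_samples/\n",
    "└── FunPlugin/\n",
    "    └── Joke/\n",
    "        ├── skprompt.txt   <- the natural-language prompt template\n",
    "        └── config.json    <- optional model and parameter settings\n",
    "```\n"
   ]
  },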
  {
   "cell_type": "markdown",
   "id": "3feecb6e",
   "metadata": {},
   "source": [
    "Import Semantic Kernel SDK from pypi.org"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "32187534",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Note: if using a virtual environment, do not run this cell\n",
    "%pip install -U semantic-kernel\n",
    "from semantic_kernel import __version__\n",
    "\n",
    "__version__"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "cc58d362",
   "metadata": {},
   "source": [
    "Initial configuration for the notebook to run properly."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "bc1bc941",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Make sure paths are correct for the imports\n",
    "\n",
    "import os\n",
    "import sys\n",
    "\n",
    "notebook_dir = os.path.abspath(\"\")\n",
    "parent_dir = os.path.dirname(notebook_dir)\n",
    "grandparent_dir = os.path.dirname(parent_dir)\n",
    "\n",
    "\n",
    "sys.path.append(grandparent_dir)"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "b5074884",
   "metadata": {},
   "source": [
    "### Configuring the Kernel\n",
    "\n",
    "Let's get started with the necessary configuration to run Semantic Kernel. For Notebooks, we require a `.env` file with the proper settings for the model you use. Create a new file named `.env` and place it in this directory. Copy the contents of the `.env.example` file from this directory and paste it into the `.env` file that you just created.\n",
    "\n",
    "**NOTE: Please make sure to include `GLOBAL_LLM_SERVICE` set to either OpenAI, AzureOpenAI, or HuggingFace in your .env file. If this setting is not included, the Service will default to AzureOpenAI.**\n",
    "\n",
    "#### Option 1: using OpenAI\n",
    "\n",
    "Add your [OpenAI Key](https://openai.com/product/) key to your `.env` file (org Id only if you have multiple orgs):\n",
    "\n",
    "```\n",
    "GLOBAL_LLM_SERVICE=\"OpenAI\"\n",
    "OPENAI_API_KEY=\"sk-...\"\n",
    "OPENAI_ORG_ID=\"\"\n",
    "OPENAI_CHAT_MODEL_ID=\"\"\n",
    "OPENAI_TEXT_MODEL_ID=\"\"\n",
    "OPENAI_EMBEDDING_MODEL_ID=\"\"\n",
    "```\n",
    "The names should match the names used in the `.env` file, as shown above.\n",
    "\n",
    "#### Option 2: using Azure OpenAI\n",
    "\n",
    "Add your [Azure Open AI Service key](https://learn.microsoft.com/azure/cognitive-services/openai/quickstart?pivots=programming-language-studio) settings to the `.env` file in the same folder:\n",
    "\n",
    "```\n",
    "GLOBAL_LLM_SERVICE=\"AzureOpenAI\"\n",
    "AZURE_OPENAI_API_KEY=\"...\"\n",
    "AZURE_OPENAI_ENDPOINT=\"https://...\"\n",
    "AZURE_OPENAI_CHAT_DEPLOYMENT_NAME=\"...\"\n",
    "AZURE_OPENAI_TEXT_DEPLOYMENT_NAME=\"...\"\n",
    "AZURE_OPENAI_EMBEDDING_DEPLOYMENT_NAME=\"...\"\n",
    "AZURE_OPENAI_API_VERSION=\"...\"\n",
    "```\n",
    "The names should match the names used in the `.env` file, as shown above.\n",
    "\n",
    "For more advanced configuration, please follow the steps outlined in the [setup guide](./CONFIGURING_THE_KERNEL.md)."
   ]
  },
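  {
   "cell_type": "markdown",
   "id": "2ab8e1c7",
   "metadata": {},
   "source": [
    "As a quick, optional sanity check that your `.env` file is in place, you can load it and read one value back. This sketch assumes the `python-dotenv` package is installed; the SDK reads these settings on its own, so this cell is not required:"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "3fc9d2b8",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Optional sanity check: confirm the .env file is readable from this directory.\n",
    "# Assumes python-dotenv is installed; the SDK loads these settings itself.\n",
    "import os\n",
    "\n",
    "from dotenv import load_dotenv\n",
    "\n",
    "load_dotenv()\n",
    "print(f\"GLOBAL_LLM_SERVICE: {os.getenv('GLOBAL_LLM_SERVICE')}\")"
   ]
  },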
  {
   "cell_type": "markdown",
   "id": "93d7361e",
   "metadata": {},
   "source": [
    "Let's move on to learning what prompts are and how to write them."
   ]
  },
  {
   "attachments": {},
   "cell_type": "markdown",
   "id": "f3ce1efe",
   "metadata": {},
   "source": [
    "```\n",
    "WRITE EXACTLY ONE JOKE or HUMOROUS STORY ABOUT THE TOPIC BELOW.\n",
    "JOKE MUST BE:\n",
    "- G RATED\n",
    "- WORKPLACE/FAMILY SAFE\n",
    "NO SEXISM, RACISM OR OTHER BIAS/BIGOTRY.\n",
    "BE CREATIVE AND FUNNY. I WANT TO LAUGH.\n",
    "+++++\n",
    "{{$input}}\n",
    "+++++\n",
    "```\n"
   ]
  },
  {
   "attachments": {},
   "cell_type": "markdown",
   "id": "afdb96d6",
   "metadata": {},
   "source": [
    "Note the special **`{{$input}}`** token, which is a variable that is automatically passed when invoking the function, commonly referred to as a \"function parameter\".\n",
    "\n",
    "We'll explore later how functions can accept multiple variables, as well as invoke other functions.\n"
   ]
  },
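  {
   "attachments": {},
   "cell_type": "markdown",
   "id": "4e7a9c31",
   "metadata": {},
   "source": [
    "As a hypothetical sketch of a multi-variable template (the `style` variable below is declared in the Joke function's `config.json`, shown next):\n",
    "\n",
    "```\n",
    "WRITE A {{$style}} JOKE ABOUT THE TOPIC BELOW.\n",
    "+++++\n",
    "{{$input}}\n",
    "+++++\n",
    "```\n"
   ]
  },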
  {
   "attachments": {},
   "cell_type": "markdown",
   "id": "c3bd5134",
   "metadata": {},
   "source": [
    "In the same folder you'll notice a second [config.json](../../../prompt_template_samples/FunPlugin/Joke/config.json) file. The file is optional, and is used to set some parameters for large language models like Temperature, TopP, Stop Sequences, etc.\n",
    "\n",
    "```\n",
    "{\n",
    "  \"schema\": 1,\n",
    "  \"description\": \"Generate a funny joke\",\n",
    "  \"execution_settings\": {\n",
    "    \"default\": {\n",
    "      \"max_tokens\": 1000,\n",
    "      \"temperature\": 0.9,\n",
    "      \"top_p\": 0.0,\n",
    "      \"presence_penalty\": 0.0,\n",
    "      \"frequency_penalty\": 0.0\n",
    "    }\n",
    "  },\n",
    "  \"input_variables\": [\n",
    "    {\n",
    "      \"name\": \"input\",\n",
    "      \"description\": \"Joke subject\",\n",
    "      \"default\": \"\"\n",
    "    },\n",
    "    {\n",
    "      \"name\": \"style\",\n",
    "      \"description\": \"Give a hint about the desired joke style\",\n",
    "      \"default\": \"\"\n",
    "    }\n",
    "  ]\n",
    "}\n",
    "\n",
    "```\n"
   ]
  },
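  {
   "attachments": {},
   "cell_type": "markdown",
   "id": "5b2d8f40",
   "metadata": {},
   "source": [
    "The kernel builds the execution settings from the `default` block for you, so there is nothing extra to do. Purely as an illustration of what that block corresponds to in code, here is a sketch assuming the OpenAI connector (other services use their own settings classes):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "6c3e9a51",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Illustrative only: the kernel constructs the equivalent of this object\n",
    "# from config.json when the plugin is loaded. Assumes the OpenAI connector.\n",
    "from semantic_kernel.connectors.ai.open_ai import OpenAIChatPromptExecutionSettings\n",
    "\n",
    "settings = OpenAIChatPromptExecutionSettings(\n",
    "    max_tokens=1000,\n",
    "    temperature=0.9,\n",
    "    top_p=0.0,\n",
    "    presence_penalty=0.0,\n",
    "    frequency_penalty=0.0,\n",
    ")\n",
    "print(settings)"
   ]
  },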
  {
   "attachments": {},
   "cell_type": "markdown",
   "id": "384ff07f",
   "metadata": {},
   "source": [
    "Given a prompt function defined by these files, this is how to load and use a file based prompt function.\n",
    "\n",
    "Load and configure the kernel, as usual, loading also the AI service settings defined in the [Setup notebook](00-getting-started.ipynb):\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "9c0688c5",
   "metadata": {},
   "outputs": [],
   "source": [
    "from semantic_kernel import Kernel\n",
    "\n",
    "kernel = Kernel()"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "63f0788e",
   "metadata": {},
   "source": [
    "We will load our settings and get the LLM service to use for the notebook."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "82d16ce6",
   "metadata": {},
   "outputs": [],
   "source": [
    "from services import Service\n",
    "\n",
    "from samples.service_settings import ServiceSettings\n",
    "\n",
    "service_settings = ServiceSettings.create()\n",
    "\n",
    "# Select a service to use for this notebook (available services: OpenAI, AzureOpenAI, HuggingFace)\n",
    "selectedService = (\n",
    "    Service.AzureOpenAI\n",
    "    if service_settings.global_llm_service is None\n",
    "    else Service(service_settings.global_llm_service.lower())\n",
    ")\n",
    "print(f\"Using service type: {selectedService}\")"
   ]
  },
  {
   "cell_type": "markdown",
   "id": "c50b4d7a",
   "metadata": {},
   "source": [
    "We now configure our Chat Completion service on the kernel."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "b0062a24",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Remove all services so that this cell can be re-run without restarting the kernel\n",
    "kernel.remove_all_services()\n",
    "\n",
    "service_id = None\n",
    "if selectedService == Service.OpenAI:\n",
    "    from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion\n",
    "\n",
    "    service_id = \"default\"\n",
    "    kernel.add_service(\n",
    "        OpenAIChatCompletion(\n",
    "            service_id=service_id,\n",
    "        ),\n",
    "    )\n",
    "elif selectedService == Service.AzureOpenAI:\n",
    "    from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion\n",
    "\n",
    "    service_id = \"default\"\n",
    "    kernel.add_service(\n",
    "        AzureChatCompletion(\n",
    "            service_id=service_id,\n",
    "        ),\n",
    "    )"
   ]
  },
  {
   "attachments": {},
   "cell_type": "markdown",
   "id": "fd5ff1f4",
   "metadata": {},
   "source": [
    "Import the plugin and all its functions:\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "56ee184d",
   "metadata": {},
   "outputs": [],
   "source": [
    "# note: using plugins from the samples folder\n",
    "plugins_directory = \"../../../prompt_template_samples/\"\n",
    "\n",
    "funFunctions = kernel.add_plugin(parent_directory=plugins_directory, plugin_name=\"FunPlugin\")\n",
    "\n",
    "jokeFunction = funFunctions[\"Joke\"]"
   ]
  },
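  {
   "attachments": {},
   "cell_type": "markdown",
   "id": "7d4f0b62",
   "metadata": {},
   "source": [
    "`add_plugin` returns the whole plugin, so every function found under the `FunPlugin` folder is available, not just `Joke`. To see what was loaded, you can inspect the plugin's function map (a sketch assuming the returned `KernelPlugin` exposes a `functions` dictionary, as recent SDK versions do):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "8e5a1c73",
   "metadata": {},
   "outputs": [],
   "source": [
    "# List the functions loaded from the FunPlugin folder.\n",
    "# Assumes KernelPlugin exposes a `functions` dict, as in recent SDK versions.\n",
    "for name, function in funFunctions.functions.items():\n",
    "    print(f\"{name}: {function.description}\")"
   ]
  },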
  {
   "attachments": {},
   "cell_type": "markdown",
   "id": "edd99fa0",
   "metadata": {},
   "source": [
    "How to use the plugin functions, e.g. generate a joke about \"_time travel to dinosaur age_\":\n"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "6effe63b",
   "metadata": {},
   "outputs": [],
   "source": [
    "result = await kernel.invoke(jokeFunction, input=\"travel to dinosaur age\", style=\"silly\")\n",
    "print(result)"
   ]
  },
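  {
   "attachments": {},
   "cell_type": "markdown",
   "id": "9f6b2d84",
   "metadata": {},
   "source": [
    "The keyword arguments above are a convenience; you can pass the same values through an explicit `KernelArguments` object instead, which is handy when you want to build the arguments up front or reuse them across calls (a sketch using the same `input` and `style` parameters):"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "id": "0a7c3e95",
   "metadata": {},
   "outputs": [],
   "source": [
    "# Equivalent invocation using an explicit KernelArguments object.\n",
    "from semantic_kernel.functions import KernelArguments\n",
    "\n",
    "arguments = KernelArguments(input=\"travel to dinosaur age\", style=\"super silly\")\n",
    "result = await kernel.invoke(jokeFunction, arguments)\n",
    "print(result)"
   ]
  },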
  {
   "attachments": {},
   "cell_type": "markdown",
   "id": "2281a1fc",
   "metadata": {},
   "source": [
    "Great, now that you know how to load a plugin from disk, let's show how you can [create and run a prompt function inline.](./03-prompt-function-inline.ipynb)\n"
   ]
  }
 ],
 "metadata": {
  "kernelspec": {
   "display_name": "Python 3 (ipykernel)",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.12.3"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 5
}