Python: Rename skills to plugins. Update prompt template config to use execution settings to align with dotnet. (microsoft#4595)

### Motivation and Context

To better align with SK dotnet v1, Python skills have been renamed to
plugins. This removes the root samples skills directory, as dotnet
already provides the plugins and their configs. This is the beginning of
more work to consolidate other files and names, per the backlog. I've run
several case-insensitive greps for files or file names containing "skill",
and no results remain. Please flag anything you find that still includes
"skill."

Fixes microsoft#3319 

### Description

This is a large, breaking-change PR, as replacing "skills" with
"plugins" touches many files. It was difficult to break this up into
smaller chunks without leaving the SK Python repo in a bad state for an
extended period. Other updates include:

- The prompt template config now uses `execution_settings`, aligning with
dotnet, in place of the `completion` section that Python previously used
(see the sketch after this list).
- AzureOpenAI/OpenAI function calling has been updated to match the
current tools/tool_choice API. Message parsing in the open_ai utils
respects this: it now looks for `tool_calls` and configures the
function_call with the id (new), name, and arguments (also sketched below).
- File renames: files whose names previously contained "skill" will show
up as new code in the diff, but the code is unchanged apart from "plugin"
in the name.
- Kernel examples and notebooks have been tested and are working. Unit
tests and integration tests are passing.
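
As a minimal sketch of the `completion` → `execution_settings` change, here is the approximate before/after shape of a prompt template config, written as Python dicts rather than the actual `config.json` files. The field names and values are illustrative assumptions, not copied from this PR:

```python
# Illustrative only: approximate before/after shape of a prompt template config.
# Exact fields and values are assumptions for illustration.

old_style_config = {
    "schema": 1,
    "type": "completion",
    "description": "Generate a funny joke",
    "completion": {  # previous Python-specific section
        "max_tokens": 500,
        "temperature": 0.9,
    },
}

new_style_config = {
    "schema": 1,
    "type": "completion",
    "description": "Generate a funny joke",
    "execution_settings": {  # aligned with dotnet; settings keyed per service
        "default": {
            "max_tokens": 500,
            "temperature": 0.9,
        },
    },
}
```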
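
And a hedged sketch of the `tool_calls` parsing described above. The helper and the returned structure are hypothetical and only illustrate pulling the id, name, and arguments out of an OpenAI-style `tool_calls` message; they are not the actual open_ai utils API:

```python
from typing import Any, Dict, List


def parse_tool_calls(message: Dict[str, Any]) -> List[Dict[str, Any]]:
    """Collect id, function name, and arguments from an OpenAI-style chat message."""
    parsed = []
    for tool_call in message.get("tool_calls") or []:
        function = tool_call.get("function", {})
        parsed.append(
            {
                "id": tool_call.get("id"),               # new: tool call id
                "name": function.get("name"),            # e.g. a plugin-function name
                "arguments": function.get("arguments"),  # JSON string of arguments
            }
        )
    return parsed


# Example OpenAI-style assistant message (hypothetical):
# {"role": "assistant", "tool_calls": [{"id": "call_1", "type": "function",
#   "function": {"name": "FunPlugin-Joke", "arguments": "{\"input\": \"time travel\"}"}}]}
```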

### Contribution Checklist

- [X] The code builds clean without any errors or warnings
- [X] The PR follows the [SK Contribution
Guidelines](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md)
and the [pre-submission formatting
script](https://github.com/microsoft/semantic-kernel/blob/main/CONTRIBUTING.md#development-scripts)
raises no violations
- [X] All unit tests pass, and I have added new tests where possible
- [X] I didn't break anyone 😄

---------

Co-authored-by: Evan Mattson <evan.mattson@microsoft.com>
moonbox3 authored Jan 19, 2024
1 parent f357b45 commit 674af54
Showing 153 changed files with 2,579 additions and 2,452 deletions.
2 changes: 1 addition & 1 deletion python/README.md
@@ -98,7 +98,7 @@ Python notebooks:
- [Using Context Variables to Build a Chat Experience](./notebooks/04-context-variables-chat.ipynb)
- [Introduction to planners](./notebooks/05-using-the-planner.ipynb)
- [Building Memory with Embeddings](./notebooks/06-memory-and-embeddings.ipynb)
- [Using Hugging Face for Skills](./notebooks/07-hugging-face-for-skills.ipynb)
- [Using Hugging Face for Plugins](./notebooks/07-hugging-face-for-plugins.ipynb)
- [Combining native functions and semantic functions](./notebooks/08-native-function-inline.ipynb)
- [Groundedness Checking with Semantic Kernel](./notebooks/09-groundedness-checking.ipynb)
- [Returning multiple results per prompt](./notebooks/10-multiple-results-per-prompt.ipynb)
12 changes: 7 additions & 5 deletions python/notebooks/00-getting-started.ipynb
@@ -58,7 +58,8 @@
"api_key, org_id = sk.openai_settings_from_dot_env()\n",
"\n",
"kernel.add_chat_service(\n",
" \"chat-gpt\", OpenAIChatCompletion(ai_model_id=\"gpt-3.5-turbo-1106\", api_key=api_key, org_id=org_id)\n",
" \"chat-gpt\",\n",
" OpenAIChatCompletion(ai_model_id=\"gpt-3.5-turbo-1106\", api_key=api_key, org_id=org_id),\n",
")"
]
},
@@ -91,7 +92,8 @@
"deployment, api_key, endpoint = sk.azure_openai_settings_from_dot_env()\n",
"\n",
"kernel.add_chat_service(\n",
" \"chat_completion\", AzureChatCompletion(deployment_name=deployment, endpoint=endpoint, api_key=api_key)\n",
" \"chat_completion\",\n",
" AzureChatCompletion(deployment_name=deployment, endpoint=endpoint, api_key=api_key),\n",
")"
]
},
@@ -102,7 +104,7 @@
"source": [
"# Run a Semantic Function\n",
"\n",
"**Step 3**: Load a Skill and run a semantic function:"
"**Step 3**: Load a Plugin and run a semantic function:"
]
},
{
@@ -111,8 +113,8 @@
"metadata": {},
"outputs": [],
"source": [
"skill = kernel.import_semantic_skill_from_directory(\"../../samples/skills\", \"FunSkill\")\n",
"joke_function = skill[\"Joke\"]\n",
"plugin = kernel.import_semantic_plugin_from_directory(\"../../samples/plugins\", \"FunPlugin\")\n",
"joke_function = plugin[\"Joke\"]\n",
"\n",
"print(joke_function(\"time travel to dinosaur age\"))"
]
31 changes: 13 additions & 18 deletions python/notebooks/02-running-prompts-from-file.ipynb
@@ -6,16 +6,16 @@
"id": "692e361b",
"metadata": {},
"source": [
"# How to run a semantic skills from file\n",
"Now that you're familiar with Kernel basics, let's see how the kernel allows you to run Semantic Skills and Semantic Functions stored on disk. \n",
"# How to run a semantic plugins from file\n",
"Now that you're familiar with Kernel basics, let's see how the kernel allows you to run Semantic Plugins and Semantic Functions stored on disk. \n",
"\n",
"A Semantic Skill is a collection of Semantic Functions, where each function is defined with natural language that can be provided with a text file. \n",
"A Semantic Plugin is a collection of Semantic Functions, where each function is defined with natural language that can be provided with a text file. \n",
"\n",
"Refer to our [glossary](https://github.com/microsoft/semantic-kernel/blob/main/docs/GLOSSARY.md) for an in-depth guide to the terms.\n",
"\n",
"The repository includes some examples under the [samples](https://github.com/microsoft/semantic-kernel/tree/main/samples) folder.\n",
"\n",
"For instance, [this](../../skills/FunSkill/Joke/skprompt.txt) is the **Joke function** part of the **FunSkill skill**:"
"For instance, [this](../../plugins/FunPlugin/Joke/skprompt.txt) is the **Joke function** part of the **FunPlugin plugin**:"
]
},
{
@@ -55,7 +55,7 @@
"metadata": {},
"source": [
"\n",
"In the same folder you'll notice a second [config.json](../../skills/FunSkill/Joke/config.json) file. The file is optional, and is used to set some parameters for large language models like Temperature, TopP, Stop Sequences, etc.\n",
"In the same folder you'll notice a second [config.json](../../plugins/FunPlugin/Joke/config.json) file. The file is optional, and is used to set some parameters for large language models like Temperature, TopP, Stop Sequences, etc.\n",
"\n",
"```\n",
"{\n",
@@ -128,7 +128,7 @@
"id": "fd5ff1f4",
"metadata": {},
"source": [
"Import the skill and all its functions:"
"Import the plugin and all its functions:"
]
},
{
@@ -138,10 +138,10 @@
"metadata": {},
"outputs": [],
"source": [
"# note: using skills from the samples folder\n",
"skills_directory = \"../../samples/skills\"\n",
"# note: using plugins from the samples folder\n",
"plugins_directory = \"../../samples/plugins\"\n",
"\n",
"funFunctions = kernel.import_semantic_skill_from_directory(skills_directory, \"FunSkill\")\n",
"funFunctions = kernel.import_semantic_plugin_from_directory(plugins_directory, \"FunPlugin\")\n",
"\n",
"jokeFunction = funFunctions[\"Joke\"]"
]
@@ -152,7 +152,7 @@
"id": "edd99fa0",
"metadata": {},
"source": [
"How to use the skill functions, e.g. generate a joke about \"*time travel to dinosaur age*\":"
"How to use the plugin functions, e.g. generate a joke about \"*time travel to dinosaur age*\":"
]
},
{
@@ -162,13 +162,8 @@
"metadata": {},
"outputs": [],
"source": [
"result = jokeFunction(\"time travel to dinosaur age\")\n",
"\n",
"print(result)\n",
"\n",
"# You can also invoke functions asynchronously\n",
"# result = await jokeFunction.invoke_async(\"time travel to dinosaur age\")\n",
"# print(result)"
"result = await jokeFunction.invoke_async(\"time travel to dinosaur age\")\n",
"print(result)"
]
},
{
@@ -177,7 +172,7 @@
"id": "2281a1fc",
"metadata": {},
"source": [
"Great, now that you know how to load a skill from disk, let's show how you can [create and run a semantic function inline.](./03-semantic-function-inline.ipynb)"
"Great, now that you know how to load a plugin from disk, let's show how you can [create and run a semantic function inline.](./03-semantic-function-inline.ipynb)"
]
}
],
8 changes: 4 additions & 4 deletions python/notebooks/03-semantic-function-inline.ipynb
@@ -84,7 +84,7 @@
" kernel.add_text_completion_service(\"dv\", azure_text_service)\n",
"else:\n",
" api_key, org_id = sk.openai_settings_from_dot_env()\n",
" oai_text_service = OpenAITextCompletion(ai_model_id=\"text-davinci-003\", api_key=api_key, org_id=org_id)\n",
" oai_text_service = OpenAITextCompletion(ai_model_id=\"gpt-3.5-turbo-instruct\", api_key=api_key, org_id=org_id)\n",
" kernel.add_text_completion_service(\"dv\", oai_text_service)"
]
},
@@ -170,7 +170,7 @@
"id": "1c2c1262",
"metadata": {},
"source": [
"# Using ChatCompletion for Semantic Skills"
"# Using ChatCompletion for Semantic Plugins"
]
},
{
@@ -179,7 +179,7 @@
"id": "29b59b28",
"metadata": {},
"source": [
"You can also use chat completion models (like `gpt-35-turbo` and `gpt4`) for creating skills. Normally you would have to tweak the API to accommodate for a system and user role, but SK abstracts that away for you by using `kernel.add_chat_service` and `AzureChatCompletion` or `OpenAIChatCompletion`"
"You can also use chat completion models (like `gpt-35-turbo` and `gpt4`) for creating plugins. Normally you would have to tweak the API to accommodate for a system and user role, but SK abstracts that away for you by using `kernel.add_chat_service` and `AzureChatCompletion` or `OpenAIChatCompletion`"
]
},
{
@@ -271,7 +271,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.6"
"version": "3.10.12"
}
},
"nbformat": 4,
2 changes: 1 addition & 1 deletion python/notebooks/04-context-variables-chat.ipynb
@@ -279,7 +279,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.6"
"version": "3.10.12"
}
},
"nbformat": 4,
