In this lab, you’ll learn how to use Azure Prompt Flow to streamline the prompt engineering workflow and speed up building intelligent apps with LLMs.
By the end of the workshop, you will know how to:
- Build a chat flow that takes input and produces output while maintaining a dialog history (see the chat-history sketch after this list).
- Take custom data (in a CSV file) and convert it into tokenized embeddings with a vector index.
- Use the LLM tool to create prompts and generate responses.
- Use the Embedding tool with a trained embedding model to search the vector index (see the similarity-search sketch after this list).
- Use the Python tool to write custom functions that preprocess data or call an API (see the Python tool sketch after this list).
- Use the Prompt tool to format the output response.
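A chat flow keeps the conversation going by folding the dialog history into each new prompt. The sketch below assumes the default Prompt Flow chat-flow shape, where `chat_history` is a list of turns with `inputs`/`outputs` dictionaries; the function name `build_prompt` and the field names `question`/`answer` are illustrative and should match your flow's actual input and output names.

```python
# Sketch: flatten a Prompt Flow-style chat history plus the new question into one prompt.
# The field names ("question", "answer") are assumptions; align them with your flow.
from typing import Dict, List


def build_prompt(question: str, chat_history: List[Dict]) -> str:
    lines = ["system: You are a helpful assistant."]
    for turn in chat_history:
        lines.append(f"user: {turn['inputs']['question']}")
        lines.append(f"assistant: {turn['outputs']['answer']}")
    lines.append(f"user: {question}")
    return "\n".join(lines)


if __name__ == "__main__":
    history = [{"inputs": {"question": "Hi"}, "outputs": {"answer": "Hello! How can I help?"}}]
    print(build_prompt("What does Prompt Flow do?", history))
```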
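The embedding and vector-index steps can be pictured with a small similarity-search sketch. This is not the Prompt Flow vector index tool itself: `fake_embed` is a hypothetical stand-in for a real embedding model (for example, an Azure OpenAI embeddings deployment), and in the lab the rows would come from your CSV file rather than a hard-coded list.

```python
# Sketch: embed text rows, then rank them by cosine similarity against an embedded query.
# fake_embed is a toy placeholder that maps text to a pseudo-random unit vector.
from typing import List, Tuple

import numpy as np


def fake_embed(text: str, dim: int = 64) -> np.ndarray:
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    vec = rng.normal(size=dim)
    return vec / np.linalg.norm(vec)  # unit-length vector


def search(query: str, rows: List[str], top_k: int = 2) -> List[Tuple[float, str]]:
    index = np.stack([fake_embed(row) for row in rows])  # one vector per row
    scores = index @ fake_embed(query)                   # cosine similarity (unit vectors)
    best = np.argsort(scores)[::-1][:top_k]
    return [(float(scores[i]), rows[i]) for i in best]


if __name__ == "__main__":
    rows = [
        "Prompt Flow streamlines prompt engineering.",
        "Embeddings map text to vectors for semantic search.",
        "An Azure subscription is required for this lab.",
    ]
    for score, row in search("How does semantic search work?", rows):
        print(f"{score:.3f}  {row}")
```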
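Custom Python nodes in Prompt Flow are plain functions marked with the `@tool` decorator. Below is a minimal sketch assuming the `promptflow` package is installed; the function name and the cleanup it performs are illustrative.

```python
# Sketch of a custom Python tool node, assuming the promptflow SDK (pip install promptflow).
# The preprocessing shown (whitespace normalization) is just an example.
from promptflow import tool


@tool
def preprocess_question(question: str) -> str:
    """Normalize the user's question before it reaches the LLM node."""
    return " ".join(question.split())
```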
To complete the lab, you will need:
- A Python environment (3.8 or higher)
- Familiarity with Jupyter Notebooks
- An Azure subscription
- A GitHub account with access to GitHub Codespaces
Click here to go to the step-by-step tutorial.