This is the starter code for a workshop built for students who want to learn how large language models are integrated into applications programmatically. It uses the OpenAI SDK for Python and GPT-3.5. The workshop and manuals are intended for participants of all programming skill levels, from no experience to advanced, though some Python experience will make it more likely that participants complete the exercises without heavy guidance.
This repo contains the starter code only; it does not include completed solutions for the exercises.
There are two manuals included in this repo. They are intended to be distributed to the participants of the workshop.
- setup-manual.pdf - For the participants to set up their environment locally or on Codespaces.
- workshop-manual.pdf - For the participants to follow along with the workshop.
Keep in mind the current manuals are intended for use with Azure OpenAI, but can be easily modified to work with OpenAI directly.
There are four exercises in this lab, each a Python API endpoint that students have to complete (a rough sketch of what a completed exercise might look like follows the list below). A deployed version of the workshop can be found at https://gray-beach-052df4810.3.azurestaticapps.net. Keep in mind that the deployed version has the completed code, so all exercises are available and functional.
- (POST /exercise1) Send a single message to GPT
- (POST /exercise2) Maintain a conversation history with GPT
- (POST /exercise3) Send a system prompt to GPT before a conversation
- (GET /exercise4) Develop a prompt to get GPT to translate maze codes into readable instructions
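For reference, a possible solution to exercise 1 might look roughly like the sketch below. This is not the bundled solution; it assumes the pre-1.0 `openai` Python SDK and omits the web-framework wiring that exposes it as the POST /exercise1 endpoint.

```python
# Rough sketch of exercise 1 (illustrative only, not the official solution).
# Assumes the pre-1.0 openai Python SDK, configured via the environment
# variables described later in this README.
import os
import openai

openai.api_key = os.environ["OPENAI_API_KEY"]
openai.api_base = os.environ["OPENAI_API_BASE"]

def send_single_message(message: str) -> str:
    """Send one user message to GPT-3.5 and return the model's reply."""
    response = openai.ChatCompletion.create(
        model="gpt-3.5-turbo",  # on Azure OpenAI, pass engine=<deployment name> instead
        messages=[{"role": "user", "content": message}],
    )
    return response["choices"][0]["message"]["content"]
```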
Every time an exercise page is opened, students will see a test area in the center, a Run tests button on the bottom right, and a Back button on the top left.
- Back button: takes students back to the home page.
- Test area: lets students manually try out their Python code by interacting with it.
- Run tests button: once students are ready to verify their code, they click Run tests. The button lights up green if the backend implementation is correct and the exercise is complete; otherwise, students can retry as many times as they'd like.
Once students successfully complete an exercise, they can go back to the home page and start the next one whenever they’re ready.
If you are a workshop organizer and want to apply this starter code to your own event, you will need to create your own OpenAI resources and encrypt their keys into .env<N>.gpg files using the GPG command-line tool. You will need to fork this repo and update the /env folder with your own encrypted .env files.
The Python starter code expects three environment variables to be set:
- OPENAI_API_KEY: Your OpenAI API key
- OPENAI_API_BASE: The base URL for the OpenAI API
- OPENAI_API_TYPE (optional): The source of the OpenAI API (e.g. "Azure")
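A .env file for the starter code might therefore look like the following. The values are placeholders, and the endpoint shown uses the Azure OpenAI URL format:

```
OPENAI_API_KEY=<your-api-key>
OPENAI_API_BASE=https://<your-resource-name>.openai.azure.com/
OPENAI_API_TYPE=Azure
```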
The following command encrypts your .env file into a .env${N}.gpg file. You will need to share the passphrase with your students as part of the startup commands in the setup manual (see setup-manual.pdf).
gpg --output .env${N}.gpg --cipher-algo AES256 --symmetric .env
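Here ${N} is the number of the env file (1, 2, 3, ...), so with N=1 the command produces .env1.gpg; gpg will prompt you for the passphrase to use unless you pass one on the command line.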
If there are many students in your workshop, it is highly recommended to use multiple OpenAI endpoints to avoid rate limiting. You can do this by:
- On Azure: creating multiple Azure OpenAI resources across multiple Azure subscriptions, regions, and model deployments. The /azure-deploy folder has a shell script and Bicep template for automating this.
- On OpenAI: creating multiple OpenAI accounts, or by creating multiple organizations within the same account.
You can then create multiple .env files and encrypt them into .env1.gpg, .env2.gpg, etc. inside of the /env folder. The setup manual should then give each student the command that decrypts the right env file for them.
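The exact decryption step is defined in setup-manual.pdf, but for reference, a symmetric decryption of one of these files typically looks like the following (the paths and variable names here are illustrative):

```
# Decrypt env file number $N with passphrase $P into a local .env file.
# On GnuPG 2.1+ you may also need --pinentry-mode loopback.
gpg --batch --passphrase "$P" --output .env --decrypt env/.env${N}.gpg
```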
If there are any issues with or questions about the setup, please reach out to Ralph Rouhana (ralph.rouhana@gmail.com).
The setup manual contains more detailed instructions for participants to run this code on Codespaces (easier) and locally (more involved).
As a quick overview, to be able to run this app locally, you will need both Docker and Git installed on your machine.
- Docker: link
- Git: If not preinstalled, running
git --version
in the command line will guide you through installing it.
- Clone the app: Run
git clone https://github.com/ralphr123/bam-workshop && cd bam-workshop
- Run the app: In your project directory root, run
export P=<password> N=<integer> && docker-compose up
- Stop the app: CTRL + C in the terminal window where the app is running
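Here `<password>` is most likely the GPG passphrase used to encrypt the .env files and `<integer>` the number of the .env${N}.gpg file to decrypt; see setup-manual.pdf for the exact values to hand out. For example, with placeholder values:

```
export P=my-workshop-passphrase N=1 && docker-compose up
```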