This workshop shows how to build a web app using Python and Flask, running on Microsoft Azure, that takes images from a desktop app, analyses them for faces using AI, and stores information about the age and emotion of each face in the image, along with whether anyone is smiling. The web app also serves up a simple page showing the data captured.
This workshop is designed for students, and can be run using the free services available as part of the Azure for Students offer.
To complete this workshop, you will need:
- An Azure account. Sign up for free using Azure for Students, or the Azure free account if you are not at an academic institution.
- Python. If you are using Windows, you will need to restart your machine after installing Python to ensure you can use Python and Pip from the command line.
- The Python Extension for Visual Studio Code. This can be installed from inside VS Code using the Extensions tab.
- The Azure App Service Extension for Visual Studio Code. This can be installed from inside VS Code using the Extensions tab.
- The Azure Cosmos DB Extension for Visual Studio Code. This can be installed from inside VS Code using the Extensions tab. You will need to restart Visual Studio Code before using this extension.
- A laptop with a webcam, or an external camera.
This workshop works on macOS and Windows.
The attendee detector is a Python app. It uses a library called OpenCV, which is accessible from Python, to access the webcam and take a picture. Once a picture has been taken, the app sends it to a Web API built in Python using a framework called Flask and running in the cloud. A Web API is a special type of web address that doesn't return a web page, but instead allows you to send and receive data.
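To give a feel for this flow before you build it, here is a minimal sketch of the desktop side: capturing one frame with OpenCV and posting it to the Web API. The URL and the image route are placeholders for the endpoint you will create later in this workshop.

```python
import cv2
import requests

# Open the default webcam (device 0) and grab a single frame
camera = cv2.VideoCapture(0)
success, frame = camera.read()
camera.release()

if success:
    # Encode the frame as a JPEG in memory
    encoded, buffer = cv2.imencode('.jpg', frame)
    if encoded:
        # Send the image bytes to the Web API (placeholder URL and route)
        response = requests.post('https://<your-web-app>.azurewebsites.net/image',
                                 files={'image': buffer.tobytes()})
        print(response.status_code)
```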
The Web API will take the picture and analyse it using the Azure Face API from Azure Cognitive Services. This is an AI service that can recognize faces in images, as well as estimate the age of each face and detect whether the person is smiling, amongst other things. The Web API will use this service to detect the age and emotion of all the faces, and whether they are smiling. These details will then be saved into a database called Cosmos DB. This is a document database - instead of storing data in rows and columns in tables, it stores data as documents. These documents contain key/value pairs of data stored in a format called JSON.
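As a rough sketch of this step, the Web API could call the Face API over REST and shape one JSON document per face, ready to be saved to Cosmos DB. The endpoint, key and field names below are illustrative assumptions, not the exact code you will write later in the workshop.

```python
import requests

# Placeholder endpoint and key - use the values from your own Face API resource
FACE_API_ENDPOINT = 'https://<your-region>.api.cognitive.microsoft.com/face/v1.0/detect'
FACE_API_KEY = '<your-face-api-key>'

def analyse_image(image_bytes):
    # Ask the Face API for the age, emotion and smile attributes of each face
    response = requests.post(
        FACE_API_ENDPOINT,
        params={'returnFaceAttributes': 'age,emotion,smile'},
        headers={'Ocp-Apim-Subscription-Key': FACE_API_KEY,
                 'Content-Type': 'application/octet-stream'},
        data=image_bytes)
    faces = response.json()

    # Build one document per face - these are the key/value pairs that
    # would be stored in Cosmos DB as JSON
    documents = []
    for face in faces:
        attributes = face['faceAttributes']
        documents.append({
            'age': attributes['age'],
            'smile': attributes['smile'],
            # Pick the emotion with the highest confidence score
            'emotion': max(attributes['emotion'], key=attributes['emotion'].get)
        })
    return documents
```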
A web site will then be added to the Web API app. This site will load all the data from Cosmos DB and show it in a simple HTML table, listing the details of every face captured.
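The results page could look something like the sketch below: a Flask route that loads the saved documents and passes them to an HTML template. The get_faces helper and the home.html template are placeholders for the code and page you will build later.

```python
from flask import Flask, render_template

app = Flask(__name__)

@app.route('/')
def home():
    # Load every saved face document from Cosmos DB (placeholder helper)
    faces = get_faces()
    # Render an HTML template that shows one table row per face
    return render_template('home.html', faces=faces)
```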
All code in this sample is shown in code blocks like this one:
```python
print('Here is some code')
```
Ellipses will be used to indicate other code that has been removed, to make the new code or the code being discussed easier to see. For example, a code block like this:
```python
def func():
    ...
    print('end of the suite')
```
means that the print('end of the suite') call needs to go inside the func function, after all the existing code in that function.
The steps for this workshop are:
- Build an app to take a photo
- Create a Flask Web App
- Deploy the Web App to Azure
- Add a Web API to accept a photo
- Analyse the photo using AI
- Save the face details to a database
- Call the Web API from the app
- Add a web page to view the results
- Clean up
For reference, you can find the final code for this workshop in the Code folder.
This workshop uses resources that are available in the Azure for Students account as free services. As there are limits on the number of free services you can create, you may want to delete the resources created once you are done. The instructions to do this are in the last step - Clean up.