This project proposes the Smart Engagement Room, a novel classroom environment that leverages Ambient Intelligence (AmI) to create a more dynamic and interactive learning experience.
The Smart Engagement Room addresses limitations of traditional classrooms by:
- Automated Attendance: Replacing manual attendance checks with a real-time computer vision system.
- Dynamic Lighting: Adjusting lighting based on student location for energy efficiency and learning environment optimization.
- Posture & Gesture Detection: Utilizing AmI to detect raised hands and student posture, allowing instructors to identify student inquiries and tailor their teaching approach.
By integrating AmI, the Smart Engagement Room fosters a more interactive and engaging learning environment for both students and instructors.
The Website can be accessed here: Smart Engagement Room.
To authenticate on the website, you can use the following credentials:
Username: grupo6
Password: ami
It is composed of 4 webpages:
- Home | Dashboard - Page with the statistics to be shown to the professor after a lecture.
- Attendance - Page with the attendances and absences of students.
- Classroom Layout - Page with the representation of the classroom layout, with reactive lights and hand gesture detection.
- Admin - This page is used for development and presentation purposes. In a real-world deployment it should not be visible.
The website is hosted online on the Vercel platform. It communicates with a database hosted in a MongoDB Atlas cluster, which stores the statistics shown on the website.
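As a rough illustration only, the sketch below shows how a statistics document could be written to and read from such a cluster using pymongo; the connection string, database, collection, and field names are placeholders, not the project's actual schema.

```python
from pymongo import MongoClient

# Hypothetical connection string and names; the real values live in the
# deployed website's configuration, not in this repository.
ATLAS_URI = "mongodb+srv://<USER>:<PASSWORD>@<CLUSTER>.mongodb.net/"

client = MongoClient(ATLAS_URI)
stats = client["smart_engagement_room"]["lecture_stats"]

# Example write: one statistics document per lecture (illustrative fields only).
stats.insert_one({
    "lecture": "example-lecture",
    "attendance": 18,
    "raised_hands": 7,
})

# Example read: what the dashboard page might fetch after a lecture.
print(stats.find_one({"lecture": "example-lecture"}))
```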
To run the website on your local machine, you should first install Node.js and npm.
To run the website:

```bash
npm run start
```

The database connection is already set up; no intervention is needed.
The computer vision repository handles the gRPC connection between the Raspberry Pi and the computer that processes and calculates all the data from the frames sent by the Raspberry Pi.
The Raspberry Pi acts as the gRPC client (ambint_client.py): it captures images from the camera and sends them to the computer, which runs the gRPC server (ambint_server.py).
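As a rough illustration of that flow, here is a minimal sketch of the client side, assuming a hypothetical FrameService with a SendFrame RPC; the generated modules (ambint_pb2, ambint_pb2_grpc) and message fields are placeholders, since the actual definitions come from the repository's .proto file.

```python
# Minimal sketch of the Raspberry Pi side (client). All service, message,
# and module names below are assumptions, not the repository's real API.
import cv2
import grpc
import ambint_pb2
import ambint_pb2_grpc

def run(server_address="192.168.1.50:50051"):  # placeholder address of the computer
    channel = grpc.insecure_channel(server_address)
    stub = ambint_pb2_grpc.FrameServiceStub(channel)
    cap = cv2.VideoCapture(0)  # webcam connected to the Raspberry Pi
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        _, jpeg = cv2.imencode(".jpg", frame)  # compress the frame before sending
        stub.SendFrame(ambint_pb2.Frame(data=jpeg.tobytes()))

if __name__ == "__main__":
    run()
```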
- Reinstall and install packages:

  ```bash
  pip uninstall requests_toolbelt urllib3
  pip install -r requirements.txt
  ```

- Install the head and hand models following the instructions in the `models` folder.
- Install and start CompreFace in Docker locally (follow the tutorial available in their GitHub repo).
- When CompreFace is running, you will need to create an account, create a project, create the subjects, and paste the API key inside `ambint_compreface.py` (see the sketch after this list).
- Initialize the gRPC server:

  ```bash
  python3 ambint_server.py
  ```

- To switch between student identification and head and hand detection, hold the `s` key until the grid appears.
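For context on how that API key is typically used, here is a rough sketch of a recognition request against a locally running CompreFace instance. The port, endpoint, and response fields follow CompreFace's public REST documentation; the actual logic inside ambint_compreface.py may differ.

```python
import requests

# Placeholder key: paste the recognition-service API key created in the CompreFace UI.
API_KEY = "<COMPREFACE_API_KEY>"
RECOGNIZE_URL = "http://localhost:8000/api/v1/recognition/recognize"  # default local port

def identify(image_path):
    """Send one frame to CompreFace and return the best-matching subject, if any."""
    with open(image_path, "rb") as image:
        response = requests.post(
            RECOGNIZE_URL,
            headers={"x-api-key": API_KEY},
            files={"file": image},
        )
    response.raise_for_status()
    results = response.json().get("result", [])
    if results and results[0].get("subjects"):
        return results[0]["subjects"][0]["subject"]  # e.g. a student's name
    return None

print(identify("frame.jpg"))
```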
- Create a virtual Python environment to install the necessary libraries:

  ```bash
  python -m venv ambint-env
  ```

- Source the created environment:

  ```bash
  source ambint-env/bin/activate
  ```

- Install the necessary libraries via pip (`opencv-python` and `grpcio-tools`):

  ```bash
  pip install opencv-python grpcio-tools
  ```

- Initialize the gRPC client (make sure that you have the webcam connected to the Raspberry Pi):

  ```bash
  python3 ambint_client.py
  ```
Before you begin:
- Install the required libraries on your Raspberry Pi:

  ```bash
  sudo apt install mosquitto mosquitto-clients
  ```

Test the installation by running the following command:

```bash
mosquitto -v
```

Then you should edit the Mosquitto configuration file:

```bash
sudo nano /etc/mosquitto/mosquitto.conf
```

Add the following lines:

```
allow_anonymous true
listener 1883
```

Then restart Mosquitto for the changes to take effect:

```bash
sudo systemctl restart mosquitto
```

On your Arduino, make sure you have the following libraries installed:
- WiFi.h
- PubSubClient.h
Board compatibility:
- Arduino Nano ESP32
The Arduino and the other components (RCWL-0516 sensor and wires) should be assembled as shown in the image below:
Then, before changing the code for the MQTT communication, you should check the IP address of the Raspberry Pi. For that, use the following command:

```bash
hostname -I
```

Then you should make the following changes in the code:
- In the Arduino code (arduino-mqtt.ino):

  ```cpp
  const char* ssid = "<YOUR_WIFI_SSID>";
  const char* password = "<YOUR_WIFI_PASSWORD>";
  const char* mqtt_server = "<RASPBERRYPI_IP>";
  ```

- In the Raspberry Pi code (subscriber.py); a minimal sketch of this script is shown after the test steps below:

  ```python
  broker = '<RASPBERRYPI_IP>'
  ```

After these steps, you can test the communication:
- Power on the Arduino and upload the code.
- Run the Python script on the Raspberry Pi.
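For orientation, here is a minimal sketch of what subscriber.py could look like, assuming the paho-mqtt library and a hypothetical topic name; the actual topic and message handling are defined in the repository.

```python
# Sketch only: written against paho-mqtt 1.x (paho-mqtt 2.x additionally
# requires a CallbackAPIVersion argument when constructing Client()).
import paho.mqtt.client as mqtt

broker = "<RASPBERRYPI_IP>"   # the Raspberry Pi's IP from `hostname -I`
topic = "ambint/motion"       # hypothetical topic; use the one defined in the repo

def on_connect(client, userdata, flags, rc):
    print("Connected to broker with result code", rc)
    client.subscribe(topic)

def on_message(client, userdata, msg):
    # The Arduino publishes RCWL-0516 readings; here we simply print them.
    print(msg.topic, msg.payload.decode())

client = mqtt.Client()
client.on_connect = on_connect
client.on_message = on_message
client.connect(broker, 1883)  # port 1883, as configured in mosquitto.conf
client.loop_forever()
```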
Additional Notes:
- Replace placeholder values like <YOUR_WIFI_SSID> and <RASPBERRYPI_IP> with your actual credentials and IP address.
