===================================================================================
First check the most recent information (for example on GitHub) before you install Ultralytics,
because a crypto miner was injected into versions 8.3.41, 8.3.42, 8.3.45, and 8.3.46.
===================================================================================
- Use your custom dataset in Google Colab to re-train a pre-trained YOLOv8 model (free).
- Optimize and compile it on Google Cloud Platform (GCP) with Hailo Dataflow Compiler and Model Zoo (for good optimization use a pay-as-you-go GPU instance).
- Deploy your custom YOLOv8 model on a Raspberry Pi 5 equipped with the Hailo-8L AI kit.
- Familiarity with Python and basic object detection concepts.
- A Google account for access to Colab (model training) and GCP (90-day free trial or pay-as-you-go)
- A Hailo account to download Hailo software
- Raspberry Pi 5 with Hailo-8L AI kit installed. For instructions see How to Set Up Raspberry Pi 5 and Hailo-8L
- Install and follow the basic detection pipeline. For instructions see Hailo RPi5 Basic Pipelines
Quick guide to deploying a YOLOv8s model, re-trained on your own images, to a Raspberry Pi 5 AI kit with the Hailo-8L accelerator:
- Prepare Dataset
- Train YOLOv8s Model on Colab
- Installing Dataflow Compiler with NVIDIA GPU
- Installing and Compiling Using Hailo Model Zoo
- Deploy Model on Raspberry Pi 5
Gather images containing the objects you want to detect; you can find tips for the best training results here. Annotate these images using a tool like CVAT and export them in YOLO 1.1 format (bounding boxes and class labels), or use the hornet3000+ dataset. More information on the YOLOv8n model used to detect hornets with a Raspberry Pi 4 8GB (without AI kit) can be found at vespCV.
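Purely as an illustration of what the annotation tooling produces, here is a minimal sketch of a YOLO-format dataset layout and config; the directory names, the example label values, and the single "hornet" class are assumptions, not part of this guide's fixed setup.

```python
from pathlib import Path

# Illustrative only: paths and the "hornet" class are assumptions.
dataset = Path("dataset")
for split in ("train", "val"):
    (dataset / "images" / split).mkdir(parents=True, exist_ok=True)
    (dataset / "labels" / split).mkdir(parents=True, exist_ok=True)

# One .txt label file per image, one line per bounding box:
# <class_id> <x_center> <y_center> <width> <height>, all normalized to 0..1.
(dataset / "labels" / "train" / "img_0001.txt").write_text("0 0.512 0.430 0.210 0.180\n")

# data.yaml tells the trainer where the images live and what the class ids mean.
(dataset / "data.yaml").write_text(
    "path: dataset\n"
    "train: images/train\n"
    "val: images/val\n"
    "names:\n"
    "  0: hornet\n"
)
```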
Use the annotated images from step 1 to train a YOLOv8s model in Google Colab. The training process will generate a best.onnx file, which represents your trained model.
Do not forget to download the model to your local computer before you stop the Colab notebook!
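As a rough sketch of what the Colab training and export step can look like with the Ultralytics API (the epoch count, image size, and the default run directory are assumptions; adjust them to your dataset):

```python
from ultralytics import YOLO

# Fine-tune the pre-trained YOLOv8s checkpoint on your annotated dataset.
model = YOLO("yolov8s.pt")
model.train(data="dataset/data.yaml", epochs=100, imgsz=640)  # values are illustrative

# Export the best weights to ONNX; this is the best.onnx used in the compilation steps.
best = YOLO("runs/detect/train/weights/best.pt")  # default Ultralytics run directory
best.export(format="onnx", opset=11)  # opset is an assumption; check Hailo's parsing requirements
```

Download the resulting best.onnx to your local computer before the Colab runtime shuts down.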
Install the Dataflow Compiler in a VM on GCP. The Hailo Model Zoo (installed and used in step 4) uses the Hailo Dataflow Compiler for parsing, model optimization, emulation, and compilation of the deep learning models.
The best.onnx file from step 2 is parsed and saved as a .har file by the Model Zoo (installed in step 4), then optimized and saved again as a .har file, and finally compiled to a .hef file. During this process, your dataset of images (from step 1) is needed as calibration data during optimization (and optionally for validation, to create evaluation results) to produce yolov8s.hef (Hailo Executable File).
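The Model Zoo drives the Dataflow Compiler for you, so you normally do not write this code yourself. Purely to illustrate the parse → optimize → compile flow described above, a rough sketch of the compiler's Python API might look like this; file names, the calibration array, and the omitted start/end node arguments are assumptions, and the exact API may differ between Dataflow Compiler versions.

```python
import numpy as np
from hailo_sdk_client import ClientRunner

# Target the Hailo-8L architecture on the Raspberry Pi 5 AI kit.
runner = ClientRunner(hw_arch="hailo8l")

# Parse: ONNX -> Hailo archive (.har).
runner.translate_onnx_model("best.onnx", "yolov8s")
runner.save_har("yolov8s_parsed.har")

# Optimize: quantize using calibration images from your dataset (step 1).
calib_data = np.load("calib_images.npy")  # assumed: preprocessed images, e.g. N x 640 x 640 x 3
runner.optimize(calib_data)
runner.save_har("yolov8s_optimized.har")

# Compile: produce the Hailo Executable File (.hef) for deployment.
hef_binary = runner.compile()
with open("yolov8s.hef", "wb") as f:
    f.write(hef_binary)
```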
Finally, you transfer the yolov8s.hef file to your Raspberry Pi 5. This will involve setting up the Hailo runtime environment and integrating your model into your application.
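The basic pipelines referenced in the prerequisites handle this integration for you. Purely as an illustration of the runtime side, a minimal HailoRT Python sketch might look roughly like this; the dummy input frame is a placeholder, and the API details should be checked against the HailoRT documentation for your installed version.

```python
import numpy as np
from hailo_platform import (HEF, VDevice, ConfigureParams, HailoStreamInterface,
                            InputVStreamParams, OutputVStreamParams, InferVStreams,
                            FormatType)

# Load the compiled model and open the Hailo-8L device over PCIe.
hef = HEF("yolov8s.hef")
with VDevice() as device:
    configure_params = ConfigureParams.create_from_hef(hef, interface=HailoStreamInterface.PCIe)
    network_group = device.configure(hef, configure_params)[0]
    ng_params = network_group.create_params()

    in_params = InputVStreamParams.make(network_group, format_type=FormatType.UINT8)
    out_params = OutputVStreamParams.make(network_group, format_type=FormatType.FLOAT32)
    input_info = hef.get_input_vstream_infos()[0]

    with network_group.activate(ng_params):
        with InferVStreams(network_group, in_params, out_params) as pipeline:
            # Replace this dummy frame with a real camera image resized to the model input.
            frame = np.zeros((1, *input_info.shape), dtype=np.uint8)
            results = pipeline.infer({input_info.name: frame})
            print(list(results.keys()))
```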
While Colab offers free GPU resources, a virtual machine (VM) on GCP is needed for the Docker install of the Hailo Software Suite. Note that GPU instances are not available in the GCP Free Trial.