Robotics integration project repository based on the MIT VNAV (Visual Navigation for Autonomous Vehicles) course.
This repository contains all experimental code and reports for Robotics Integration Group Project I. The content spans basic Linux/ROS environment configuration, UAV motion planning and control, and the implementation and evaluation of Visual Navigation (VNAV) and SLAM systems.
It follows the MIT 16.485 (Visual Navigation for Autonomous Vehicles) 2023 curriculum.
This project is primarily developed in an Ubuntu 20.04 environment and relies on the following core components:
⚠️ Architecture Warning: Due to simulator compatibility issues, Lab 3 and Lab 4 strictly require an x86_64 architecture. These labs are currently NOT compatible with ARM-based systems (e.g., Apple Silicon).
- OS: Ubuntu 20.04 LTS
- ROS: ROS Noetic Ninjemys
- Languages: C++14/17, Python 3.8+
- Build Tools: CMake, Make, Catkin
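
As a quick sanity check of this toolchain, a ROS node can be compiled inside a catkin workspace and run against a local `roscore`. The sketch below is a minimal roscpp publisher; the node, topic, and package names are illustrative placeholders, not identifiers from this repository.

```cpp
// talker.cpp -- minimal roscpp publisher (hypothetical package, e.g. "vnav_examples")
#include <ros/ros.h>
#include <std_msgs/String.h>

int main(int argc, char** argv) {
  ros::init(argc, argv, "talker");   // register the node with the ROS master
  ros::NodeHandle nh;
  ros::Publisher pub = nh.advertise<std_msgs::String>("chatter", 10);

  ros::Rate rate(1.0);               // publish at 1 Hz
  while (ros::ok()) {
    std_msgs::String msg;
    msg.data = "hello from ROS Noetic";
    pub.publish(msg);
    ros::spinOnce();
    rate.sleep();
  }
  return 0;
}
```

Declaring the executable in a package's `CMakeLists.txt` with `add_executable` and `target_link_libraries(... ${catkin_LIBRARIES})`, then building with `catkin build` (or `catkin_make`), confirms that the CMake/Catkin setup listed above is working.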
| Lab | Topic | Description | Link |
|---|---|---|---|
| Lab 1 | Environment Configuration | Basic environment configuration and toolchain familiarization for Linux, C++, Git, and CMake. | Notion |
| Lab 2 | ROS Basics | Installation of ROS 1 (Noetic), node communication, TF coordinate transforms, and basic usage (see the TF sketch below the table). | Notion |
| Lab 3 | 3D Trajectory Following | Implementation of UAV 3D trajectory following and geometric control algorithms. | MIT Lab3 |
| Lab 4 | Drone Control & Racing | Advanced UAV control strategies and simulation of drone racing through gates. | MIT Lab4 |
| Lab 5 | Visual Tracking | Visual frontend processing, feature extraction, and optical flow tracking (Visual Odometry Frontend). | MIT Lab5 |
| Lab 6 | Visual Positioning | Visual backend optimization, pose estimation, and mapping (Visual Odometry Backend). | MIT Lab6 |
| Lab 7 | Visual SLAM Comparison | Performance evaluation of SLAM systems (ORB-SLAM3 vs. Kimera vs. LDSO). | MIT Lab9 |
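
As referenced in the Lab 2 row, the TF coordinate-transform workflow is easy to illustrate with a short listener. The sketch below uses the standard tf2 listener pattern in ROS Noetic; the node name and the `map`/`base_link` frame names are illustrative placeholders, not identifiers from this repository.

```cpp
// tf_lookup.cpp -- minimal tf2 frame lookup (frame names are placeholders)
#include <ros/ros.h>
#include <tf2_ros/transform_listener.h>
#include <tf2/exceptions.h>
#include <geometry_msgs/TransformStamped.h>

int main(int argc, char** argv) {
  ros::init(argc, argv, "tf_lookup");
  ros::NodeHandle nh;

  tf2_ros::Buffer buffer;                       // caches incoming transforms
  tf2_ros::TransformListener listener(buffer);  // subscribes to /tf and /tf_static

  ros::Rate rate(10.0);
  while (ros::ok()) {
    try {
      // Latest available pose of base_link expressed in the map frame.
      geometry_msgs::TransformStamped t =
          buffer.lookupTransform("map", "base_link", ros::Time(0));
      ROS_INFO("base_link in map: x=%.2f y=%.2f z=%.2f",
               t.transform.translation.x,
               t.transform.translation.y,
               t.transform.translation.z);
    } catch (tf2::TransformException& ex) {
      ROS_WARN_THROTTLE(1.0, "TF lookup failed: %s", ex.what());
    }
    rate.sleep();
  }
  return 0;
}
```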