This repository contains a ROS-based autonomous vehicle navigation system that uses LiDAR point clouds for lane detection and vehicle control on the Polaris GEM e2 and e4 vehicles.
Click the image above to watch our system demonstration video.
This system integrates multiple components for autonomous navigation. Key components:
- Pre-processing and filtering of LiDAR point clouds
- Point Transformer V3 (PTv3) for lane-line segmentation inference
- KISS-ICP for online SLAM
- Frame matching with buffered mapping
- Lane detection and waypoint generation
- Vehicle control
- ROS (tested on ROS Noetic)
- Python 3.8+
- Conda package manager
- NVIDIA GPU
- Clone the repository and set up the ROS workspace:
```bash
cd demo_ws
source devel/setup.bash
```
- Create and activate the Pointcept Conda environment:
```bash
conda env create -f pointcept151.yml -n pointcept151
conda activate pointcept151
```
- Install Point Transformer dependencies:
```bash
# Install pointops for PTv1 & PTv2 or precise evaluation
cd src/pointcept151/libs/pointops
python setup.py install
cd ../../../..   # return to the workspace root

# Install Google's sparsehash and pointgroup_ops
conda install -c bioconda google-sparsehash
cd src/pointcept151/libs/pointgroup_ops
python setup.py install --include_dirs=${CONDA_PREFIX}/include
cd ../../../..   # return to the workspace root
```
- Download the model weights (signal model) from Hugging Face: https://huggingface.co/bryanchang/PTv3_laneline_segemenation_signal
- Remove any existing `build` and `devel` directories, then build the workspace:
```bash
cd demo_ws
catkin_make
source devel/setup.bash
```
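After building, it can help to sanity-check the Python environment before launching anything. The snippet below is a minimal sketch: it assumes the `pointcept151` environment provides PyTorch with CUDA and that the extensions compiled above are importable as `pointops` and `pointgroup_ops` (adjust the names if your build differs).

```python
# Quick sanity check for the pointcept151 environment.
import importlib

import torch

print("PyTorch:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("GPU:", torch.cuda.get_device_name(0))

# Module names for the compiled extensions are assumed here.
for name in ("pointops", "pointgroup_ops"):
    try:
        importlib.import_module(name)
        print(f"{name}: OK")
    except ImportError as exc:
        print(f"{name}: not importable ({exc})")
```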
- Initialize sensors:
```bash
source devel/setup.bash
roslaunch basic_launch sensor_init.launch
```
- Launch visualization tools:
```bash
source devel/setup.bash
roslaunch basic_launch visualization.launch
```
- Enable joystick control:
```bash
source devel/setup.bash
roslaunch basic_launch dbw_joystick.launch
```
- Launch point cloud preprocessing:
```bash
conda activate pointcept151
python3 src/pointcept151/inference_ros_filter.py
```
- Start sequence matching:
```bash
conda activate pointcept151
python3 src/sequence_matching.py
```
- Launch real-time window search:
```bash
conda activate pointcept151
python3 src/windowSearch_realtime.py
```
- Start KISS-ICP SLAM:
```bash
roslaunch src/kiss-icp/ros/launch/odometry.launch topic:=/ouster/points
```
- Launch inference with the near-IR model instead of the default signal model (optional):
```bash
python3 src/pointcept151/inference_ros_filter.py model_type:=near_ir
```
- Launch control module:
```bash
python3 src/gem_lidar_tracker_pp_new.py
```
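If the pipeline produces no output, a quick way to check the input side is to confirm that the LiDAR driver started by `sensor_init.launch` is actually publishing. The snippet below is a small sketch that waits for one message on `/ouster/points` (the topic name used in the KISS-ICP launch command above):

```python
#!/usr/bin/env python3
# Wait for a single LiDAR scan on /ouster/points and report its size.
import rospy
from sensor_msgs.msg import PointCloud2

rospy.init_node("ouster_topic_check", anonymous=True)
try:
    msg = rospy.wait_for_message("/ouster/points", PointCloud2, timeout=10.0)
    print(f"Received cloud: {msg.width * msg.height} points, "
          f"frame_id={msg.header.frame_id}")
except rospy.ROSException:
    print("No message on /ouster/points within 10 s -- is sensor_init.launch running?")
```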
- Raw point cloud data collection and preprocessing, filtering points by range, height, and x/y position
- State-of-the-art deep learning architecture
- Specialized for point cloud processing
- Feature extraction and scene understanding
- Approximately 300-400 ms inference time on an RTX A4000
- Publishes the inference results
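The range/height/x-y filtering mentioned above can be illustrated with a short NumPy sketch; the thresholds and exact logic in `inference_ros_filter.py` may differ.

```python
import numpy as np

def filter_cloud(points, max_range=30.0, z_min=-2.0, z_max=0.5,
                 x_limit=25.0, y_limit=10.0):
    """Keep points within a range, height band, and x/y window.

    points: (N, 3) array of x, y, z in the LiDAR frame.
    All thresholds here are illustrative, not the repository's values.
    """
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    rng = np.sqrt(x ** 2 + y ** 2)
    mask = (
        (rng < max_range)
        & (z > z_min) & (z < z_max)
        & (np.abs(x) < x_limit) & (np.abs(y) < y_limit)
    )
    return points[mask]

# Example: filter a random 100k-point cloud.
cloud = np.random.uniform(-50.0, 50.0, size=(100_000, 3))
print(filter_cloud(cloud).shape)
```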
- Real-time simultaneous localization and mapping
- Online odometry estimation for precise positioning
- Efficient point cloud registration
- Publishes odometry
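Downstream nodes consume this odometry to place each LiDAR frame in a common map frame. A minimal subscriber sketch is shown below; the topic name `/kiss/odometry` is an assumption and should be verified with `rostopic list`.

```python
#!/usr/bin/env python3
# Print the pose estimated by KISS-ICP as odometry messages arrive.
import rospy
from nav_msgs.msg import Odometry

def on_odom(msg):
    p = msg.pose.pose.position
    rospy.loginfo("KISS-ICP pose: x=%.2f y=%.2f z=%.2f", p.x, p.y, p.z)

rospy.init_node("kiss_icp_odom_listener", anonymous=True)
rospy.Subscriber("/kiss/odometry", Odometry, on_odom)  # topic name assumed
rospy.spin()
```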
- Aligns sequential Ouster LiDAR frames using the KISS-ICP odometry
- Maintains a rolling buffer of the LiDAR frames (mapping)
- Performs DBSCAN clustering and window search for lane-line detection
- Publishes waypoints
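As a rough illustration of this stage (not the exact logic of `sequence_matching.py` or `windowSearch_realtime.py`), the sketch below clusters bird's-eye-view lane points with scikit-learn's DBSCAN and slides a window along each cluster to produce waypoints:

```python
import numpy as np
from sklearn.cluster import DBSCAN

def lane_waypoints(points_xy, eps=0.5, min_samples=10, window=2.0):
    """Cluster 2-D lane points and return waypoints per cluster.

    points_xy: (N, 2) bird's-eye-view points believed to lie on lane lines.
    Parameters are illustrative only.
    """
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(points_xy)
    waypoints = {}
    for label in set(labels) - {-1}:  # -1 marks DBSCAN noise
        cluster = points_xy[labels == label]
        pts = []
        for x0 in np.arange(cluster[:, 0].min(), cluster[:, 0].max(), window):
            in_win = cluster[(cluster[:, 0] >= x0) & (cluster[:, 0] < x0 + window)]
            if len(in_win):
                # One waypoint per window: window centre in x, mean y of the points.
                pts.append([x0 + window / 2.0, in_win[:, 1].mean()])
        waypoints[label] = np.array(pts)
    return waypoints
```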
- Waypoint-based trajectory planning
- Adaptive steering and speed control (PID)
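The PID part of the controller can be sketched generically; the gains and structure below are placeholders, not the values used in `gem_lidar_tracker_pp_new.py`.

```python
class PID:
    """Simple PID controller, e.g. for tracking a target speed."""

    def __init__(self, kp, ki, kd, output_limits=(-1.0, 1.0)):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.low, self.high = output_limits
        self.integral = 0.0
        self.prev_error = None

    def step(self, error, dt):
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        u = self.kp * error + self.ki * self.integral + self.kd * derivative
        return max(self.low, min(self.high, u))  # clamp to output limits

# Example: one 20 Hz control step toward a 2 m/s target speed.
speed_pid = PID(kp=0.5, ki=0.1, kd=0.05)
throttle = speed_pid.step(error=2.0 - 0.0, dt=0.05)
print(f"throttle command: {throttle:.2f}")
```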
This project is licensed under the MIT License - see the LICENSE file for details.
- Point Transformer V3 implementation team
- KISS-ICP SLAM system developers
- ROS community and contributors