LiDAR-Based Lane Navigation

This repository contains a ROS-based autonomous vehicle navigation system that uses LiDAR point clouds for lane detection and vehicle control on Polaris GEM e2 and e4.

Demo

System Demo

Click the image above to watch our system demonstration video.

System Overview

This system integrates multiple components for autonomous navigation:

Pipeline Diagram

Key components:

  • Pre-processing and filtering of LiDAR point clouds
  • Point Transformer V3 for inference
  • KISS-ICP for online SLAM
  • Frame matching with buffered mapping
  • Lane detection and waypoint generation
  • Vehicle control

Prerequisites

  • ROS (tested on ROS Noetic)
  • Python 3.8+
  • Conda package manager
  • NVIDIA GPU

Installation

  1. Clone the repository and set up the ROS workspace:
git clone https://github.com/Bryan1203/LiDAR-Based-Lane-Navigation.git
cd demo_ws
source devel/setup.bash
  2. Create and activate the Pointcept Conda environment:
conda env create -f pointcept151.yml -n pointcept151
conda activate pointcept151
  3. Install the Point Transformer dependencies:
# Install pointops for PTv1 & PTv2 or precise evaluation
cd src/pointcept151/libs/pointops
python setup.py install

# Install Google's sparsehash and pointgroup_ops
conda install -c bioconda google-sparsehash
cd ../pointgroup_ops
python setup.py install --include_dirs=${CONDA_PREFIX}/include
cd ../../../..
  4. Download the model weights (signal model) from Hugging Face: https://huggingface.co/bryanchang/PTv3_laneline_segemenation_signal

Usage

Initial Setup

  1. Remove any existing build and devel directories
  2. cd demo_ws
  3. Build the workspace:
catkin_make
source devel/setup.bash

Launch Sequence

  1. Initialize sensors:
source devel/setup.bash
roslaunch basic_launch sensor_init.launch
  2. Launch visualization tools:
source devel/setup.bash
roslaunch basic_launch visualization.launch
  3. Enable joystick control:
source devel/setup.bash
roslaunch basic_launch dbw_joystick.launch

Navigation Pipeline

  1. Launch point cloud preprocessing:
conda activate pointcept151
python3 src/pointcept151/inference_ros_filter.py
  2. Start sequence matching:
conda activate pointcept151
python3 src/sequence_matching.py
  3. Launch the real-time window search:
conda activate pointcept151
python3 src/windowSearch_realtime.py
  4. Start KISS-ICP SLAM:
roslaunch src/kiss-icp/ros/launch/odometry.launch topic:=/ouster/points
  5. (Optional) Launch inference with the near-IR model instead of the default signal model:
python3 src/pointcept151/inference_ros_filter.py model_type:=near_ir
  6. Launch the control module:
python3 src/gem_lidar_tracker_pp_new.py

System Components

1. LiDAR Point Cloud Processing (src/pointcept151/inference_ros_filter.py)

  • Collects raw point cloud data and pre-filters it by range, height, and x/y position
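
The pre-filtering step can be sketched as boolean masking over the raw point array. This is a minimal NumPy illustration, not the repository's code; the function name and all threshold values are illustrative assumptions.

```python
import numpy as np

def filter_points(points, r_min=1.0, r_max=40.0,
                  z_min=-2.5, z_max=0.5,
                  x_bounds=(-20.0, 20.0), y_bounds=(-10.0, 10.0)):
    """Keep points inside a range, height, and x/y window.

    points: (N, 3) array of x, y, z coordinates in the sensor frame.
    All thresholds are illustrative defaults, not the repository's values.
    """
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    r = np.hypot(x, y)                      # horizontal range from the sensor
    mask = (
        (r >= r_min) & (r <= r_max)         # drop too-near / too-far returns
        & (z >= z_min) & (z <= z_max)       # keep a height band near the road
        & (x >= x_bounds[0]) & (x <= x_bounds[1])
        & (y >= y_bounds[0]) & (y <= y_bounds[1])
    )
    return points[mask]
```

Cutting the cloud down this way before inference keeps the Point Transformer input small, which matters given the 300-400 ms inference budget noted below.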

2. Point Transformer V3 Inference (src/pointcept151/inference_ros_filter.py)

  • State-of-the-art deep learning architecture
  • Specialized for point cloud processing
  • Feature extraction and scene understanding
  • Roughly 300-400 ms inference time on an RTX A4000
  • Publishes the inference results

3. KISS-ICP SLAM (src/kiss-icp)

  • Real-time simultaneous localization and mapping
  • Online odometry estimation for precise positioning
  • Efficient point cloud registration
  • Publishes odometry

4. Frame Matching (src/sequence_matching.py)

  • Matches the Ouster LiDAR frame sequence against the KISS-ICP odometry stream
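
One common way to pair a LiDAR frame with an odometry message is nearest-timestamp matching within a tolerance. The sketch below only illustrates that idea; the repository's buffered matching may differ, and the function name and tolerance are assumptions.

```python
import bisect

def match_frames(cloud_stamps, odom_stamps, tol=0.05):
    """Pair each LiDAR frame with the nearest odometry stamp (seconds).

    cloud_stamps / odom_stamps: sorted lists of message timestamps.
    Returns (cloud_index, odom_index) pairs; frames with no odometry
    within `tol` seconds are skipped. Tolerance value is illustrative.
    """
    pairs = []
    for i, t in enumerate(cloud_stamps):
        j = bisect.bisect_left(odom_stamps, t)
        # candidates: the odometry stamp just before and just after t
        best = None
        for k in (j - 1, j):
            if 0 <= k < len(odom_stamps):
                if best is None or abs(odom_stamps[k] - t) < abs(odom_stamps[best] - t):
                    best = k
        if best is not None and abs(odom_stamps[best] - t) <= tol:
            pairs.append((i, best))
    return pairs
```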

5. Lane Detection (src/windowSearch_realtime.py)

  • Maintains a rolling buffer of the LiDAR frames (mapping)
  • Performs DBSCAN clustering and a sliding-window search for lane line detection
  • Publishes waypoints
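
The window-search step can be pictured as sliding bins forward along the driving direction and tracking the lane line's lateral centroid. This NumPy sketch covers only the window search (the DBSCAN clustering step is omitted); the function name and all parameters are illustrative assumptions, not the repository's values.

```python
import numpy as np

def window_search(points, x_start=0.0, x_end=20.0, n_windows=10, half_width=0.5):
    """Slide windows forward along x, tracking the lane line's y centroid.

    points: (N, 2) array of x, y lane-candidate points (e.g. clustered
    lane returns). Returns one (x_center, y_center) waypoint per window
    that contained points. Parameters are illustrative.
    """
    edges = np.linspace(x_start, x_end, n_windows + 1)
    waypoints = []
    y_center = 0.0                      # previous centroid seeds the next window
    for lo, hi in zip(edges[:-1], edges[1:]):
        in_win = (
            (points[:, 0] >= lo) & (points[:, 0] < hi)
            & (np.abs(points[:, 1] - y_center) <= half_width)
        )
        if np.any(in_win):
            y_center = points[in_win, 1].mean()
            waypoints.append(((lo + hi) / 2.0, y_center))
    return waypoints
```

Seeding each window from the previous centroid lets the search follow a curving lane line while rejecting outliers far from it.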

6. Vehicle Control (src/gem_lidar_tracker_pp_new.py)

  • Waypoint-based trajectory planning
  • Adaptive steering and speed control (PID)
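
The "pp" in the controller's filename suggests pure pursuit tracking; assuming that, the steering side can be sketched with the standard bicycle-model geometry. This is not the repository's controller; the function name, wheelbase, and steering limit are illustrative assumptions.

```python
import math

def pure_pursuit_steering(waypoint, wheelbase=1.75, max_steer=0.61):
    """Pure pursuit steering toward a waypoint in the vehicle frame.

    waypoint: (x, y) of the lookahead point, x forward, y left (meters).
    wheelbase and max_steer are illustrative values, not the GEM's specs.
    Returns a steering angle in radians, clamped to +/- max_steer.
    """
    x, y = waypoint
    ld2 = x * x + y * y                   # squared lookahead distance
    if ld2 == 0.0:
        return 0.0
    kappa = 2.0 * y / ld2                 # curvature of the arc through the waypoint
    steer = math.atan(wheelbase * kappa)  # bicycle-model steering angle
    return max(-max_steer, min(max_steer, steer))
```

A waypoint straight ahead yields zero steering, and waypoints left or right of the heading yield symmetric steering angles; speed would be handled separately, e.g. by the PID loop mentioned above.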

License

This project is licensed under the MIT License - see the LICENSE file for details.

Acknowledgments

  • Point Transformer V3 implementation team
  • KISS-ICP SLAM system developers
  • ROS community and contributors
