Implement an Extended Kalman Filter to track the three dimensional position and orientation of a robot using gyroscope, accelerometer, and camera measurements.

Visual-Inertial SLAM

Author: Zach (Muqing) Li


SLAM Overview

This project implements a Visual-Inertial SLAM system using stereo camera, gyroscope, and accelerometer data to perform simultaneous localization and mapping with an Extended Kalman Filter (EKF). The app provides a user-friendly 3D interface for playback and inspection of the SLAM process.

Methodology

Technically, the goal is to implement an EKF prediction step based on SE(3) kinematics driven by IMU measurements, and an EKF update step based on the stereo-camera observation model with visual feature observations, in order to perform localization and mapping.

Methodology breakdown:

  1. IMU Localization via EKF Prediction: Implement EKF prediction based on SE(3) kinematics with linear and angular velocity measurements to estimate the IMU's pose Tt ∈ SE(3) over time t.
  2. Landmark Mapping via EKF Update: Implement an EKF with the unknown landmark positions m ∈ R^(3×M) as part of the state, and perform an EKF update after every visual observation zt to track the mean and covariance of m.
  3. Visual-Inertial SLAM: Combine the IMU prediction step from (1) with the landmark update step from (2), and implement an IMU update step based on the stereo-camera observation model to complete the visual-inertial SLAM algorithm.
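
The SE(3) prediction step in (1) can be sketched as follows. This is a minimal illustration using numpy/scipy, not code from the repository; the function and variable names here are hypothetical. The pose mean is propagated by right-multiplying with the matrix exponential of the measured twist scaled by the time step:

```python
import numpy as np
from scipy.linalg import expm

def hat(w):
    """Skew-symmetric (hat) map of a 3-vector, so hat(w) @ x == np.cross(w, x)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def twist_hat(v, w):
    """4x4 twist matrix for linear velocity v and angular velocity w."""
    xi = np.zeros((4, 4))
    xi[:3, :3] = hat(w)
    xi[:3, 3] = v
    return xi

def ekf_predict_pose(T, v, w, tau):
    """EKF mean propagation over time step tau: T <- T @ expm(tau * twist_hat(v, w))."""
    return T @ expm(tau * twist_hat(v, w))
```

The covariance would be propagated alongside the mean using the adjoint of the same twist; only the mean update is shown here.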

File Structure

Visual-Inertial-SLAM/
│
├── app.py                        # Streamlit app for SLAM playback and visualization
├── main.py                       # Core script to run SLAM logic
├── ekf_slam_stepper.py           # EKF SLAM step-by-step processor
├── helpers.py                    # Projection and stereo camera matrix utilities
├── pr3_utils.py                  # Provided utility functions for stereo geometry
├── requirements.txt              # Required Python packages
├── README.md                     # Project overview and execution guide
├── Visual-Inertial-SLAM_Report.pdf   # Technical project report
├── LICENSE                       # License file
│
├── data/                         # Input data files (.npz format)
│   ├── 03.npz
│   └── 10.npz
│
├── figure/                       # Output visualizations
│   ├── 03.png
│   └── 03_visual_inertial.png
│   └── saved/
│       ├── 03.png
│       ├── 03_compare.png
│       ├── 03_visual_inertial.png
│       └── 10.png
│
└── .git/                         # Git metadata (hidden folder)

How to Run

To run the visualization tool:

cd Visual-Inertial-SLAM
streamlit run app.py

To run the baseline SLAM script:

python main.py

SLAM Visualization App Functionality

This application demonstrates a visual-inertial SLAM engine for real-time 3D pose estimation and environmental mapping. It fuses stereo camera, gyroscope, and accelerometer measurements through an Extended Kalman Filter (EKF) that operates in SE(3), allowing for simultaneous motion tracking and world-frame landmark estimation.

The tool features a full 3D visualization of the evolving robot pose and environment map, with both manual step-by-step and automatic playback modes. This serves as a high-fidelity simulation and diagnostic interface for autonomous navigation research.
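
The stereo-camera observation model that drives the EKF update can be sketched as below. This is an illustrative standard rectified-stereo projection, not code from the repository; the parameter names (focal lengths fsu/fsv, principal point cu/cv, baseline b) are assumptions:

```python
import numpy as np

def stereo_project(m_cam, fsu, fsv, cu, cv, b):
    """Project a 3D point in the left-camera frame to stereo pixel
    coordinates [uL, vL, uR, vR] under the rectified stereo model."""
    x, y, z = m_cam
    uL = fsu * x / z + cu
    vL = fsv * y / z + cv
    uR = fsu * (x - b) / z + cu  # right camera shifted by the baseline b
    vR = vL                      # rectified cameras share the same row
    return np.array([uL, vL, uR, vR])
```

During the update, the filter compares predicted projections of the landmark means against observed features and corrects the state via the Kalman gain.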

The Streamlit app provides:

  • Manual Playback: Step through SLAM results frame by frame with custom delay
  • Autoplay Mode: Animate the full trajectory with adjustable speed
  • Progress Tracking: Real-time progress bar during visualization
  • Reset Option: Reset playback to timestamp 0
  • 3D Visualization: Interactive trajectory and landmarks with Plotly

Result and Output

2D Mapping

Figure 1: IMU Localization from EKF Prediction


Figure 2: Estimated Trajectory with EKF Landmark Mapping


3D Real-time Tracking and Mapping

Figure 3: Status Bar and Timestamp Adjustment

Figure 4: SLAM Pose and Estimation

Key Contributions and Significance

This tool bridges robotics state estimation with practical diagnostics and research visualization. Its main components are:

  • EKF prediction with SE(3) motion modeling
  • Landmark update using stereo camera geometry
  • Real-time 3D animation with Plotly
  • Integration of advanced SLAM logic with a user-driven interface
