This project analyzes sensor data collected from autonomous vehicles to characterize vehicle behavior, the surrounding environment, and driving conditions. It combines multi-sensor streams, including LiDAR, radar, cameras, IMU, and GPS, to support tasks such as obstacle detection, trajectory prediction, anomaly detection, and environment perception.
Features:
Preprocessing and normalization of heterogeneous sensor datasets (see the preprocessing sketch below).
Sensor fusion from multiple data sources for comprehensive scene understanding (fusion sketch below).
Visualization of 3D point clouds, camera images, and detected object overlays using Matplotlib (plotting sketch below).
Time series analysis of sensor readings and vehicle telemetry (rolling-statistics sketch below).
Anomaly detection in sensor signals and driving patterns (rolling z-score sketch below).
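The sketch below illustrates the preprocessing and normalization step under simple assumptions: telemetry is read from a CSV with a timestamp column plus numeric channels, and each numeric column is z-scored after dropping incomplete rows. The file path and column names are placeholders, not part of any actual dataset.

```python
# Preprocessing sketch (hypothetical file path and column names).
import pandas as pd

def normalize_telemetry(csv_path: str) -> pd.DataFrame:
    """Load a telemetry CSV, drop incomplete rows, and z-score the numeric columns."""
    df = pd.read_csv(csv_path, parse_dates=["timestamp"])  # assumed timestamp column
    df = df.dropna()                                        # discard incomplete readings
    numeric = df.select_dtypes("number").columns
    df[numeric] = (df[numeric] - df[numeric].mean()) / df[numeric].std()
    return df

if __name__ == "__main__":
    clean = normalize_telemetry("data/imu_gps.csv")         # hypothetical path
    print(clean.describe())
```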
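Sensor fusion can be sketched as a nearest-timestamp join between two asynchronously sampled streams. The example below assumes two pandas DataFrames (say, GPS/vehicle telemetry and radar detections) that each carry a datetime `timestamp` column; it stands in for, rather than implements, a full probabilistic fusion filter.

```python
# Fusion sketch: time-align two streams sampled at different rates.
# The "timestamp" column name and the 100 ms tolerance are illustrative assumptions.
import pandas as pd

def fuse_streams(gps: pd.DataFrame, radar: pd.DataFrame) -> pd.DataFrame:
    """Nearest-timestamp join of radar detections onto the GPS/telemetry stream."""
    gps = gps.sort_values("timestamp")
    radar = radar.sort_values("timestamp")
    return pd.merge_asof(
        gps,
        radar,
        on="timestamp",
        direction="nearest",
        tolerance=pd.Timedelta("100ms"),  # ignore matches more than 100 ms apart
    )
```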
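For the plotting feature, a LiDAR-style point cloud can be rendered with Matplotlib's 3D scatter. The snippet assumes an (N, 3) NumPy array of x/y/z coordinates in metres and uses random points as a stand-in for real scan data.

```python
# Plotting sketch: 3D scatter of a point cloud, coloured by height.
import numpy as np
import matplotlib.pyplot as plt

def plot_point_cloud(points: np.ndarray) -> None:
    """Render an (N, 3) point cloud with height-coded colours."""
    fig = plt.figure(figsize=(8, 6))
    ax = fig.add_subplot(projection="3d")
    ax.scatter(points[:, 0], points[:, 1], points[:, 2],
               c=points[:, 2], cmap="viridis", s=1)
    ax.set_xlabel("x [m]")
    ax.set_ylabel("y [m]")
    ax.set_zlabel("z [m]")
    plt.show()

if __name__ == "__main__":
    demo = np.random.rand(5000, 3) * [50.0, 50.0, 3.0]  # synthetic stand-in for a scan
    plot_point_cloud(demo)
```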
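A rolling-statistics sketch for the time-series feature, assuming a DataFrame with a datetime `timestamp` column and a hypothetical `speed_mps` channel: readings are resampled to a fixed 100 ms grid and summarized with one-second rolling means and standard deviations.

```python
# Time-series sketch: resample telemetry and attach rolling statistics.
# Column names ("timestamp", "speed_mps") are illustrative assumptions.
import pandas as pd

def rolling_speed_stats(df: pd.DataFrame) -> pd.DataFrame:
    """Resample speed to 100 ms bins and add 1-second rolling mean/std."""
    speed = (
        df.set_index("timestamp")
          .sort_index()["speed_mps"]
          .resample("100ms").mean()
          .interpolate()
    )
    out = speed.to_frame("speed_mps")
    out["speed_mean_1s"] = speed.rolling("1s").mean()
    out["speed_std_1s"] = speed.rolling("1s").std()
    return out
```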
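Finally, the anomaly-detection feature is sketched as a rolling z-score test on a single telemetry signal; the window size, signal name, and three-sigma threshold are illustrative, and a deployed pipeline might substitute a learned detector.

```python
# Anomaly-detection sketch: flag samples far from the local rolling mean.
# The window size and 3-sigma threshold are illustrative assumptions.
import pandas as pd

def flag_anomalies(signal: pd.Series, window: int = 100, threshold: float = 3.0) -> pd.Series:
    """Return a boolean mask marking samples more than `threshold` sigma from the rolling mean."""
    mean = signal.rolling(window, min_periods=window).mean()
    std = signal.rolling(window, min_periods=window).std()
    z = (signal - mean) / std
    return z.abs() > threshold

# Usage (hypothetical column): telemetry[flag_anomalies(telemetry["accel_x"])]
```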