This repository contains a ROS2 + PX4 SITL + Gazebo stack for a quadrotor MPC with LiDAR-based obstacle avoidance. The MPC runs inside ROS2, consumes PX4 odometry and LaserScan data, and outputs trajectory setpoints with velocity feedforward.
- `drone_ws/src/drone/drone/mpc_core.py` - Clean CasADi MPC solver with obstacle-aware reference trajectory
- `drone_ws/src/drone/drone/mpc_obstacle_avoidance_node_v2.py` - ROS2 node with direct PX4 control and velocity feedforward
- `drone_ws/src/drone/drone/mpc_visualizer.py` - Real-time 3D visualization tool
- `drone_ws/src/drone/launch/mpc_obstacle_avoidance_v2.launch.py` - Standalone launch file (no separate commander needed)
- `drone_ws/src/drone/drone/mpc_obstalce_avoidance_node.py` - Original ROS2 node (requires a separate commander)
- `drone_ws/src/drone/drone/mpc_solver.py` - Original CasADi MPC formulation
- `drone_ws/src/drone/drone/mpc_mission_commander.py` - Offboard commander that tracks the MPC trajectory
- `drone_ws/src/drone/drone/hover_enable_commander.py` - Hover gate that enables the MPC once altitude is stable
- `drone_ws/scripts/plot_topics.py` and `drone_ws/scripts/plot_csv.py` - Plotters for rosbag and CSV data
The MPC v2 is a complete rewrite addressing issues found in v1. Key improvements:
| Feature | v1 | v2 |
|---|---|---|
| PX4 Control | Via separate commander node | Direct TrajectorySetpoint |
| Velocity | Position-only commands | Velocity feedforward |
| Obstacle Constraints | Half-space constraints (can fail) | Distance-based soft constraints |
| Reference Trajectory | Straight line from current position (or reused previous solution) | Obstacle-aware potential field |
| Architecture | Multiple nodes | Single self-contained node |
| Speed | Slow (~0.1 m/s) | Fast (~2-3 m/s) |
```mermaid
flowchart TB
subgraph node["mpc_obstacle_avoidance_node_v2"]
direction TB
subgraph processing["Processing Pipeline"]
direction LR
LP["🔍 LidarProcessor<br/><i>clustering</i>"]
GN["🧭 APF (Artificial Potential Field)-GapDetector<br/><i>path planning</i>"]
MPC["⚡ MPCSolver<br/><i>optimization</i>"]
LP --> GN --> MPC
end
subgraph px4ctrl["Direct PX4 Control Interface"]
TS["📍 TrajectorySetpoint<br/>position + velocity + yaw"]
OCM["🎮 OffboardControlMode<br/>at 10Hz"]
VC["🚁 VehicleCommand<br/>arm, offboard mode"]
end
processing --> px4ctrl
end
subgraph px4["PX4 SITL"]
CTRL["Flight Controller"]
end
subgraph sensors["Sensors"]
ODOM["/fmu/out/vehicle_odometry"]
SCAN["/scan - LiDAR"]
end
node -->|"/fmu/in/trajectory_setpoint"| px4
px4 --> ODOM
ODOM -->|"Position, Velocity, Orientation"| node
SCAN -->|"LaserScan"| node
style node fill:#e1f5fe,stroke:#01579b
style px4 fill:#fff3e0,stroke:#e65100
style processing fill:#f3e5f5,stroke:#7b1fa2
style px4ctrl fill:#e8f5e9,stroke:#2e7d32
```
The MPC uses an 8-state double-integrator model, with state $x = [p_x, p_y, p_z, v_x, v_y, v_z, \psi, \dot{\psi}]^T$ (position, velocity, yaw, and yaw rate).
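A sketch of the implied discrete-time dynamics, assuming the control inputs are the commanded acceleration $a_k$ and yaw acceleration $\alpha_k$ integrated over one step $\Delta t$ (`dt`); the exact discretization in `mpc_core.py` may differ:

$$p_{k+1} = p_k + v_k\,\Delta t + \tfrac{1}{2}a_k\,\Delta t^2, \qquad v_{k+1} = v_k + a_k\,\Delta t$$

$$\psi_{k+1} = \psi_k + \dot{\psi}_k\,\Delta t, \qquad \dot{\psi}_{k+1} = \dot{\psi}_k + \alpha_k\,\Delta t$$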
Unlike v1's straight-line reference, v2 uses a potential field approach to generate obstacle-aware references [1, 2]:
$$\vec{d}_{ref} \;=\; \hat{d}_{attractive} \;+\; \sum_{i} \vec{F}_{repulsive,i}$$

Where:

- $\hat{d}_{attractive}$ = unit vector toward the goal (attractive potential gradient)
- $\vec{F}_{repulsive,i}$ = repulsive force from obstacle $i$, pointing away from the obstacle and growing as the drone approaches it
References:
- E. Rimon and D. E. Koditschek, "Exact Robot Navigation Using Artificial Potential Functions", IEEE Transactions on Robotics and Automation, vol. 8, no. 5, pp. 501-518, 1992. [DOI]
- S. S. Ge and Y. J. Cui, "Dynamic Motion Planning for Mobile Robots Using Potential Field Method", Autonomous Robots, vol. 13, pp. 207-222, 2002. [DOI]
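A minimal Python sketch of this reference-direction construction in the spirit of [1] and [2] (a hypothetical helper, not the code in `mpc_core.py`; the gain `k_rep` and influence radius `d_influence` are assumed parameters):

```python
import numpy as np

def apf_reference_direction(pos, goal, obstacles, k_rep=1.0, d_influence=3.0):
    """Blend an attractive unit vector toward the goal with repulsive terms
    from nearby obstacles (classic APF in the spirit of [1], [2])."""
    d_attr = goal - pos
    d_attr = d_attr / (np.linalg.norm(d_attr) + 1e-9)      # unit vector toward the goal

    f_rep = np.zeros_like(pos, dtype=float)
    for obs in obstacles:                                    # obstacle centers, same shape as pos
        diff = pos - obs
        dist = np.linalg.norm(diff)
        if 1e-6 < dist < d_influence:                        # only nearby obstacles repel
            # Grows as the drone approaches the obstacle, vanishes at d_influence.
            f_rep += k_rep * (1.0 / dist - 1.0 / d_influence) * (diff / dist)

    direction = d_attr + f_rep
    return direction / (np.linalg.norm(direction) + 1e-9)

# Example: goal ahead, one obstacle slightly to the right of the direct path.
ref_dir = apf_reference_direction(np.array([0.0, 0.0, 1.5]),
                                  np.array([10.0, 10.0, 1.5]),
                                  [np.array([3.0, 2.5, 1.5])])
```

The per-step reference can then be built by stepping along this direction at the desired cruise speed.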
Instead of half-space constraints, v2 uses distance-based soft constraints:
$$d_{kj} \;\ge\; r_s - s_{kj}, \qquad s_{kj} \ge 0$$

Where:

- $d_{kj} = \lVert p_k - o_j \rVert$ = Euclidean distance from the predicted position $p_k$ to obstacle $j$
- $r_s$ = safety radius (default 1.2 m)
- $s_{kj}$ = slack variable (penalized heavily in the cost)
This formulation:
- Never causes infeasibility (slack allows violation with high cost)
- Works in all directions (not just normal direction)
- Handles moving closer to obstacles gracefully
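A minimal CasADi sketch of one such soft constraint for a single horizon step and obstacle (illustrative only; the names `p_k`, `obs_j`, and the penalty weight are assumptions, not the repository's exact formulation):

```python
import casadi as ca

opti = ca.Opti()
p_k = opti.variable(3)             # predicted position at horizon step k
s_kj = opti.variable()             # slack for obstacle j at step k
obs_j = opti.parameter(3)          # obstacle position (from the LiDAR clusters)
r_s = 1.2                          # safety radius [m]
w_slack = 1e3                      # heavy slack penalty

# Soft distance constraint: the distance may dip below r_s only by paying slack.
dist = ca.norm_2(p_k - obs_j)
opti.subject_to(dist >= r_s - s_kj)
opti.subject_to(s_kj >= 0)
opti.minimize(w_slack * s_kj**2)   # added to the tracking cost in the full problem
```

Summing these slack penalties over all steps and obstacles keeps the problem feasible even when an obstacle enters the safety radius.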
The key to fast response is velocity feedforward. The TrajectorySetpoint includes both position and velocity:
```python
from px4_msgs.msg import TrajectorySetpoint

traj_msg = TrajectorySetpoint()
traj_msg.position = [x, y, z]        # Where to go
traj_msg.velocity = [vx, vy, vz]     # How fast to go (from the MPC!)
traj_msg.yaw = yaw
traj_msg.yawspeed = yaw_rate
```

This tells PX4's controller the intended velocity, so it doesn't have to "guess" and ramp up slowly.
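The node also keeps publishing the `OffboardControlMode` heartbeat shown in the diagram (PX4 leaves offboard mode if this stream stalls). A minimal rclpy sketch of such a heartbeat; the topic name and flag choice follow the standard PX4 uXRCE-DDS setup and are assumptions about this particular node:

```python
import rclpy
from rclpy.node import Node
from px4_msgs.msg import OffboardControlMode

class OffboardHeartbeat(Node):
    """Publish the OffboardControlMode heartbeat PX4 needs to stay in offboard mode."""

    def __init__(self):
        super().__init__('offboard_heartbeat')
        self.pub = self.create_publisher(OffboardControlMode, '/fmu/in/offboard_control_mode', 10)
        self.timer = self.create_timer(0.1, self.tick)   # 10 Hz, matching the diagram above

    def tick(self):
        msg = OffboardControlMode()
        msg.timestamp = int(self.get_clock().now().nanoseconds / 1000)  # PX4 expects microseconds
        msg.position = True       # position setpoints; the velocity field then acts as feedforward
        msg.velocity = False
        msg.acceleration = False
        self.pub.publish(msg)

def main():
    rclpy.init()
    rclpy.spin(OffboardHeartbeat())

if __name__ == '__main__':
    main()
```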
For complex environments like narrow corridors, v2 includes optional gap-based navigation (same algorithm as v1). Enable with:
```bash
ros2 launch drone mpc_obstacle_avoidance_v2.launch.py gap_navigation_enabled:=true
```

| Parameter | Default | Description |
|---|---|---|
| `dt` | 0.1 | Time step [s] |
| `horizon` | 20 | Prediction horizon steps |
| `safety_radius` | 1.2 | Min distance from obstacles [m] |
| `emergency_radius` | 0.6 | Emergency brake distance [m] |
| `max_velocity` | 3.0 | Max velocity [m/s] |
| `max_acceleration` | 4.0 | Max acceleration [m/s²] |
| `max_yaw_rate` | 1.5 | Max yaw rate [rad/s] |
| Parameter | Default | Effect |
|---|---|---|
| `Q_pos` | 50.0 | Position tracking (higher = follows reference tighter) |
| `Q_vel` | 0.5 | Velocity penalty (lower = faster) |
| `R_acc` | 0.05 | Acceleration penalty (lower = more aggressive) |
| `Q_terminal` | 200.0 | Terminal cost (higher = reaches goal more precisely) |
| Parameter | Default | Description |
|---|---|---|
| `direct_px4_control` | true | Send commands directly to PX4 |
| `velocity_feedforward` | true | Include velocity in setpoint |
| `lookahead_index` | 5 | Trajectory index for position setpoint |
| `auto_arm` | true | Automatically arm and switch to offboard |
```bash
# Build
cd ~/Drone-optimal-trajectory/drone_ws
colcon build --packages-select drone
source install/setup.bash

# Run (single command - no separate commander needed!)
ros2 launch drone mpc_obstacle_avoidance_v2.launch.py goal:='[10.0, 10.0, 1.5]'

# With custom parameters
ros2 launch drone mpc_obstacle_avoidance_v2.launch.py \
    goal:='[10.0, 10.0, 1.5]' \
    max_velocity:=4.0 \
    safety_radius:=1.5 \
    R_acc:=0.02

# Run 3D visualizer (in separate terminal)
ros2 run drone mpc_visualizer
```
**Drone too slow?**

- Decrease `R_acc` (e.g., 0.02)
- Decrease `Q_vel` (e.g., 0.1)
- Increase `max_velocity` (e.g., 4.0)
- Increase `max_acceleration` (e.g., 5.0)
**Drone hits obstacles?**

- Increase `safety_radius` (e.g., 1.5)
- Increase `emergency_radius` (e.g., 0.8)
- Decrease `max_velocity` (e.g., 2.0)
**Drone oscillates near obstacles?**

- Increase `Q_pos` for tighter reference tracking
- Enable gap navigation for complex environments
The solver in `mpc_solver.py` uses a double-integrator model with yaw and yaw rate included in the state.

`mpc_obstalce_avoidance_node.py` builds a reference trajectory in ENU by:
- Reusing the previous MPC solution if it is still close to the current state.
- Otherwise, generating a straight-line segment to the current goal at a constant desired speed.
Yaw reference is aligned with the line from the current position to the goal.
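A minimal sketch of the straight-line branch (a hypothetical helper; the actual node also handles reuse of the previous solution and frame conventions):

```python
import numpy as np

def straight_line_reference(pos, goal, v_des, dt, horizon):
    """Sample a constant-speed straight line from the current position toward the goal."""
    direction = goal - pos
    dist = np.linalg.norm(direction)
    if dist < 1e-6:
        return np.tile(pos, (horizon + 1, 1)), 0.0
    direction = direction / dist
    yaw_ref = float(np.arctan2(direction[1], direction[0]))   # yaw aligned with the line to the goal
    steps = np.minimum(np.arange(horizon + 1) * v_des * dt, dist)
    ref = pos + steps[:, None] * direction                    # positions clamped at the goal
    return ref, yaw_ref
```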
For each step, the solver builds half-space constraints from LiDAR points; only the closest points (up to `m_planes`) within a maximum range are used to build the half-spaces.
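A typical form for such a constraint (a sketch of the structure; $n_j$ is assumed to be the unit normal pointing from the LiDAR point $o_j$ toward the drone, and $s_{kj}$ the non-negative slack mentioned in the constraint list below):

$$n_j^{\top}(p_k - o_j) \;\ge\; r_s - s_{kj}, \qquad s_{kj} \ge 0$$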
The MPC uses a two-layer approach for obstacle avoidance. While the half-space constraints handle low-level collision avoidance, a gap-based navigation layer provides high-level path planning through cluttered environments. This is essential for handling non-convex obstacle configurations (e.g., a cylinder blocking the direct path where the drone must choose to go left or right).
The gap detection analyzes the LiDAR scan to find free corridors:
1. **Identify free directions**: A scan ray is marked as "free" if:
   - Range exceeds 90% of `max_obs_range` (no obstacle detected)
   - Range is invalid/infinite (clear path)
   - Range is below the minimum threshold (sensor noise)

2. **Group consecutive free rays into gaps**: The algorithm sweeps through the scan and groups consecutive free directions. Each gap is characterized by:
   - Start/end angles (angular extent)
   - Center angle (middle direction)
   - Minimum range (depth of the gap)
   - Angular width

3. **Filter by minimum width**: Gaps narrower than `gap_min_width` (default 1.0 m at max range) are discarded.

4. **Handle wrap-around**: If a gap spans the scan boundaries (e.g., -π to +π), the algorithm merges the first and last gaps.
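A minimal sketch of steps 1 and 2 on a raw scan (illustrative only; the minimum-range noise check, the width filter, and the wrap-around merge are omitted):

```python
import numpy as np

def find_gaps(ranges, angles, max_obs_range=10.0):
    """Group consecutive 'free' rays of a LaserScan into candidate gaps."""
    ranges = np.asarray(ranges, dtype=float)
    free = (~np.isfinite(ranges)) | (ranges > 0.9 * max_obs_range)   # step 1: free rays

    gaps, start = [], None
    for i, is_free in enumerate(free):                               # step 2: group runs
        if is_free and start is None:
            start = i
        elif not is_free and start is not None:
            gaps.append(_make_gap(ranges, angles, start, i - 1, max_obs_range))
            start = None
    if start is not None:
        gaps.append(_make_gap(ranges, angles, start, len(free) - 1, max_obs_range))
    return gaps

def _make_gap(ranges, angles, i0, i1, max_obs_range):
    seg = np.nan_to_num(ranges[i0:i1 + 1], nan=max_obs_range, posinf=max_obs_range)
    return {
        "start_angle": angles[i0],
        "end_angle": angles[i1],
        "center_angle": 0.5 * (angles[i0] + angles[i1]),
        "min_range": float(seg.min()),
        "width": angles[i1] - angles[i0],
    }
```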
When the direct path to the goal is blocked, the algorithm scores each gap:
$$\text{score} = w_{align}\cdot\text{alignment} + (1 - w_{align})\cdot\text{quality}$$

where:

- **Alignment** (0 to 1): how well the gap center points toward the goal direction
  $$\text{alignment} = \frac{\pi - |\theta_{gap} - \theta_{goal}|}{\pi}$$
- **Quality**: combined measure of gap width and depth
  $$\text{quality} = \frac{w_{gap}}{\pi} \cdot \frac{r_{min}}{r_{max}}$$
- $w_{align}$ is `gap_alignment_weight` (default 0.85, strongly preferring the goal direction)
Additional scoring modifiers:
- Gaps >90° off goal direction: 80% penalty
- Gaps >60° off goal direction: 50% penalty
- Gaps <30° off goal direction: 30% bonus
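A minimal sketch of this scoring, reusing the gap dictionaries from the sketch above (the multipliers mirror the listed percentages; the node's exact values may differ):

```python
import numpy as np

def score_gap(gap, goal_angle, r_max, w_align=0.85):
    """Score a gap by goal alignment and gap quality, with angular penalties/bonuses."""
    off = abs(np.arctan2(np.sin(gap["center_angle"] - goal_angle),
                         np.cos(gap["center_angle"] - goal_angle)))   # wrapped angular offset
    alignment = (np.pi - off) / np.pi
    quality = (gap["width"] / np.pi) * (gap["min_range"] / r_max)
    score = w_align * alignment + (1.0 - w_align) * quality

    if off > np.radians(90):       # far off the goal direction: 80% penalty
        score *= 0.2
    elif off > np.radians(60):     # 50% penalty
        score *= 0.5
    elif off < np.radians(30):     # roughly toward the goal: 30% bonus
        score *= 1.3
    return score
```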
Once the best gap is selected, an intermediate goal is placed through the gap at a distance of `gap_goal_distance` along the gap's center direction.
To prevent oscillation between gaps:
- **Hysteresis**: The intermediate goal only changes if the new position differs by more than `gap_hysteresis` (default 1.0 m)
- **Spin detection**: If the drone is rotating fast (>0.3 rad/s) but moving slowly (<0.5 m/s), the intermediate goal is locked
- **Direct path check**: A cone of ±17° around the goal direction is checked; if it is clear beyond `direct_path_threshold`, gap navigation is bypassed
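A minimal sketch of the resulting update decision (a hypothetical helper using the thresholds listed above):

```python
import numpy as np

def should_update_intermediate_goal(new_goal, current_goal, yaw_rate, speed,
                                    gap_hysteresis=1.0):
    """Decide whether to switch to a newly proposed intermediate goal."""
    if abs(yaw_rate) > 0.3 and speed < 0.5:          # spin detection: keep the goal locked
        return False
    if current_goal is None:
        return True
    # Hysteresis: only switch if the proposed goal moved far enough.
    return np.linalg.norm(np.asarray(new_goal) - np.asarray(current_goal)) > gap_hysteresis
```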
| Parameter | Default | Description |
|---|---|---|
| `gap_nav_enabled` | true | Enable/disable gap-based navigation |
| `gap_min_width` | 1.0 | Minimum gap width [m] |
| `gap_goal_distance` | 3.0 | Distance to place intermediate goal [m] |
| `gap_alignment_weight` | 0.85 | Weight for goal alignment vs gap quality |
| `direct_path_threshold` | 3.0 | Min clear distance to use direct path [m] |
| `gap_hysteresis` | 1.0 | Min change in goal to switch gaps [m] |
The terminal cost applies additional weight to the final position and yaw error.
- Dynamics equality constraints for each step.
- Altitude bounds: $z_{min} \le z_k \le z_{max}$ (including the terminal state).
- Velocity and yaw rate bounds.
- Acceleration and yaw acceleration bounds.
- Obstacle half-space constraints with non-negative slack.
The MPC is solved with CasADi + IPOPT. Decision variables are bounded explicitly (state, input, and slack), and a warm start is initialized from the reference trajectory.
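As a rough illustration of that pattern, here is a self-contained CasADi `Opti` example with explicit bounds and a warm start from the reference; it is a toy 1-D double integrator, not the repository's solver, which may use the lower-level `nlpsol` interface instead:

```python
import casadi as ca
import numpy as np

N, dt = 20, 0.1
opti = ca.Opti()
X = opti.variable(2, N + 1)                          # toy 1-D double integrator: [pos; vel]
U = opti.variable(1, N)
x_ref = opti.parameter(2, N + 1)

opti.minimize(ca.sumsqr(X - x_ref) + 0.05 * ca.sumsqr(U))
for k in range(N):
    opti.subject_to(X[0, k + 1] == X[0, k] + dt * X[1, k])   # dynamics equality constraints
    opti.subject_to(X[1, k + 1] == X[1, k] + dt * U[0, k])
opti.subject_to(opti.bounded(-3.0, X[1, :], 3.0))            # explicit velocity bounds
opti.subject_to(opti.bounded(-4.0, U, 4.0))                  # explicit input bounds

opti.solver('ipopt', {'print_time': False}, {'print_level': 0})

ref = np.vstack([np.linspace(0, 2, N + 1), np.full(N + 1, 1.0)])
opti.set_value(x_ref, ref)
opti.set_initial(X, ref)                                     # warm start from the reference
sol = opti.solve()
print(sol.value(U)[:5])
```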
- Docker environment set up (see docker/ directory)
- PX4 SITL + Gazebo running
- Micro-XRCE-DDS-Agent running
- ROS2 Humble workspace built
```bash
# Allow GUI forwarding from containers (on host)
xhost +

cd /path/to/Drone-optimal-trajectory/docker/PX4-dev-env/full_docker_setup

# Build and start the docker-compose service
docker-compose build
docker-compose up -d

# On the host (not in the container), open QGC in a new terminal
cd /path/to/Drone-optimal-trajectory/QGC
./QGroundControl-x86_64.AppImage

# Enter the px4 container in a new terminal
docker-compose exec px4-stack bash

# Inside the container, start the 2D lidar drone simulation with walls
make px4_sitl gz_x500_lidar_2d_walls
# (or for simple environment without obstacles)
make px4_sitl gz_x500_lidar_2d

# In another container terminal, run the micro-ROS agent
cd /Micro-XRCE-DDS-Agent/build
MicroXRCEAgent udp4 -p 8888

# In another container terminal, launch RViz2 with the laser scan bridge
cd /workspace/
source /opt/ros/humble/setup.bash
source install/setup.bash
ros2 launch drone laser_bridge.launch.py

# In another container terminal, run MPC v2 (RECOMMENDED)
cd /workspace/
source /opt/ros/humble/setup.bash
source install/setup.bash
ros2 launch drone mpc_obstacle_avoidance_v2.launch.py goal:='[10.0, 10.0, 1.5]'

# (Optional) In another terminal, run the 3D visualizer
ros2 run drone mpc_visualizer
```

If you need to use the legacy MPC v1 system:
```bash
# In a container terminal, run the MPC v1 planner + hover gate + commander
cd /workspace/
source /opt/ros/humble/setup.bash
source install/setup.bash
ros2 launch drone path_planning.launch.py
```

The `path_planning.launch.py` launch file starts:

- `hover_enable_commander` (arms, holds hover, then publishes `/mpc/enable`)
- `mpc_obstacle_avoidance` (waits for enable, then optimizes)
- `mpc_mission_commander` (waits for enable, then tracks the MPC path)
Once you launch the laser_bridge:
- Set the fixed frame to `world`
- Add a `LaserScan` display subscribing to `/scan`
- Add `Path` displays for `/mpc/predicted_path` and `/mpc/reference_path`
- Add `PoseStamped` displays for `/mpc/active_goal` and `/mpc/final_goal`
To generate CSV files from rosbag data, run the following script:

```bash
cd /workspace/scripts
./plot_topics.py --backend csv --out ./plot_data
```

To plot from the CSV data:

```bash
cd /workspace/scripts
./plot_csv.py --dir ./plot_data
```

**Drone too slow?**

- Decrease the `R_acc` parameter (e.g., `R_acc:=0.02`)
- Ensure `velocity_feedforward:=true` is set
- Increase `max_velocity` (e.g., `max_velocity:=4.0`)
**Drone hits obstacles?**

- Increase `safety_radius` (e.g., `safety_radius:=1.5`)
- Decrease `max_velocity` for more reaction time
- Check that the LiDAR is publishing on `/scan`
- Check obstacle count in logs (should be <30 after clustering)
- Increase `max_obstacle_range` if needed
- Check for invalid odometry data
**Drone won't arm or switch to offboard?**

- Ensure QGroundControl is connected
- Check that `auto_arm:=true` is set
- Verify offboard control mode messages are being sent