# Multi-Sensor Fusion

## Overview
Multi-sensor fusion is the comprehensive processing of data from different sensors to achieve more accurate and robust state estimation than any single sensor alone. It is a core technology in robot navigation and autonomous driving systems.
## Fusion Levels

```mermaid
graph TD
    A[Sensor Fusion Levels] --> B[Data-Level Fusion]
    A --> C[Feature-Level Fusion]
    A --> D[Decision-Level Fusion]
    B --> B1[Directly fuse raw data<br>e.g., point cloud + image coloring]
    C --> C1[Fuse extracted features<br>e.g., VIO tight coupling]
    D --> D1[Fuse independent results<br>e.g., multi-sensor voting]
```
| Level | Information Utilization | Computation | Flexibility | Example |
|---|---|---|---|---|
| Data-level | Highest | Heaviest | Low | Point cloud-image fusion |
| Feature-level | High | Medium | Medium | VIO, LIO |
| Decision-level | Low | Light | High | Loosely-coupled fusion |
## Extended Kalman Filter (EKF)

### Standard EKF

The EKF is the classic nonlinear state estimator: it handles nonlinear systems by linearizing them with a first-order Taylor expansion around the current estimate.
System model:

\[
\mathbf{x}_k = f(\mathbf{x}_{k-1}, \mathbf{u}_k) + \mathbf{w}_k, \qquad \mathbf{w}_k \sim \mathcal{N}(\mathbf{0}, \mathbf{Q}_k)
\]
\[
\mathbf{z}_k = h(\mathbf{x}_k) + \mathbf{v}_k, \qquad \mathbf{v}_k \sim \mathcal{N}(\mathbf{0}, \mathbf{R}_k)
\]

Prediction step:

\[
\hat{\mathbf{x}}_{k|k-1} = f(\hat{\mathbf{x}}_{k-1}, \mathbf{u}_k), \qquad
\mathbf{P}_{k|k-1} = \mathbf{F}_k \mathbf{P}_{k-1} \mathbf{F}_k^\top + \mathbf{Q}_k
\]

Where \(\mathbf{F}_k = \left.\frac{\partial f}{\partial \mathbf{x}}\right|_{\hat{\mathbf{x}}_{k-1}}\) is the state transition Jacobian.

Update step:

\[
\mathbf{K}_k = \mathbf{P}_{k|k-1} \mathbf{H}_k^\top \left(\mathbf{H}_k \mathbf{P}_{k|k-1} \mathbf{H}_k^\top + \mathbf{R}_k\right)^{-1}
\]
\[
\hat{\mathbf{x}}_k = \hat{\mathbf{x}}_{k|k-1} + \mathbf{K}_k \left(\mathbf{z}_k - h(\hat{\mathbf{x}}_{k|k-1})\right), \qquad
\mathbf{P}_k = \left(\mathbf{I} - \mathbf{K}_k \mathbf{H}_k\right) \mathbf{P}_{k|k-1}
\]

Where \(\mathbf{H}_k = \left.\frac{\partial h}{\partial \mathbf{x}}\right|_{\hat{\mathbf{x}}_{k|k-1}}\) is the observation Jacobian.
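The prediction and update steps can be sketched in a few lines of NumPy. The 1D constant-velocity model at the end is a hypothetical example; because it is linear, the Jacobians `F` and `H` coincide with the model matrices.

```python
import numpy as np

def ekf_predict(x, P, f, F, Q):
    """Propagate the state through f and the covariance through the Jacobian F."""
    return f(x), F @ P @ F.T + Q

def ekf_update(x, P, z, h, H, R):
    """Correct the prediction with measurement z."""
    y = z - h(x)                        # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    return x + K @ y, (np.eye(len(x)) - K @ H) @ P

# Hypothetical 1D constant-velocity model: state = [position, velocity]
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])
H = np.array([[1.0, 0.0]])              # only position is observed
Q, R = 0.01 * np.eye(2), np.array([[0.5]])

x, P = np.zeros(2), np.eye(2)
x, P = ekf_predict(x, P, lambda s: F @ s, F, Q)                  # high-rate prediction
x, P = ekf_update(x, P, np.array([1.0]), lambda s: H @ s, H, R)  # low-rate update
```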
### EKF Fusion Flow

```mermaid
graph TD
    A["Initialize<br>x0, P0"] --> B{"Receive Data"}
    B --> |"IMU (high freq)"| C["Prediction Step<br>x = f(x, u)<br>P = FPF' + Q"]
    B --> |"GPS/LiDAR/Vision (low freq)"| D["Update Step<br>K = PH'(HPH'+R)^-1<br>x = x + K(z - h(x))<br>P = (I-KH)P"]
    C --> B
    D --> B
    C --> E["Output State Estimate"]
    D --> E
```
### Multi-Sensor EKF Update

With multiple sensors, measurements can be processed sequentially: each incoming measurement \(\mathbf{z}^{(i)}\) runs its own update step with its own observation model \(h^{(i)}\) and noise covariance \(\mathbf{R}^{(i)}\), using the posterior of the previous update as the prior. Through the cross-covariances in \(\mathbf{P}\), each update corrects not only the directly observed components but also any correlated states.
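As an illustration (a made-up linear two-sensor setup, not tied to any particular robot), processing two measurements one after the other gives the same posterior as stacking them into a single batch update:

```python
import numpy as np

def kf_update(x, P, z, H, R):
    """One (linear) Kalman update step."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ (z - H @ x), (np.eye(len(x)) - K @ H) @ P

x0, P0 = np.zeros(2), np.eye(2)

# Sensor 1 observes the first state component, sensor 2 the second
H1, R1, z1 = np.array([[1.0, 0.0]]), np.array([[0.2]]), np.array([1.0])
H2, R2, z2 = np.array([[0.0, 1.0]]), np.array([[0.1]]), np.array([2.0])

# Sequential: run sensor 1's update, feed its posterior into sensor 2's update
x_seq, P_seq = kf_update(*kf_update(x0, P0, z1, H1, R1), z2, H2, R2)

# Batch: stack both observations into one update
H = np.vstack([H1, H2])
R = np.block([[R1, np.zeros((1, 1))], [np.zeros((1, 1)), R2]])
x_bat, P_bat = kf_update(x0, P0, np.hstack([z1, z2]), H, R)

assert np.allclose(x_seq, x_bat) and np.allclose(P_seq, P_bat)
```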
## Unscented Kalman Filter (UKF)

### Motivation

The EKF relies on first-order linearization, which loses accuracy for strongly nonlinear systems. The UKF instead propagates a deterministic set of sigma points through the nonlinear functions (the unscented transform), yielding a better approximation of the mean and covariance after the transformation.
### Sigma Point Generation

For an \(n\)-dimensional state, generate \(2n+1\) sigma points:

\[
\mathcal{X}_0 = \hat{\mathbf{x}}, \qquad
\mathcal{X}_i = \hat{\mathbf{x}} + \left(\sqrt{(n+\lambda)\mathbf{P}}\right)_i, \qquad
\mathcal{X}_{i+n} = \hat{\mathbf{x}} - \left(\sqrt{(n+\lambda)\mathbf{P}}\right)_i, \quad i = 1, \dots, n
\]

Where \(\lambda = \alpha^2(n + \kappa) - n\) is the scaling parameter and \(\left(\sqrt{\cdot}\right)_i\) denotes the \(i\)-th column of a matrix square root (e.g., a Cholesky factor).
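A minimal NumPy sketch of the generation step, using the Cholesky factor as the matrix square root; the values `alpha=1.0`, `kappa=1.0` are illustrative choices, not prescribed above:

```python
import numpy as np

def sigma_points(x, P, alpha=1.0, kappa=1.0):
    """Generate the 2n+1 sigma points and their mean weights."""
    n = len(x)
    lam = alpha**2 * (n + kappa) - n
    L = np.linalg.cholesky((n + lam) * P)   # columns are the scaled sqrt of P
    pts = np.vstack([x, x + L.T, x - L.T])  # shape (2n+1, n)
    Wm = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))
    Wm[0] = lam / (n + lam)
    return pts, Wm

x = np.array([1.0, 2.0])
P = np.diag([0.1, 0.2])
pts, Wm = sigma_points(x, P)
# The weighted mean of the sigma points recovers the original mean x
```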
### UKF vs. EKF
| Feature | EKF | UKF |
|---|---|---|
| Linearization | Jacobian (first-order) | Sigma points (second-order) |
| Accuracy | Good for weak nonlinearity | Good for strong nonlinearity |
| Implementation | Requires Jacobian derivation | No Jacobians needed |
| Computation | Lower | Slightly higher (\(2n+1\) function evaluations) |
| Suitable For | Most cases | Strongly nonlinear systems |
## Complementary Filter

### Principle

The complementary filter is the simplest sensor fusion method, particularly suited to IMU attitude estimation. It combines the low-frequency accuracy of the accelerometer with the high-frequency accuracy of the gyroscope:

\[
\theta_k = \alpha \left(\theta_{k-1} + \omega_k \, \Delta t\right) + (1 - \alpha)\, \theta_{\text{accel},k}
\]

Where:
- \(\alpha\) is the filter coefficient (typically 0.95–0.99)
- \(\omega_k\) is the gyroscope angular velocity
- \(\theta_{\text{accel},k}\) is the tilt angle computed from the accelerometer
### Frequency Domain Understanding

```mermaid
graph LR
    A[Gyroscope<br>Angular Velocity] --> B[Integration] --> C[High-Pass Filter<br>alpha]
    D[Accelerometer<br>Tilt Angle] --> E[Low-Pass Filter<br>1-alpha]
    C --> F[Sum]
    E --> F
    F --> G[Fused Attitude Angle]
```
The complementary filter essentially:
- High-pass filters gyroscope data (trusts high-frequency components, removes drift)
- Low-pass filters accelerometer data (trusts low-frequency components, removes vibration noise)
### Implementation

```python
import math

class ComplementaryFilter:
    def __init__(self, alpha=0.98, dt=0.01):
        self.alpha = alpha
        self.dt = dt
        self.roll = 0.0
        self.pitch = 0.0

    def update(self, gyro, accel):
        """
        gyro: (gx, gy, gz) in rad/s
        accel: (ax, ay, az) in m/s^2
        """
        # Tilt angles from the accelerometer (gravity direction)
        accel_roll = math.atan2(accel[1], accel[2])
        accel_pitch = math.atan2(-accel[0],
                                 math.sqrt(accel[1]**2 + accel[2]**2))
        # Complementary filter: integrate gyro, correct with accelerometer
        self.roll = self.alpha * (self.roll + gyro[0] * self.dt) \
            + (1 - self.alpha) * accel_roll
        self.pitch = self.alpha * (self.pitch + gyro[1] * self.dt) \
            + (1 - self.alpha) * accel_pitch
        return self.roll, self.pitch
```
### Complementary Filter vs. EKF
The complementary filter is simple to implement and computationally minimal, suitable for embedded systems. However, it does not provide uncertainty estimates and is difficult to extend to multiple sensors. For complex systems, EKF/UKF is the better choice.
## ROS2 robot_localization
robot_localization is the most commonly used sensor fusion package in ROS2, providing EKF and UKF implementations.
### Basic Architecture

```mermaid
graph TD
    A["IMU<br>/imu/data"] --> E["ekf_node"]
    B["Wheel Odometry<br>/odom"] --> E
    C["GPS<br>/gps/fix"] --> F["navsat_transform_node"]
    F --> E
    D["Visual Odometry<br>/vo"] --> E
    E --> G["/odometry/filtered<br>Fused Pose"]
    E --> H["/tf<br>odom -> base_link"]
```
### Configuration Example

```yaml
# ekf.yaml
ekf_node:
  ros__parameters:
    frequency: 50.0
    sensor_timeout: 0.1
    two_d_mode: false

    map_frame: map
    odom_frame: odom
    base_link_frame: base_link
    world_frame: odom

    # IMU configuration
    imu0: /imu/data
    imu0_config: [false, false, false,   # x, y, z
                  true,  true,  true,    # roll, pitch, yaw
                  false, false, false,   # vx, vy, vz
                  true,  true,  true,    # vroll, vpitch, vyaw
                  true,  true,  true]    # ax, ay, az
    imu0_differential: false
    imu0_remove_gravitational_acceleration: true

    # Wheel odometry configuration
    odom0: /odom
    odom0_config: [true,  true,  false,  # x, y, z
                   false, false, false,  # roll, pitch, yaw
                   false, false, false,  # vx, vy, vz
                   false, false, true,   # vroll, vpitch, vyaw
                   false, false, false]  # ax, ay, az

    # Process noise covariance
    process_noise_covariance: [0.05, 0.0, ...]
```
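The configuration is typically loaded through a small launch file; a minimal sketch (the path to `ekf.yaml` is a placeholder you would adapt to your own package):

```python
# launch/ekf.launch.py -- minimal sketch; adapt the parameter file path
from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        Node(
            package='robot_localization',
            executable='ekf_node',
            name='ekf_node',
            output='screen',
            parameters=['config/ekf.yaml'],  # path to the YAML above (placeholder)
        ),
    ])
```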
### Dual EKF Configuration (with GPS)

```yaml
# Local EKF: odom frame
ekf_local:
  ros__parameters:
    world_frame: odom
    odom0: /wheel_odom
    imu0: /imu/data

# Global EKF: map frame
ekf_global:
  ros__parameters:
    world_frame: map
    odom0: /odometry/gps   # from navsat_transform
    imu0: /imu/data

# GPS coordinate conversion
navsat_transform:
  ros__parameters:
    magnetic_declination_radians: 0.0
    yaw_offset: 0.0
    broadcast_utm_transform: true
```
### Launch

```shell
ros2 launch robot_localization dual_ekf_navsat.launch.py
```
## Common Fusion Configurations

### Indoor Mobile Robot
| Sensor | Information Provided | Fusion Method |
|---|---|---|
| Wheel encoders | Velocity, odometry | EKF prediction |
| IMU | Angular velocity, acceleration | EKF prediction + update |
| 2D LiDAR | Position (SLAM) | EKF update |
### Outdoor Autonomous Driving
| Sensor | Information Provided | Fusion Method |
|---|---|---|
| IMU | High-frequency motion | EKF prediction |
| RTK-GPS | Global position | EKF update |
| 3D LiDAR | Relative pose, obstacles | EKF update |
| Camera | Semantics, lane lines | Feature-level fusion |
| Wheel speed | Velocity | EKF update |
### Drones
| Sensor | Information Provided | Fusion Method |
|---|---|---|
| IMU | High-frequency attitude/acceleration | EKF prediction |
| GPS | Position | EKF update |
| Barometer | Altitude | EKF update |
| Optical flow | Velocity (low altitude) | EKF update |
| Magnetometer | Heading | EKF update |
## Advanced Fusion Methods

### Factor Graph Optimization

Factor graphs model the fusion problem as a probabilistic graphical model and solve for the maximum a posteriori (MAP) estimate via nonlinear least-squares optimization:

\[
\hat{\mathbf{X}} = \arg\min_{\mathbf{X}} \sum_i \left\| h_i(\mathbf{X}_i) - \mathbf{z}_i \right\|^2_{\mathbf{\Sigma}_i}
\]
Advantages: Can leverage all historical information, supports loop closure detection, higher accuracy.
Common tools: GTSAM, g2o, Ceres Solver.
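The idea can be illustrated without any of these libraries. Below is a toy 1D pose graph (a hypothetical example, not the GTSAM API): odometry factors say each of three steps is 1.0 m, a loop-closure factor says the total is 3.3 m, and the weighted least-squares solution spreads the discrepancy according to the factor uncertainties:

```python
import numpy as np

# Toy 1D pose graph: 4 poses along a corridor.
# Odometry measures each step as 1.0 m; a loop closure measures x3 - x0 = 3.3 m.
n = 4
factors, meas, sigmas = [], [], []

factors.append((0, None)); meas.append(0.0); sigmas.append(0.01)  # prior on x0
for i in range(n - 1):                                            # odometry factors
    factors.append((i + 1, i)); meas.append(1.0); sigmas.append(0.1)
factors.append((3, 0)); meas.append(3.3); sigmas.append(0.05)     # loop closure

# Build the whitened linear system: each row encodes x_i - x_j = z
A = np.zeros((len(factors), n))
for r, (i, j) in enumerate(factors):
    A[r, i] = 1.0
    if j is not None:
        A[r, j] = -1.0
W = np.diag(1.0 / np.array(sigmas))       # weight each row by 1/sigma

x, *_ = np.linalg.lstsq(W @ A, W @ np.array(meas), rcond=None)
# The precise loop closure pulls x3 from 3.0 toward 3.3
```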
### Particle Filter
Suitable for multimodal distributions and strongly nonlinear systems. Commonly used for robot global localization (AMCL).
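A minimal sketch of the idea on a hypothetical 1D localization problem (AMCL itself fuses laser scans against a map, which is beyond a few lines):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 1D corridor: particles start spread over the whole space
n_particles = 1000
particles = rng.uniform(0.0, 10.0, n_particles)

true_pos, meas_std = 7.0, 0.3   # ground truth and range-sensor noise

for _ in range(5):
    # Motion update: the robot is stationary; add process noise to particles
    particles += rng.normal(0.0, 0.05, n_particles)
    # Measurement update: weight particles by the Gaussian likelihood of z
    z = true_pos + rng.normal(0.0, meas_std)
    weights = np.exp(-0.5 * ((particles - z) / meas_std) ** 2)
    weights /= weights.sum()
    # Multinomial resampling to combat weight degeneracy
    particles = particles[rng.choice(n_particles, n_particles, p=weights)]

estimate = particles.mean()   # converges near true_pos
```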
### Deep Learning Fusion
Using neural networks to learn sensor fusion strategies:
- PointPainting: Projects image semantics onto point clouds
- BEVFusion: Fuses camera and LiDAR features in BEV space
- TransFusion: Transformer-based multimodal fusion
## Debugging and Troubleshooting
| Symptom | Possible Cause | Investigation |
|---|---|---|
| Pose divergence | Improper process/observation noise settings | Check covariance matrices |
| Pose jumps | Sensor timestamps not synchronized | Check time synchronization |
| Drift during in-place rotation | IMU bias not calibrated | Static gyro bias calibration |
| GPS jumps | Multipath/satellite switching | Increase GPS observation noise |
| Altitude drift | Barometer/GPS altitude unstable | Constrain altitude to 0 (ground robots) |
## References

- Thrun, Burgard, Fox, *Probabilistic Robotics*
- Barfoot, *State Estimation for Robotics*
- ROS2 robot_localization documentation: https://docs.ros.org/en/humble/p/robot_localization/
- GTSAM documentation: https://gtsam.org
- Solà, "Quaternion kinematics for the error-state Kalman filter", 2017