Sensor fusion

Sensor fusion is the process of combining sensory data, or data derived from disparate sources, such that the resulting information has less uncertainty than would be possible if these sources were used individually.
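The "less uncertainty" property can be illustrated with a minimal sketch: fusing two independent noisy estimates of the same quantity by inverse-variance weighting. The sensor names and numbers below are hypothetical, chosen only to show that the fused variance is smaller than either input variance.

```python
def fuse(mu1, var1, mu2, var2):
    """Inverse-variance weighted fusion of two independent estimates.

    The fused variance satisfies fused_var <= min(var1, var2),
    which is the 'less uncertainty' property of sensor fusion.
    """
    fused_var = 1.0 / (1.0 / var1 + 1.0 / var2)
    fused_mu = fused_var * (mu1 / var1 + mu2 / var2)
    return fused_mu, fused_var

# Hypothetical readings of a distance to an obstacle:
# radar: 10.0 m with variance 4.0; lidar: 10.6 m with variance 1.0
mu, var = fuse(10.0, 4.0, 10.6, 1.0)
print(mu, var)  # fused estimate leans toward the more certain lidar reading
```

Note that the fused estimate is pulled toward the lower-variance sensor, and the fused variance (0.8) is below the best individual variance (1.0).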

In the self-driving domain, sensor fusion combines LiDAR, radar, cameras, and other sensors in autonomous driving applications. When smartly bundled and configured, these sensors give autonomous vehicles an all-encompassing 360-degree view of the environment.

Extra reading:

Sensor Fusion: Technical challenges for Level 4-5 self-driving vehicles

Papers:

An Efficient Multi-sensor Fusion Approach for Object Detection in Maritime Environments

Multisensor Data Fusion: A Review of the State-of-the-art

