Ego-motion estimation and localization with millimeter-wave scanning radar

Bibliographic Details
Main Author: Cen, S
Other Authors: Newman, P
Format: Thesis
Language: English
Published: 2020
Online Access: https://ora.ox.ac.uk/objects/uuid:211d137e-50c1-477b-b09a-abd6512162ad
Description
Summary: In contrast to cameras, lidars, GPS, and proprioceptive sensors, radars are affordable and efficient systems that operate well under variable weather and lighting conditions, require no external infrastructure, and detect long-range objects. For these reasons, radar is a reliable and versatile sensor well suited for ego-motion estimation and localization. However, various radar artifacts and characteristics, including speckle noise, receiver saturation, multipath reflections, low spatial resolution, and slow update speeds, make it an especially challenging sensor with which to work. In this thesis, we present algorithms for radar-only odometry and localization for mobile autonomous systems. We focus primarily on odometry and show that our approach can be adapted for localization. Using a frequency-modulated continuous-wave (FMCW) scanning radar, we first extract keypoints from radar scans in a manner that avoids redundant returns, accounts for high levels of noise, and reduces reliance on the amplitude of received power as an indicator of occupancy. To estimate relative motion from the keypoint sets, we then perform scan matching by greedily adding point correspondences based on unary descriptors and pairwise compatibility scores in a graph matching framework. Our pipeline achieves reliable and accurate ego-motion estimation under challenging conditions and across a range of settings. The pipeline of our most recent paper requires only one input parameter and uses no outlier detection, no model-reliant motion filters, and no additional sensors. Our system operates well even in the absence of a motion prior and is highly robust to radar artifacts. We demonstrate its ability to match the performance of visual odometry and GPS/INS, remaining precise and robust even under difficult conditions that cause the latter two systems to fail. Additionally, it adapts effortlessly to diverse environments, from urban UK to off-road Iceland, achieving an average scan matching accuracy of 5.20 cm and 0.0929 degrees when using GPS ...
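
The summary describes scan matching as greedily adding point correspondences from unary descriptor scores and pairwise compatibility scores. The Python sketch below is a rough illustration only of what such a greedy, pairwise-consistent association could look like; the function name, the cosine-similarity unary score, the 2D keypoint representation, and the dist_tol tolerance are assumptions made for illustration and are not taken from the thesis.

    import numpy as np

    def greedy_scan_match(pts_a, pts_b, desc_a, desc_b, dist_tol=0.5):
        """Toy greedy data association between two radar keypoint sets.

        pts_*    : (N, 2) keypoint positions in each scan's frame
        desc_*   : (N, D) unary descriptors for each keypoint
        dist_tol : pairwise-distance agreement tolerance in metres (illustrative value)

        Candidate matches are ranked by descriptor similarity (unary score) and
        greedily accepted only if the inter-point distances they imply agree,
        within dist_tol, with every match accepted so far; a rigid motion
        preserves pairwise distances, so consistent matches are mutually compatible.
        """
        # Unary scores: cosine similarity between descriptors of every (a, b) pair.
        na = desc_a / (np.linalg.norm(desc_a, axis=1, keepdims=True) + 1e-9)
        nb = desc_b / (np.linalg.norm(desc_b, axis=1, keepdims=True) + 1e-9)
        unary = na @ nb.T                                      # shape (Na, Nb)

        # Visit candidate pairs from best to worst unary score.
        order = np.dstack(np.unravel_index(np.argsort(-unary, axis=None), unary.shape))[0]

        matches, used_a, used_b = [], set(), set()
        for i, j in order:
            if i in used_a or j in used_b:
                continue
            # Pairwise compatibility: inter-point distances must agree with all
            # previously accepted correspondences (rigid-body consistency).
            consistent = all(
                abs(np.linalg.norm(pts_a[i] - pts_a[pi]) -
                    np.linalg.norm(pts_b[j] - pts_b[pj])) < dist_tol
                for pi, pj in matches
            )
            if consistent:
                matches.append((i, j))
                used_a.add(i)
                used_b.add(j)
        return matches  # list of (index in scan A, index in scan B)

The accepted correspondences could then feed a standard rigid-transform estimate (for example, a least-squares fit between the matched point sets) to recover the relative motion between consecutive scans.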