Abstract
This paper describes an algorithm for the probabilistic fusion of sensor data from multiple modalities (inertial, kinematic, and LIDAR) to produce a single consistent position estimate for a walking humanoid. Of specific interest is our approach to continuous LIDAR-based localization, which maintains reliable drift-free alignment to a prior map using a Gaussian Particle Filter. This module can be bootstrapped by constructing the map on the fly and performs robustly in a variety of challenging field situations. We also discuss a two-tier estimation hierarchy that preserves registration to this map and to other objects in the robot's vicinity while also contributing to direct low-level control of a Boston Dynamics Atlas robot. Extensive experimental demonstrations illustrate how the approach enables the humanoid to walk over uneven terrain without stopping (for tens of minutes), which would otherwise not be possible. We characterize the performance of the estimator for each sensor modality and discuss the computational requirements.
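As a rough illustration of the general Gaussian Particle Filter idea referenced in the abstract (not the authors' implementation), the sketch below performs a single measurement update for a low-dimensional pose state. It assumes a hypothetical `lidar_log_likelihood(pose, scan, prior_map)` function that scores how well a LIDAR scan matches the prior map from a candidate pose, and uses NumPy.

```python
# Minimal Gaussian Particle Filter (GPF) measurement-update sketch.
# Assumptions (not from the paper): a 3-DoF (x, y, yaw) localization state
# and a user-supplied lidar_log_likelihood(pose, scan, prior_map) function.
import numpy as np

def gpf_update(mean, cov, scan, prior_map, lidar_log_likelihood, n_particles=500):
    """One GPF update: sample from the Gaussian prior, weight particles
    by the LIDAR likelihood, and refit a Gaussian to the weighted set."""
    # 1. Draw particles from the Gaussian prior belief N(mean, cov).
    particles = np.random.multivariate_normal(mean, cov, size=n_particles)

    # 2. Weight each particle by how well the scan matches the prior map.
    log_w = np.array([lidar_log_likelihood(p, scan, prior_map) for p in particles])
    log_w -= log_w.max()          # subtract max for numerical stability
    w = np.exp(log_w)
    w /= w.sum()

    # 3. Refit a Gaussian (weighted mean and covariance) as the posterior.
    post_mean = w @ particles
    diff = particles - post_mean
    post_cov = (diff * w[:, None]).T @ diff

    return post_mean, post_cov
```

In a GPF formulation of this kind, the refitted Gaussian can then be folded back into the main state estimator as a low-dimensional correction, which is what allows the particle set to stay small enough for continuous online use.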
Original language | English
---|---
Title of host publication | Humanoid Robots (Humanoids), 2014 14th IEEE-RAS International Conference on
Publisher | Institute of Electrical and Electronics Engineers (IEEE)
Pages | 112-119
Number of pages | 8
DOIs |
Publication status | Published - 1 Nov 2014
Keywords
- humanoid robots
- legged locomotion
- optical radar
- particle filtering (numerical methods)
- robot kinematics
- sensor fusion
- Boston Dynamics Atlas robot
- Gaussian particle filter
- LIDAR sensing
- continuous LIDAR-based localization
- drift-free alignment
- drift-free humanoid state estimation
- inertial sensing
- kinematic sensing
- sensor data probabilistic fusion
- walking humanoid position estimate
- Foot
- Joints
- Laser radar
- Legged locomotion
- Pelvis
- Robot sensing systems