
Fuzzy adaptive extended Kalman filter for robot 3D pose estimation

Hanieh Deilamsalehy (Department of Electrical and Computer Engineering, Michigan Technological University, Houghton, Michigan, USA)
Timothy C. Havens (Department of Electrical and Computer Engineering, Michigan Technological University, Houghton, Michigan, USA)

International Journal of Intelligent Unmanned Systems

ISSN: 2049-6427

Article publication date: 16 April 2018


Abstract

Purpose

Estimating the pose – position and orientation – of a moving object such as a robot is a necessary task for many applications, e.g., robot navigation control, environment mapping, and medical applications such as robotic surgery. The purpose of this paper is to introduce a novel method that fuses the information from several available sensors in order to improve upon the pose estimate of any individual sensor and compute a more accurate pose for the moving platform.

Design/methodology/approach

Pose estimation is usually done by collecting data from several sensors mounted on the object/platform and fusing the acquired information. Assuming that the robot is moving in a three-dimensional (3D) world, its location is completely defined by six degrees of freedom (6DOF): three angles and three position coordinates. Some 3D sensors, such as IMUs and cameras, have been widely used for 3D localization. Other sensors, such as 2D Light Detection And Ranging (LiDAR), can give a very precise estimate within a 2D plane but are typically not employed for 3D estimation because they cannot observe the full 6DOF. However, in some applications the robot moves almost entirely within a plane for a considerable portion of the interval between two sensor readings, e.g., a ground vehicle traveling on a flat surface or a drone flying at a nearly constant altitude to collect visual data. In this paper a novel method using a “fuzzy inference system” is proposed that employs a 2D LiDAR in a 3D localization algorithm in order to improve the pose estimation accuracy; a rough sketch of such a fuzzy weighting is given below.
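As a rough illustration of the idea (not the authors' exact rule base), the following Python sketch shows how a simple Mamdani-style fuzzy inference could map a normalized out-of-plane motion indicator, derived from the trajectory between two readings, to a reliability weight for the 2D LiDAR. The membership functions, the normalized input, and the weight levels are all assumptions made purely for this example.

# Hypothetical sketch: fuzzy mapping from an out-of-plane motion indicator
# (0 = purely planar motion, 1 = strongly 3D motion) to a LiDAR weight.
import numpy as np

def tri(x, a, b, c):
    # Triangular membership function with corners a <= b <= c.
    return np.maximum(np.minimum((x - a) / (b - a + 1e-12),
                                 (c - x) / (c - b + 1e-12)), 0.0)

def lidar_weight(out_of_plane):
    # Rules: nearly planar motion -> high weight, strongly 3D motion -> low weight.
    low  = tri(out_of_plane, 0.0, 0.0, 0.5)
    med  = tri(out_of_plane, 0.0, 0.5, 1.0)
    high = tri(out_of_plane, 0.5, 1.0, 1.0)
    # Weighted-average defuzzification over assumed weight levels 1.0, 0.5, 0.1.
    return (low * 1.0 + med * 0.5 + high * 0.1) / (low + med + high + 1e-12)

print(lidar_weight(0.05))  # ~0.95: motion is nearly planar, trust the 2D LiDAR
print(lidar_weight(0.90))  # ~0.18: motion is clearly 3D, mostly discount it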

Findings

The method estimates the trajectory of the robot and the reliability of the 2D sensor between two readings and, based on this information, sets the weight of the 2D sensor in the final fused pose by adjusting the parameters of an “extended Kalman filter.” Simulation and real-world experiments show that the pose estimation error can be significantly decreased using the proposed method.
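A minimal sketch of how such a weight might enter a standard EKF measurement update is shown below, assuming the weight simply inflates the LiDAR measurement noise covariance R (the paper's actual parameter adjustment may differ); ekf_update, R_nominal, and w are illustrative names, not the authors' API.

# Illustrative EKF measurement update with a fuzzy-scaled noise covariance.
import numpy as np

def ekf_update(x, P, z, h, H, R_nominal, w):
    # Low weight w -> larger R -> smaller Kalman gain for this sensor.
    R = R_nominal / max(w, 1e-3)
    y = z - h(x)                      # innovation
    S = H @ P @ H.T + R               # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# Toy usage: 2-state system, direct observation of the first state.
x, P = np.zeros(2), np.eye(2)
H = np.array([[1.0, 0.0]])
x, P = ekf_update(x, P, np.array([0.3]), lambda s: H @ s, H, np.eye(1) * 0.01, w=0.9)

Scaling R rather than gating the measurement keeps the 2D LiDAR in the fusion even when the motion is only approximately planar, with its influence reduced in proportion to the fuzzy reliability estimate.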

Originality/value

To the best of the authors’ knowledge, this is the first time that a 2D LiDAR has been employed to improve 3D pose estimation in an unknown environment without any prior knowledge of that environment.


Acknowledgements

This material is based upon work supported by the Michigan Department of Transportation (MDOT), Grant No. 2016-0067. Superior, a high performance computing cluster at Michigan Technological University, was used in obtaining some of the results presented in this publication.

Citation

Deilamsalehy, H. and Havens, T.C. (2018), "Fuzzy adaptive extended Kalman filter for robot 3D pose estimation", International Journal of Intelligent Unmanned Systems, Vol. 6 No. 2, pp. 50-68. https://doi.org/10.1108/IJIUS-12-2017-0014

Publisher

Emerald Publishing Limited

Copyright © 2018, Emerald Publishing Limited
