
SiLK-SLAM: accurate, robust and versatile visual SLAM with simple learned keypoints

Jianjun Yao (College of Mechanical and Electrical Engineering, Harbin Engineering University, Harbin, China)
Yingzhao Li (College of Mechanical and Electrical Engineering, Harbin Engineering University, Harbin, China)

Industrial Robot

ISSN: 0143-991x

Article publication date: 11 March 2024


Abstract

Purpose

Handcrafted keypoints exhibit weak repeatability, leading to tracking failures in visual simultaneous localization and mapping (SLAM) systems under challenging conditions such as illumination change, rapid rotation and large viewpoint variation. Learning-based keypoints offer higher repeatability but incur considerable computational cost. This paper proposes a keypoint extraction algorithm that strikes a balance between precision and efficiency, aiming to achieve accurate, robust and versatile visual localization in highly complex scenes.

Design/methodology/approach

SiLK-SLAM first refines the state-of-the-art learning-based extractor, SiLK, and introduces a postprocessing algorithm that homogenizes the keypoint distribution and improves operational efficiency. It further devises a reliable relocalization strategy, PCPnP, based on progressive and consistent sampling, thereby bolstering robustness.
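The paper does not detail the homogenization step here, but a common way to spread keypoints evenly across the image is grid-based selection: partition the image into cells and keep only the strongest detections in each. The sketch below is a hypothetical illustration of that idea, not the authors' exact postprocessing algorithm; the function name, parameters and scoring convention are assumptions.

```python
import numpy as np

def homogenize_keypoints(points, scores, img_shape, grid=8, per_cell=1):
    """Illustrative grid-based keypoint homogenization (hypothetical
    sketch): keep the `per_cell` highest-scoring keypoints in each
    grid cell so detections are spread evenly over the image.

    points: (N, 2) array of (x, y) pixel coordinates
    scores: (N,) array of detector confidences
    img_shape: (height, width) of the image
    Returns sorted indices of the keypoints to keep.
    """
    h, w = img_shape
    cell_h, cell_w = h / grid, w / grid
    # Map each keypoint to a flat grid-cell index (row-major).
    rows = np.minimum(points[:, 1] // cell_h, grid - 1).astype(int)
    cols = np.minimum(points[:, 0] // cell_w, grid - 1).astype(int)
    cells = rows * grid + cols
    keep = []
    for c in np.unique(cells):
        idx = np.flatnonzero(cells == c)
        # Retain only the top-scoring keypoints within this cell.
        top = idx[np.argsort(scores[idx])[::-1][:per_cell]]
        keep.extend(top.tolist())
    return np.array(sorted(keep))
```

With `per_cell=1`, a cluster of detections in one textured region collapses to its single strongest keypoint, which both caps the keypoint count (helping efficiency) and avoids the degenerate pose-estimation geometry that spatially clustered features cause.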

Findings

Empirical evaluations on the TUM, KITTI and EuRoC data sets substantiate SiLK-SLAM's superior localization accuracy compared to ORB-SLAM3 and other methods: it improves localization accuracy over ORB-SLAM3 by up to 70.99%, 87.20% and 85.27% on the three data sets, respectively. Relocalization experiments demonstrate SiLK-SLAM's ability to produce precise and repeatable keypoints, showcasing its robustness in challenging environments.

Originality/value

SiLK-SLAM achieves high localization accuracy and resilience in challenging scenarios, which is important for enhancing the autonomy of robots navigating complex environments. Code is available at https://github.com/Pepper-FlavoredChewingGum/SiLK-SLAM.

Acknowledgements

Funding: The authors did not receive support from any organization for the submitted work.

Competing interests: The authors have no relevant financial or nonfinancial interests to disclose.

Citation

Yao, J. and Li, Y. (2024), "SiLK-SLAM: accurate, robust and versatile visual SLAM with simple learned keypoints", Industrial Robot, Vol. ahead-of-print No. ahead-of-print. https://doi.org/10.1108/IR-11-2023-0309

Publisher

Emerald Publishing Limited

Copyright © 2024, Emerald Publishing Limited
