
LiDAR/IMU Tightly Coupled Real-time Localization Method

Li Shuai-Xin, Li Guang-Yun, Wang Li, Yang Xiao-Tian

Citation: Li Shuai-Xin, Li Guang-Yun, Wang Li, Yang Xiao-Tian. LiDAR/IMU tightly coupled real-time localization method. Acta Automatica Sinica, 2021, 47(6): 1377−1389. doi: 10.16383/j.aas.c190424

doi: 10.16383/j.aas.c190424

LiDAR/IMU Tightly Coupled Real-time Localization Method

Funds: Supported by the State Key Laboratory of Geo-Information Engineering (SKLGIE2018-M-3-1), the National Key Research and Development Program of China (2017YFF0206001), and the National Natural Science Foundation of China (41501491)
More Information
    Author Bio:

    LI Shuai-Xin Ph.D. candidate at the College of Geospatial Information, PLA Information Engineering University. He received his bachelor degree in surveying and mapping engineering from Central South University in 2015, and his master degree in control science and engineering from PLA Information Engineering University in 2018. His research interest covers multi-sensor fused SLAM and mobile mapping. E-mail: lsx_navigation@sina.com

    LI Guang-Yun Professor at the College of Geospatial Information, PLA Information Engineering University. He received his master degree in surveying and mapping science from the PLA Institute of Surveying and Mapping in 1987. His research interest covers precise engineering and industrial measurement, navigation applications, and navigation, positioning and location-based services. Corresponding author of this paper. E-mail: guangyun_li@163.com

    WANG Li Lecturer at the College of Geospatial Information, PLA Information Engineering University. He received his Ph.D. degree in surveying and mapping science from PLA Information Engineering University in 2014. His research interest covers point cloud data processing, mobile mapping, and 3D reconstruction. E-mail: wangli_chxy@163.com

    YANG Xiao-Tian He received his master degree in control science and engineering from PLA Information Engineering University in 2020. His main research interest is mobile mapping technology. E-mail: esis@foxmail.com

  • Abstract: Aiming at real-time autonomous localization for small mobile intelligent systems, and at the problems of large accumulated error in LiDAR odometry, unstable rotation estimation, and insufficient use of the available observations, this paper proposes Inertial-LOAM, a tightly coupled LiDAR/IMU real-time localization method. In the data preprocessing stage, IMU measurements are pre-integrated, which reduces the dimension of the optimization variables and provides a reference for point cloud distortion correction. A fast point cloud segmentation method based on angle images is proposed to select structurally salient points as features, reducing the size of the point cloud and keeping the LiDAR odometry efficient. To address the low efficiency of map-matching point search and the incompleteness of discrete point cloud maps in the map construction stage, a sensor-centric multi-scale map model is proposed: a ring container keeps the number of map points constant, and a multi-scale grid ensures that points are evenly distributed in the map model. In the data fusion stage, a tightly coupled LiDAR/IMU optimization method is proposed in which the pre-integration factors, registration factors, and loop closure factors formed from the IMU and LiDAR are inserted into a global factor graph, and a Bayes-tree-based factor graph optimization algorithm incrementally estimates the variable nodes to fuse the data. Finally, the performance of Inertial-LOAM is evaluated on real-world data and compared with LeGO-LOAM, LOAM, and Cartographer. The results show that Inertial-LOAM greatly reduces the error accumulation caused by successive registration errors without noticeably increasing the computational load and runs in real time; in indoor environments with salient structural features, its localization accuracy reaches the centimeter level, on par with the compared methods; in open outdoor environments, it reaches the decimeter level, whereas all of the compared methods drift to varying degrees.
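
No source code accompanies this page, so the following is only a minimal, self-contained sketch of the fusion step described in the abstract: IMU measurements between two LiDAR keyframes are pre-integrated and combined with a scan-registration constraint in a factor graph that is solved incrementally with iSAM2, the Bayes-tree algorithm of [22] as implemented in the GTSAM library. Every noise value, the synthetic IMU samples, and the registration result T_reg below are assumptions made for illustration, not values from the paper.

```python
# Minimal illustrative sketch (not the authors' implementation): fuse an IMU
# pre-integration factor and a LiDAR registration factor in a factor graph and
# solve it incrementally with iSAM2 (GTSAM's Bayes-tree optimizer, cf. [22]).
# All noise values, IMU samples and the registration result below are made up.
import numpy as np
import gtsam
from gtsam.symbol_shorthand import B, V, X   # B: IMU bias, V: velocity, X: pose

# IMU pre-integration parameters (gravity along -Z); covariances are placeholders.
pim_params = gtsam.PreintegrationParams.MakeSharedU(9.81)
pim_params.setAccelerometerCovariance(np.eye(3) * 1e-3)
pim_params.setGyroscopeCovariance(np.eye(3) * 1e-4)
pim_params.setIntegrationCovariance(np.eye(3) * 1e-6)
pim = gtsam.PreintegratedImuMeasurements(pim_params, gtsam.imuBias.ConstantBias())

isam = gtsam.ISAM2()
graph = gtsam.NonlinearFactorGraph()
init = gtsam.Values()

# Priors on the first keyframe: pose, velocity and IMU bias.
graph.add(gtsam.PriorFactorPose3(X(0), gtsam.Pose3(),
                                 gtsam.noiseModel.Diagonal.Sigmas(np.full(6, 1e-2))))
graph.add(gtsam.PriorFactorVector(V(0), np.zeros(3),
                                  gtsam.noiseModel.Isotropic.Sigma(3, 1e-2)))
graph.add(gtsam.PriorFactorConstantBias(B(0), gtsam.imuBias.ConstantBias(),
                                        gtsam.noiseModel.Isotropic.Sigma(6, 1e-3)))
init.insert(X(0), gtsam.Pose3())
init.insert(V(0), np.zeros(3))
init.insert(B(0), gtsam.imuBias.ConstantBias())

# Between two LiDAR keyframes: pre-integrate (synthetic) 100 Hz IMU samples.
for _ in range(10):
    acc = np.array([0.1, 0.0, 9.81])    # fake accelerometer reading (m/s^2)
    gyro = np.array([0.0, 0.0, 0.01])   # fake gyroscope reading (rad/s)
    pim.integrateMeasurement(acc, gyro, 0.01)

# Pre-integration factor (IMU) and registration factor (fake scan-to-map result).
graph.add(gtsam.ImuFactor(X(0), V(0), X(1), V(1), B(0), pim))
T_reg = gtsam.Pose3(gtsam.Rot3(), np.array([0.005, 0.0, 0.0]))
reg_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.02, 0.02, 0.02, 0.05, 0.05, 0.05]))
graph.add(gtsam.BetweenFactorPose3(X(0), X(1), T_reg, reg_noise))
# A detected loop closure between keyframes k and 1 would be added the same way:
# graph.add(gtsam.BetweenFactorPose3(X(k), X(1), T_loop, loop_noise))

# Initial guess for the new keyframe, then one incremental Bayes-tree update.
init.insert(X(1), T_reg)
init.insert(V(1), np.zeros(3))
isam.update(graph, init)
print(isam.calculateEstimate().atPose3(X(1)))
```

Loop-closure constraints enter the graph in the same way as the registration factor (commented line above); each new keyframe then triggers one incremental update instead of re-solving the whole graph.
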
  • Fig. 1  The overview of the system

    Fig. 2  Example of point cloud segmentation

    Fig. 3  Frequencies of IMU and LiDAR

    Fig. 4  Demonstration for the local map

    Fig. 5  Structure of the factor graph

    Fig. 6  Data collection platform

    Fig. 7  Comparison of time cost of two systems

    Fig. 8  Performance of loop optimization

    Fig. 9  Comparison of pose estimation of IL/LL/L/C in the outdoor environment

    Fig. 10  Trajectory and mapping result of Inertial-LOAM

    Table 1  Error accumulation results

    | Scene | Method | Roll (°) | Pitch (°) | Heading (°) | Attitude deviation (°) | X (m) | Y (m) | Z (m) | Position deviation (m) |
    | Dataset 2# [11] | IMU | 0.748 | 1.018 | 0.598 | 1.398 | 35.095 | 84.652 | −665.782 | 672.059 |
    | | Cartographer | 0.113 | −0.709 | 0.989 | 1.222 | 0.405 | 1.317 | 0.670 | 1.532 |
    | | LOAM | 0.016 | 0.141 | 0.925 | 0.936 | 0.316 | 0.349 | 0.025 | 0.471 |
    | | LeGO-LOAM | 0.061 | 0.081 | 0.916 | 0.921 | 0.068 | 0.338 | 0.115 | 0.364 |
    | | Inertial-LOAM | 0.013 | 0.026 | 0.917 | 0.918 | 0.061 | 0.258 | 0.023 | 0.266 |
    | Indoor environment | Cartographer | 0.003 | −0.001 | 0.017 | 0.017 | 0.023 | 0.037 | 0.028 | 0.052 |
    | | LOAM | 0.001 | 0.004 | 0.068 | 0.068 | 0.032 | 0.083 | 0.032 | 0.095 |
    | | LeGO-LOAM | −0.006 | −0.002 | −0.021 | 0.022 | 0.016 | 0.047 | −0.032 | 0.059 |
    | | Inertial-LOAM | −0.008 | 0.001 | −0.020 | 0.021 | 0.021 | 0.043 | 0.027 | 0.055 |
    | Outdoor environment | Cartographer | 0.075 | −0.024 | 0.081 | 0.113 | 1.747 | 2.592 | −0.449 | 3.158 |
    | | LOAM | −0.031 | 0.006 | 0.096 | 0.101 | 0.0467 | 2.368 | −0.065 | 2.353 |
    | | LeGO-LOAM | −0.024 | −0.543 | 0.041 | 0.545 | −19.857 | −14.914 | −0.355 | 24.836 |
    | | Inertial-LOAM | 0.006 | −0.080 | 0.003 | 0.080 | −0.310 | −0.100 | −0.030 | 0.328 |
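
    The deviation columns are not defined on this page; the row values are consistent with the Euclidean norm of the three per-axis errors, e.g. for the outdoor Inertial-LOAM row:

    $\Delta p=\sqrt{\Delta X^{2}+\Delta Y^{2}+\Delta Z^{2}}=\sqrt{0.310^{2}+0.100^{2}+0.030^{2}}\;\mathrm{m}\approx 0.328\;\mathrm{m}$
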
  • [1] Li Shuai-Xin. Research on 3D SLAM Based on Lidar/Camera Coupled System [Master thesis], PLA Strategic Support Force Information Engineering University, China, 2018
    [2] Besl P J, McKay N D. Method for registration of 3-D shapes. In: Proceedings Volume 1611, Sensor Fusion IV: Control Paradigms and Data Structures. Boston, MA, USA: SPIE, 1992. 586−606
    [3] Pomerleau F, Colas F, Siegwart R, Magnenat S. Comparing ICP variants on real-world data sets. Autonomous Robots, 2013, 34(3): 133−148 doi: 10.1007/s10514-013-9327-2
    [4] Surmann H, Nüchter A, Lingemann K, Hertzberg J. 6D SLAM-preliminary report on closing the loop in six dimensions. IFAC Proceedings Volumes, 2004, 37(8): 197−202 doi: 10.1016/S1474-6670(17)31975-4
    [5] Moosmann F, Stiller C. Velodyne SLAM. In: Proceedings of the 2011 IEEE Intelligent Vehicles Symposium (IV). Baden-Baden, Germany: IEEE, 2011. 393−398
    [6] Droeschel D, Schwarz M, Behnke S. Continuous mapping and localization for autonomous navigation in rough terrain using a 3D laser scanner. Robotics and Autonomous Systems, 2017, 88: 104−115 doi: 10.1016/j.robot.2016.10.017
    [7] Droeschel D, Behnke S. Efficient continuous-time SLAM for 3D lidar-based online mapping. In: Proceedings of the 2018 IEEE International Conference on Robotics and Automation (ICRA). Brisbane, QLD, Australia: IEEE, 2018. 5000−5007
    [8] Zhang J, Singh S. LOAM: Lidar odometry and mapping in real-time. In: Proceedings of Robotics: Science and Systems. Berkeley, CA, USA: 2014.
    [9] Zhang J, Singh S. Low-drift and real-time lidar odometry and mapping. Autonomous Robots, 2017, 41(2): 401−416 doi: 10.1007/s10514-016-9548-2
    [10] Geiger A, Lenz P, Urtasun R. Are we ready for autonomous driving? The KITTI vision benchmark suite. In: Proceedings of the 2012 IEEE Conference on Computer Vision and Pattern Recognition. Providence, RI, USA: IEEE, 2012. 3354−3361
    [11] Shan T X, Englot B. LeGO-LOAM: Lightweight and ground-optimized lidar odometry and mapping on variable terrain. In: Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). Madrid, Spain: IEEE, 2018. 4758−4765
    [12] Hess W, Kohler D, Rapp H, Andor D. Real-time loop closure in 2D LIDAR SLAM. In: Proceedings of the 2016 IEEE International Conference on Robotics and Automation (ICRA). Stockholm, Sweden: IEEE, 2016. 1271−1278
    [13] Forster C, Carlone L, Dellaert F, Scaramuzza D. On-manifold preintegration for real-time visual-inertial odometry. IEEE Transactions on Robotics, 2017, 33(1): 1−21 doi: 10.1109/TRO.2016.2597321
    [14] Sarvrood Y B, Hosseinyalamdary S, Gao Y. Visual-LiDAR odometry aided by reduced IMU. ISPRS International Journal of Geo-Information, 2016, 5(1): 3 doi: 10.3390/ijgi5010003
    [15] Thrun S, Burgard W, Fox D. Probabilistic Robotics. Cambridge, MA: MIT Press, 2005.
    [16] Hening S, Ippolito C A, Krishnakumar K S, Stepanyan V, Teodorescu M. 3D LIDAR SLAM integration with GPS/INS for UAVs in urban GPS-degraded environments. In: AIAA Information Systems-AIAA Infotech@Aerospace. Grapevine, Texas: AIAA, 2017.
    [17] Dellaert F, Kaess M. Factor graphs for robot perception. Foundations and Trends® in Robotics, 2017, 6(1-2): 1−139
    [18] Leutenegger S, Lynen S, Bosse M, Siegwart R, Furgale P. Keyframe-based visual-inertial odometry using nonlinear optimization. The International Journal of Robotics Research, 2015, 34(3): 314−334 doi: 10.1177/0278364914554813
    [19] Konolige K, Grisetti G, Kümmerle R, Burgard W, Limketkai B, Vincent R. Efficient sparse pose adjustment for 2D mapping. In: Proceedings of the 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems. Taipei, China: IEEE, 2010. 22−29
    [20] Kaess M, Ranganathan A, Dellaert F. iSAM: Incremental smoothing and mapping. IEEE Transactions on Robotics, 2008, 24(6): 1365−1378 doi: 10.1109/TRO.2008.2006706
    [21] Indelman V, Williams S, Kaess M, Dellaert F. Factor graph based incremental smoothing in inertial navigation systems. In: Proceedings of the 15th International Conference on Information Fusion. Singapore: IEEE, 2012. 2154−2161
    [22] Kaess M, Johannsson H, Roberts R, Ila V, Leonard J J, Dellaert F. iSAM2: Incremental smoothing and mapping using the Bayes tree. The International Journal of Robotics Research, 2012, 31(2): 216−235 doi: 10.1177/0278364911430419
    [23] Qin T, Li P L, Shen S J. VINS-mono: A robust and versatile monocular visual-inertial state estimator. IEEE Transactions on Robotics, 2018, 34(4): 1004−1020 doi: 10.1109/TRO.2018.2853729
    [24] Qin T, Shen S J. Online temporal calibration for monocular visual-inertial systems. In: Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). Madrid, Spain: IEEE, 2018. 3662−3669
    [25] Li S X, Li G Y, Zhou Y L, Wang L, Fu J Y. Real-time dead reckoning and mapping approach based on three-dimensional point cloud. In: Proceedings of the 2018 China Satellite Navigation Conference. Harbin, China: Springer, 2018. 643−662
    [26] Barfoot T D. State Estimation for Robotics. Cambridge: Cambridge University Press, 2017.
    [27] Zhang J, Singh S. Laser-visual-inertial odometry and mapping with high robustness and low drift. Journal of Field Robotics, 2018, 35(8): 1242−1264 doi: 10.1002/rob.21809
    [28] Behley J, Stachniss C. Efficient surfel-based SLAM using 3D laser range data in urban environments. In: Proceedings of Robotics: Science and Systems. Pittsburgh, Pennsylvania, 2018.
Publication history
  • Received:  2019-06-02
  • Accepted:  2019-12-15
  • Published online:  2020-01-16
  • Issue published:  2021-06-10
