Corner feature extraction of 2D lidar data

KANG Jun-min  ZHAO Xiang-mo  YANG Di

康俊民, 赵祥模, 杨荻. 二维激光雷达数据角点特征的提取[J]. 交通运输工程学报, 2018, 18(3): 228-238. doi: 10.19818/j.cnki.1671-1637.2018.03.023
Citation: KANG Jun-min, ZHAO Xiang-mo, YANG Di. Corner feature extraction of 2D lidar data[J]. Journal of Traffic and Transportation Engineering, 2018, 18(3): 228-238. doi: 10.19818/j.cnki.1671-1637.2018.03.023

doi: 10.19818/j.cnki.1671-1637.2018.03.023
Funding project:

    Program of Introducing Talents of Discipline to Universities (B14043)

Author information:

    KANG Jun-min (1978-), male, born in Mianyang, Sichuan; lecturer at Xi'an International Studies University, PhD in engineering; engaged in data feature research

    ZHAO Xiang-mo (1966-), male, born in Chongqing; professor at Chang'an University, PhD in engineering

  • CLC number: U491.5

  • Abstract: To strengthen the robustness of corner feature recognition in the driving environment of unmanned vehicles and to speed up corner feature recognition, a corner feature extraction method was proposed based on the relative differences between the bivariate normal probability density mappings of observation points. A group of observation data was mapped into the bivariate normal probability density space to obtain a mapping value for each observation point. The mapping results were normalized to eliminate the numerical differences caused by the covariance. The positions of peaks and valleys were then located in the mapping value curve, where the observation point corresponding to a peak is the one closest to the mean point and the observation point corresponding to a valley is the one closest to an inflection point. The relative heights of the peaks and valleys were used to judge whether the group of observation data meets the edge-length requirement of a corner feature, and the coordinates of the original observation point corresponding to a valley were taken as the corner feature to build the environment feature map. Experimental results show that the extraction method can handle sparse observation data with more than 63 observation points and an angular resolution greater than 1°, and it stably recognizes large corners in both large-scale outdoor environments and indoor environments. For observation data with fewer than 180 points, the maximum processing time is below 5 ms and the average processing time is below 1.9 ms, so the method reduces the time needed to build the environment feature map. Because corner features are extracted from the bivariate normal probability density of the observation data, the method is insensitive to observation errors and to the scale and shape of corner features, and it effectively improves the robustness of corner feature recognition.
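
To make the processing chain in the abstract concrete, the sketch below walks through the described steps (density mapping, normalization, peak/valley search, relative-height test) in Python with NumPy. It is a minimal illustration rather than the authors' implementation: the function names, the way the scan is grouped, the noise-free synthetic L-shaped scan, and the 0.3 relative-height threshold are all assumptions made for this example.

```python
import numpy as np

def bivariate_normal_density(points, mean, cov):
    """Evaluate the bivariate normal probability density at each 2D point."""
    diff = points - mean                                      # (N, 2) deviations from the mean
    inv_cov = np.linalg.inv(cov)
    maha_sq = np.einsum('ni,ij,nj->n', diff, inv_cov, diff)   # squared Mahalanobis distances
    norm = 1.0 / (2.0 * np.pi * np.sqrt(np.linalg.det(cov)))
    return norm * np.exp(-0.5 * maha_sq)

def extract_corner(points, min_relative_height=0.3):
    """Sketch of the peak/valley test described in the abstract.

    points: (N, 2) array of consecutive 2D lidar observation points in one group.
    Returns the coordinates of the candidate corner point, or None.
    """
    mean = points.mean(axis=0)
    cov = np.cov(points.T)                                    # sample covariance of the group
    density = bivariate_normal_density(points, mean, cov)

    # Normalize the mapping values so groups with different covariances are comparable.
    density = (density - density.min()) / (density.max() - density.min() + 1e-12)

    # Peak: the observation point with the largest mapping value, i.e. closest to the mean.
    peak = int(np.argmax(density))

    # Valleys: interior local minima of the mapping value curve.
    i = np.arange(1, len(density) - 1)
    valley_mask = (density[i] < density[i - 1]) & (density[i] < density[i + 1])
    valleys = i[valley_mask]
    if valleys.size == 0:
        return None

    valley = int(valleys[np.argmin(density[valleys])])        # deepest valley

    # The peak-valley relative height acts as the edge-length check.
    if density[peak] - density[valley] < min_relative_height:
        return None                                           # edges too short: not a corner
    return points[valley]                                     # original observation point used as corner

if __name__ == '__main__':
    # Idealized noise-free L-shaped scan: two perpendicular walls meeting at (2, 0).
    wall_a = np.column_stack([np.linspace(0.0, 2.0, 40), np.zeros(40)])
    wall_b = np.column_stack([np.full(40, 2.0), np.linspace(0.05, 2.0, 40)])
    scan = np.vstack([wall_a, wall_b])
    print('corner candidate:', extract_corner(scan))
```

On this synthetic scan the mapping value curve has one peak on each wall and a single interior valley at the wall intersection, so the script should report a corner candidate near (2.0, 0.0), mirroring the abstract's use of the valley point as the corner feature.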


  • Figure 1. Bivariate normal probability density mapping of observation data
    Figure 2. Observation result of corners
    Figure 3. Distances and mapping values
    Figure 4. Bivariate normal probability density mappings with different angles
    Figure 5. Bivariate normal probability density mappings with different edge lengths
    Figure 6. Relative heights and Euclidean distances
    Figure 7. Relative height of peak
    Figure 8. Flow of corner extraction
    Figure 9. Result of first corner extraction in outdoor environment
    Figure 10. Result of second corner extraction in outdoor environment
    Figure 11. Peaks' relative heights and feature numbers in outdoor environment
    Figure 12. Distribution of corner features
    Figure 13. Relative heights and corner numbers in IntelCenter dataset
    Figure 14. Number of observation points and processing time

Publication history
  • Received: 2018-01-05
  • Published: 2018-06-25
