
Corner feature extraction of 2D lidar data

Journal of Traffic and Transportation Engineering [ISSN: 1671-1637 / CN: 61-1369/U]

Issue:
No. 3, 2018
Page:
228-238
Research Field:
Traffic Information Engineering and Control
Publishing date:

Info

Title:
Corner feature extraction of 2D lidar data
Author(s):
KANG Jun-min1,2, ZHAO Xiang-mo2, YANG Di3
1. School of Economy and Finance, Xi’an International Studies University, Xi’an 710128, Shaanxi, China; 2. School of Information Engineering, Chang’an University, Xi’an 710064, Shaanxi, China; 3. School of Business, Xi’an International Studies University, Xi’an 710128, Shaanxi, China
Keywords:
information engineering; unmanned vehicle; lidar; simultaneous localization and mapping; bivariate normal probability density; feature extraction
CLC Number:
U491.5
DOI:
-
Abstract:
To enhance the robustness of corner feature recognition in the driving environment of an unmanned vehicle and to increase recognition speed, a corner feature extraction method based on the relative differences between the bivariate normal probability density mapping values of observation points was proposed. The observation data set was mapped into the bivariate normal probability density space, and the mapping value of each observation point was obtained. The mapping results were normalized to eliminate the numerical differences caused by the covariances. The positions of peaks and troughs were then located on the curve of mapped values: the observation point corresponding to a peak is closest to the mean point, and the observation point corresponding to a trough is closest to an inflection point. The relative heights of the peaks and troughs were used to judge whether the observation data set met the edge-length requirement of a corner feature. The coordinates of the original observation data points corresponding to the troughs were taken as corner features to construct the environment feature map. Test results show that the extraction method can process sparse observation data with more than 63 observation points and an angular resolution of the observation points greater than 1°, so it can stably identify large corner points in large-scale outdoor environments and in indoor environments. When there are fewer than 180 observation data points, the maximum processing time is less than 5 ms and the average processing time is less than 1.9 ms, so the extraction method has good real-time performance, which helps reduce the time required to construct the environment feature map. Because the method extracts corner features from the bivariate normal probability density of the observation data, it is insensitive to observation error and to the scale and shape of the corner feature, and it can effectively improve the robustness of corner feature recognition. 14 figs, 25 refs.
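
The abstract outlines a concrete pipeline, so a minimal sketch may help fix the idea. It is written against the abstract only, not the paper's actual implementation: the function name extract_corners, the rel_height_thresh parameter, and the use of scipy are illustrative assumptions. The quantity being mapped is the standard bivariate normal density f(x) = exp(-(x - μ)ᵀ Σ⁻¹ (x - μ)/2) / (2π|Σ|^(1/2)), fitted to the points of one scan segment.

```python
import numpy as np
from scipy.signal import find_peaks
from scipy.stats import multivariate_normal

def extract_corners(points, rel_height_thresh=0.5):
    """Sketch of the density-mapping corner extraction the abstract describes.

    points: (N, 2) array of 2D lidar points in scan order.
    rel_height_thresh: hypothetical stand-in for the paper's
        peak/trough relative-height (edge-length) test.
    """
    points = np.asarray(points, dtype=float)

    # Step 1: fit a bivariate normal to the segment (mean and covariance).
    mean = points.mean(axis=0)
    cov = np.cov(points.T)

    # Step 2: map each observation point to its density value.
    density = multivariate_normal(mean=mean, cov=cov).pdf(points)

    # Step 3: normalize to [0, 1] to remove covariance-induced scale
    # differences (small epsilon guards against a degenerate segment).
    d = (density - density.min()) / (density.max() - density.min() + 1e-12)

    # Step 4: locate peaks and troughs along the scan-order curve.
    # Peaks lie nearest the mean point; troughs lie nearest the
    # inflection points of the density surface.
    peaks, _ = find_peaks(d)
    troughs, _ = find_peaks(-d)

    # Step 5: keep a trough only if some peak rises far enough above it
    # (the edge-length check), then return the original observation
    # points at the kept troughs as corner candidates.
    corners = [points[t] for t in troughs
               if any(d[p] - d[t] >= rel_height_thresh for p in peaks)]
    return np.asarray(corners)
```

In practice the scan would first be split into segments at range discontinuities before fitting, and the paper's actual relative-height test for the edge-length requirement may differ from the single threshold used in this sketch.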

References:


[1] HESS W, KOHLER D, RAPP H, et al. Real-time loop closure in 2D LIDAR SLAM[C]∥IEEE. Proceedings of the 2016 IEEE International Conference on Robotics and Automation. New York: IEEE, 2016: 1271-1278.
[2] HIMSTEDT M, FROST J, HELLBACH S, et al. Large scale place recognition in 2D LIDAR scans using geometrical landmark relations[C]∥IEEE. 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems. New York: IEEE, 2014: 5030-5035.
[3] TIPALDI G D, BRAUN M, ARRAS K O. FLIRT: interest regions for 2D range data with applications to robot navigation[C]∥KHATIB O, KUMAR V, SUKHATME G. Experimental Robotics. Berlin: Springer, 2014: 695-710.
[4] WANG Yun-feng, WENG Xiu-ling, WU Wei, et al. Loop closure detection algorithm based on greedy strategy for visual SLAM[J]. Journal of Tianjin University: Science and Technology, 2017, 50(12): 1262-1270. (in Chinese)
[5] ZHU A Z, THAKUR D, ÖZASLAN T, et al. The multivehicle stereo event camera dataset: an event camera dataset for 3D perception[J]. Robotics and Automation Letters, 2018, 3(3): 2032-2039.
[6] TAKETOMI T, UCHIYAMA H, IKEDA S. Visual SLAM algorithms: a survey from 2010 to 2016[J]. IPSJ Transactions on Computer Vision and Applications, 2017, 9(1): 16-26.
[7] RUECKAUER B, DELBRUCK T. Evaluation of event-based algorithms for optical flow with ground-truth from inertial measurement sensor[J]. Frontiers in Neuroscience, 2016, 10(137): 1-17.
[8] GAO Xiang, ZHANG Tao. Unsupervised learning to detect loops using deep neural networks for visual SLAM system[J]. Autonomous Robots, 2017, 41: 1-18.
[9] TORRES-GONZÁLEZ A, DIOS J R M, OLLERO A. Range-only SLAM for robot-sensor network cooperation[J]. Autonomous Robots, 2018, 42: 649-663.
[10] LENAC K, KITANOV A, CUPEC R, et al. Fast planar surface 3D SLAM using LIDAR[J]. Robotics and Autonomous Systems, 2017, 92: 197-220.
[11] ANDERT F, AMMANN N, KRAUSE S, et al. Optical-aided aircraft navigation using decoupled visual SLAM with range sensor augmentation[J]. Journal of Intelligent and Robotic Systems, 2017, 88(2-4): 547-565.
[12] KANG Jun-min, ZHAO Xiang-mo, XU Zhi-gang. Classification method of running environment features for unmanned vehicle[J]. Journal of Traffic and Transportation Engineering, 2016, 16(6): 140-148. (in Chinese)
[13] ADAMS M, ZHANG Sen, XIE Li-hua. Particle filter based outdoor robot localization using natural features extracted from laser scanners[C]∥IEEE. Proceedings of the 2004 IEEE International Conference on Robotics and Automation. New York: IEEE, 2004: 1493-1498.
[14] VANDORPE J, VAN BRUSSEL H, XU H. Exact dynamic map building for a mobile robot using geometrical primitives produced by a 2D range finder[C]∥IEEE. Proceedings of the 1996 IEEE International Conference on Robotics and Automation. New York: IEEE, 1996: 901-908.
[15] TAYLOR R M, PROBERT P J. Range finding and feature extraction by segmentation of images for mobile robot navigation[C]∥IEEE. Proceedings of the 1996 IEEE International Conference on Robotics and Automation. New York: IEEE, 1996: 95-100.
[16] ADAMS M D, KERSTENS A. Tracking naturally occurring indoor features in 2D and 3D with LIDAR range/amplitude data[J]. International Journal of Robotics Research, 1998, 17(9): 907-923.
[17] GUIVANT J, NEBOT E, BAIKER S. Autonomous navigation and map building using laser range sensors in outdoor applications[J]. Journal of Robotic Systems, 2000, 17(10): 565-583.
[18] GUIVANT J, MASSON F, NEBOT E. Simultaneous localization and map building using natural features and absolute information[J]. Robotics and Autonomous Systems, 2002, 40(2/3): 79-90.
[19] MAN Zeng-guang, YE Wen-hua, XIAO Hai-ning, et al. Method for corner feature extraction from laser scan data[J]. Journal of Nanjing University of Aeronautics and Astronautics, 2012, 44(3): 379-383. (in Chinese)
[20] FABRESSE F R, CABALLERO F, MAZA I, et al. Undelayed 3D RO-SLAM based on Gaussian-mixture and reduced spherical parametrization[C]∥IEEE. 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems. New York: IEEE, 2013: 1555-1561.
[21] EROL B A, VAISHNAV S, LABRADO J D, et al. Cloud-based control and vSLAM through cooperative mapping and localization[C]∥IEEE. 2016 World Automation Congress. New York: IEEE, 2016: 1-6.
[22] ZHANG Sen, XIE Li-hua, ADAMS M. An efficient data association approach to simultaneous localization and map building[J]. International Journal of Robotics Research, 2005, 24(1): 49-60.
[23] KANG Jun-min, ZHAO Xiang-mo, XU Zhi-gang. Loop closure detection of unmanned vehicle trajectory based on geometric relationship between features[J]. China Journal of Highway and Transport, 2017, 30(1): 121-128, 135. (in Chinese)
[24] LI Yang-ming, SONG Quan-jun, LIU Hai, et al. General purpose LIDAR feature extractor for mobile robot navigation[J]. Journal of Huazhong University of Science and Technology: Natural Science Edition, 2013, 41(S1): 280-283. (in Chinese)
[25] LI Yang-ming, OLSON E B. Extracting general-purpose features from LIDAR data[C]∥IEEE. Proceedings of the 2010 IEEE International Conference on Robotics and Automation. New York: IEEE, 2010: 1388-1393.

Last Update: 2018-07-14