
Obstacle detection and tracking method based on millimeter wave radar and LiDAR

NIU Guochen, TIAN Yibo, XIONG Yu

Citation: NIU G C, TIAN Y B, XIONG Y. Obstacle detection and tracking method based on millimeter wave radar and LiDAR[J]. Journal of Beijing University of Aeronautics and Astronautics, 2024, 50(5): 1481-1490 (in Chinese). doi: 10.13700/j.bh.1001-5965.2022.0541


doi: 10.13700/j.bh.1001-5965.2022.0541
Corresponding author: E-mail: niu_guochen@139.com

  • CLC number: TN958; TP242.6

Obstacle detection and tracking method based on millimeter wave radar and LiDAR

Funds: Tianjin Science and Technology Plan (17ZXHLGX00120); Tianjin Research Innovation Project for Postgraduate Students (2021YJSO2S30); The Fundamental Research Funds for the Central Universities (3122022PY17); Civil Aviation University of China Graduate Research Innovation Project (2021YJS025)
  • Abstract:

    In a campus environment, an unmanned ground vehicle that relies on a single millimeter-wave radar or LiDAR sensor for obstacle detection and tracking suffers from a limited detection range, low accuracy, and poor stability. To address this, a multi-obstacle detection and tracking method based on the fusion of millimeter-wave radar and LiDAR is proposed. An improved Euclidean clustering algorithm extracts laser point-cloud targets within the road area, and an information-filtering strategy selects valid targets from the millimeter-wave radar data. Based on detection intersection-over-union (IoU) and reliability analysis, the two kinds of targets are fused adaptively, and a tracking gate combined with the joint probabilistic data association (JPDA) algorithm matches targets across consecutive frames. Obstacle tracking is then realized through interacting multiple motion models and unscented Kalman filtering. Real-vehicle experiments show that the proposed method achieves better accuracy and stability than obstacle detection and tracking with a single millimeter-wave radar or LiDAR.
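The fusion step described above matches LiDAR and radar detections by their detection intersection-over-union. A minimal sketch of IoU-based greedy pairing in the ground plane, assuming axis-aligned boxes given as (x_min, y_min, x_max, y_max); the threshold value and the greedy pairing policy here are illustrative assumptions, not the paper's adaptive-fusion rule:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (x_min, y_min, x_max, y_max)."""
    ix_min = max(box_a[0], box_b[0])
    iy_min = max(box_a[1], box_b[1])
    ix_max = min(box_a[2], box_b[2])
    iy_max = min(box_a[3], box_b[3])
    inter = max(0.0, ix_max - ix_min) * max(0.0, iy_max - iy_min)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def fuse_detections(lidar_boxes, radar_boxes, iou_threshold=0.3):
    """Greedily pair each LiDAR box with its best-overlapping unused radar box."""
    pairs = []
    used = set()
    for i, lb in enumerate(lidar_boxes):
        best_j, best_iou = -1, iou_threshold
        for j, rb in enumerate(radar_boxes):
            if j in used:
                continue
            v = iou(lb, rb)
            if v > best_iou:
                best_j, best_iou = j, v
        if best_j >= 0:
            used.add(best_j)
            pairs.append((i, best_j, best_iou))
    return pairs
```

Unpaired detections from either sensor would then be kept or discarded according to the reliability analysis, which this sketch does not model.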

     

  • Figure 1. Reliability analysis based on distance characteristics

    Figure 2. System framework

    Figure 3. Fitting road boundary lines

    Figure 4. Optimal bounding box fitting

    Figure 5. Comparison before and after improvement

    Figure 6. Effective information retention in the road area

    Figure 7. Definition of coordinate systems

    Figure 8. Sensor sensing region classification

    Figure 9. Tracking state attributes

    Figure 10. The experimental platform

    Figure 11. Obstacle tracking in the category 1 sensing area

    Figure 12. Obstacle tracking in the category 2 sensing area

    Figure 13. Long-range obstacle tracking

    Figure 14. Results without tracking state attribute processing

    Figure 15. Results after tracking state attribute processing

    Figure 16. Effect comparison of pedestrian tracking

    Figure 17. Effect comparison of vehicle tracking

    Figure 18. Running time of the algorithm

    Table 1. Comparison of tracking results of different sensors

    Method                  N_MOTA/%    N_IDS    N_FRAG
    LiDAR                   86.69       14       20
    Millimeter-wave radar   74.03       21       36
    Fusion strategy         93.58       11       19
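N_MOTA in the comparison above is the standard multiple object tracking accuracy, which penalizes misses, false positives, and identity switches against the total number of ground-truth objects. A minimal sketch of the definition (variable names are illustrative, not from the paper):

```python
def mota(gt_total, false_negatives, false_positives, id_switches):
    """Multiple Object Tracking Accuracy: 1 - (FN + FP + IDS) / GT.

    gt_total is the total number of ground-truth object instances
    summed over all frames of the sequence.
    """
    return 1.0 - (false_negatives + false_positives + id_switches) / gt_total
```

N_IDS counts identity switches and N_FRAG counts track fragmentations; neither is folded into a single score here.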
Publication history
  • Received: 2022-06-29
  • Accepted: 2022-07-22
  • Published online: 2022-12-15
  • Issue date: 2024-05-29
