
Integrated vision/inertial navigation method of UAVs in indoor environment

WANG Tingting, CAI Zhihao, WANG Yingxun

Citation: WANG Tingting, CAI Zhihao, WANG Yingxun, et al. Integrated vision/inertial navigation method of UAVs in indoor environment[J]. Journal of Beijing University of Aeronautics and Astronautics, 2018, 44(1): 176-186. doi: 10.13700/j.bh.1001-5965.2016.0965 (in Chinese)

doi: 10.13700/j.bh.1001-5965.2016.0965

Funds: Aeronautical Science Foundation of China 20135851043

    Author information:

    WANG Tingting, female, M.S. candidate. Research interests: machine vision, UAV visual navigation

    CAI Zhihao, male, associate professor, M.S. advisor. Research interests: autonomous control and navigation of UAVs, multi-UAV cooperation and training

    WANG Yingxun, male, research fellow, Ph.D. advisor. Research interests: system design, autonomous control and simulation training

    Corresponding author:

    CAI Zhihao, E-mail: czh@buaa.edu.cn

  • CLC number: TP242.6; V249.32


  • Abstract:

    To address autonomous navigation of UAVs in indoor environments without satellite positioning, an integrated navigation method fusing inertial navigation, optical flow and visual odometry is proposed. For velocity estimation, an optical flow method based on ORB features is adopted, which estimates the three-axis linear velocity of the UAV in real time. The method uses sparse feature-based optical flow and improves the pyramidal Lucas-Kanade algorithm, applying forward-backward bidirectional tracking and random sample consensus (RANSAC) to raise the feature-tracking accuracy. For position estimation, a vision/inertial-fused visual odometry is adopted: artificial landmarks serve as the primary reference, and optical flow and inertial data are fused to localize the UAV. The feasibility of the proposed method is verified by comparison with position data from a motion capture system and velocity measurements from the Guidance and PX4Flow navigation modules, as well as by actual flight tests.
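The two accuracy-improving steps named above, forward-backward bidirectional tracking and RANSAC filtering of feature pairs, can be sketched in a few lines of NumPy. This is an illustrative sketch, not the paper's implementation: the function names, the pixel thresholds, and the pure-translation motion model are assumptions made here for brevity (the paper tracks features with an improved pyramidal Lucas-Kanade method, whose tracking step is taken as given below).

```python
import numpy as np

def forward_backward_filter(pts0, pts_back, thresh=1.0):
    # Forward-backward consistency check: a feature tracked from frame k
    # to frame k+1 and back again should return close to its start point.
    # pts0: original positions, pts_back: positions after the round trip.
    err = np.linalg.norm(pts_back - pts0, axis=1)
    return err < thresh  # boolean mask of reliable tracks

def ransac_translation(src, dst, n_iter=200, inlier_thresh=2.0, seed=0):
    # RANSAC over matched point pairs under a 2D translation model.
    # One match fully determines a candidate translation, so the
    # minimal sample size is a single pair.
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(src), dtype=bool)
    for _ in range(n_iter):
        i = rng.integers(len(src))
        t_cand = dst[i] - src[i]                       # candidate model
        resid = np.linalg.norm(dst - (src + t_cand), axis=1)
        inliers = resid < inlier_thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # Refit the translation on all inliers of the best candidate.
    t = (dst[best_inliers] - src[best_inliers]).mean(axis=0)
    return t, best_inliers
```

For example, a match set moved by a common translation with one gross outlier: `ransac_translation` recovers the translation and marks the outlier, and `forward_backward_filter` rejects any track whose round-trip error exceeds the threshold. In the full method the surviving flow vectors, together with height and attitude from the inertial unit, are what feed the velocity estimate.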

     

  • Figure 1.  Schematic of forward-backward tracking

    Figure 2.  Filtering feature pairs with the RANSAC algorithm

    Figure 3.  Change of point imaging caused by camera motion

    Figure 4.  Velocity comparison with changing attitude

    Figure 5.  Algorithm flowchart of icon odometry

    Figure 6.  Image filtering and edge extraction

    Figure 7.  Positioning based on icon odometry

    Figure 8.  Localization test

    Figure 9.  Position estimation and positioning error in 3D directions

    Figure 10.  Three-/two-dimensional motion trajectory

    Figure 11.  Velocity estimation and velocity error in 3D directions

    Figure 12.  UAV platform and system structure

    Figure 13.  Screenshots of flight video

    Figure 14.  Flight velocity test via manual control

    Figure 15.  Position estimation

    Figure 16.  Velocity estimation and aircraft attitude

    Table 1.  Comparison of time consumption of feature extraction among SIFT, SURF and ORB

    Method      Number of features   Time/ms
    SIFT[14]           171            18.89
                       253            18.99
                       234            19.66
    SURF[15]            86            12.44
                       254            16.21
                       187            13.86
    ORB                168             2.82
                       299             4.11
                       251             4.70
  • [1] SHEN S J.Autonomous navigation in complex indoor and outdoor environments with micro aerial vehicles[D].Philadelphia:University of Pennsylvania, 2014.
    [2] WU Q, CAI Z H, WANG Y X.Optical flow and landmark fusion method for UAV indoor navigation[J].Control Theory & Applications, 2015, 32(11):1511-1517(in Chinese). http://www.oalib.com/paper/4746150
    [3] LI P, LAMBERT A.A monocular odometer for a quadrotor using a homography model and inertial cues[C]//IEEE Conference on Robotics and Biomimetics.Piscataway, NJ:IEEE Press, 2015:570-575.
    [4] YE C C.Research on localization and object tracking for the IARC mission 7[D].Hangzhou:Zhejiang University, 2016(in Chinese).
    [5] MUR-ARTAL R, MONTIEL J M M, TARDOS J D.ORB-SLAM:A versatile and accurate monocular SLAM system[J].IEEE Transactions on Robotics, 2015, 31(5):1147-1163. doi: 10.1109/TRO.2015.2463671
    [6] LEUTENEGGER S, FURGALE P, RABAUD V, et al.Keyframe-based visual-inertial SLAM using nonlinear optimization[C]//Robotics:Science and Systems, 2013:789-795.
    [7] CHAO H, GU Y, GROSS J, et al.A comparative study of optical flow and traditional sensors in UAV navigation[C]//American Control Conference(ACC).Piscataway, NJ:IEEE Press, 2013:3858-3863.
    [8] MAMMARELLA M, CAMPA G, FRAVOLINI M L, et al.Comparing optical flow algorithms using 6-dof motion of real-world rigid objects[J].IEEE Transactions on Systems, Man, and Cybernetics, Part C:Applications and Reviews, 2012, 42(6):1752-1762. doi: 10.1109/TSMCC.2012.2218806
    [9] DJI Innovations.PHANTOM 4 user's manual V1.2[EB/OL].(2016-12-23)https://dl.djicdn.com/downloads/phantom_4/cn/Phantom_4_User_Manual_cn_v1.2_160328.pdf.
    [10] Hover Camera 2016[EB/OL].(2016-12-23)http://gethover.com.
    [11] RUBLEE E, RABAUD V, KONOLIGE K, et al.ORB:An efficient alternative to SIFT or SURF[C]//International Conference on Computer Vision.Piscataway, NJ:IEEE Press, 2011:2564-2571. https://www.willowgarage.com/sites/default/files/orb_final.pdf
    [12] ROSTEN E, DRUMMOND T.Machine learning for high-speed corner detection[C]//European Conference on Computer Vision.Berlin:Springer-Verlag, 2006:430-443.
    [13] CALONDER M, LEPETIT V, STRECHA C, et al.BRIEF:Binary robust independent elementary features[C]//European Conference on Computer Vision.Berlin:Springer-Verlag, 2010:778-792.
    [14] LOWE D G.Distinctive image features from scale-invariant keypoints[J].International Journal of Computer Vision, 2004, 60(2):91-110. doi: 10.1023/B:VISI.0000029664.99615.94
    [15] BAY H, TUYTELAARS T, GOOL L V.SURF:Speeded up robust features[J].Computer Vision & Image Understanding, 2006, 110(3):404-417. http://www.vision.ee.ethz.ch/~surf/eccv06.pdf
    [16] CHUM O, MATAS J, KITTLER J.Locally optimized RANSAC[J].Lecture Notes in Computer Science, 2003, 2781:236-243. doi: 10.1007/b12010
    [17] HONEGGER D, MEIER L, TANSKANEN P, et al.An open source and open hardware embedded metric optical flow cmos camera for indoor and outdoor applications[C]//International Conference on Robotics and Automation.Piscataway, NJ:IEEE Press, 2013:1736-1741.
    [18] HARTLEY R I, ZISSERMAN A.Multi-view geometry in computer vision[M].Cambridge:Cambridge University Press, 2004:239-247.
Publication history
  • Received:  2016-12-23
  • Accepted:  2017-02-06
  • Published:  2018-01-20
