
An Optical Flow-based Composite Navigation Method Inspired by Insect Vision

PAN Chao, LIU Jian-Guo, LI Jun-Lin

Citation: PAN Chao, LIU Jian-Guo, LI Jun-Lin. An Optical Flow-based Composite Navigation Method Inspired by Insect Vision. ACTA AUTOMATICA SINICA, 2015, 41(6): 1102-1112. doi: 10.16383/j.aas.2015.c120936

doi: 10.16383/j.aas.2015.c120936
Details
    About the authors:

    LIU Jian-Guo  Ph.D., professor at the National Key Laboratory of Science and Technology on Multi-spectral Information Processing, Huazhong University of Science and Technology. His research interests include image and signal processing, pattern recognition, and parallel algorithms and architectures. E-mail: jgliu@ieee.org

    Corresponding author:

    PAN Chao  Ph.D., engineer at Wuhan Digital Engineering Institute. His research interests include image and signal processing, visual navigation, pattern recognition, and parallel algorithms and architectures. E-mail: panchaowuhan@163.com


  • Abstract: Insects can perform navigation tasks using the optical flow (OF) perceived by their visual systems. Inspired by insect visual navigation, this paper proposes a biologically inspired optical-flow composite navigation method, consisting of two parts, optical-flow navigation and optical-flow aided navigation, to achieve efficient and accurate visual localization. In this method, optical-flow navigation uses an insect-vision-inspired optical-flow technique to measure the system's displacement at each time step, and then accumulates the displacements by path integration to obtain position. To counter the accumulated error inherent in path integration, optical-flow aided navigation estimates and corrects the position error by optical-flow matching. The aided navigation also draws on the insect-inspired optical-flow technique, performing iterative matching of actual and predicted optical flow with an optical-flow-based Kalman filter. Because the optical-flow computations in both parts derive from the same insect-inspired method, the two parts of the composite scheme can share input signals and part of the processing. Navigation experiments with a mobile robot demonstrate the efficiency of the composite navigation method.
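The two-stage scheme the abstract describes can be sketched as follows. This is a minimal illustration only, assuming a planar robot and a linear position-only filter; all names and noise values are hypothetical, and the simple position-fix update here merely stands in for the paper's iterative matching of actual against predicted optical flow.

```python
import numpy as np

def path_integrate(displacements):
    """Optical-flow navigation part: accumulate per-frame 2D
    displacements (measured from optical flow) into a position track."""
    return np.cumsum(np.asarray(displacements, dtype=float), axis=0)

class PositionKF:
    """Optical-flow aided navigation part, reduced to a linear Kalman
    filter on 2D position.

    predict() applies the optical-flow displacement; update() fuses a
    position fix standing in for the position implied by matching
    actual against predicted optical flow.
    """

    def __init__(self, q=0.01, r=0.25):
        self.x = np.zeros(2)    # position estimate
        self.P = np.eye(2)      # estimate covariance
        self.Q = q * np.eye(2)  # process noise (odometry drift)
        self.R = r * np.eye(2)  # measurement noise (flow matching)

    def predict(self, displacement):
        self.x = self.x + displacement
        self.P = self.P + self.Q

    def update(self, measured_position):
        # Kalman gain; state and measurement matrices are identity here
        K = self.P @ np.linalg.inv(self.P + self.R)
        self.x = self.x + K @ (measured_position - self.x)
        self.P = (np.eye(2) - K) @ self.P
```

The split mirrors the paper's design point: both stages consume the same per-frame optical-flow displacements, so the prediction step of the filter reuses the odometry input rather than requiring a second sensor.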
Publication history
  • Received:  2012-10-10
  • Revised:  2015-02-09
  • Published:  2015-06-20
