

Autonomous Liver Ultrasound Scanning via Robotic Arm Using Ensemble Bayesian Interaction Primitives

Ma Ji, Zhao Yue, Liu Zhuang, Hu Yue, Liu Jian-Xing, Shen Yi

Citation: Ma Ji, Zhao Yue, Liu Zhuang, Hu Yue, Liu Jian-Xing, Shen Yi. Autonomous liver ultrasound scanning via robotic arm using ensemble Bayesian interaction primitives. Acta Automatica Sinica, xxxx, xx(x): x−xx doi: 10.16383/j.aas.c250530


doi: 10.16383/j.aas.c250530 cstr: 32138.14.j.aas.c250530

Autonomous Liver Ultrasound Scanning via Robotic Arm Using Ensemble Bayesian Interaction Primitives

Funds: Supported by National Natural Science Foundation of China (62473108, 62173116, 62371167, 62373127), Postdoctoral Fellowship Program of China Postdoctoral Science Foundation (GZB20250957), and China Postdoctoral Science Foundation (2024M764189)
More Information
    Author Bio:

    MA Ji Ph.D. candidate at the School of Astronautics, Harbin Institute of Technology. His main research interest is intelligent control of medical robots

    ZHAO Yue Professor at the School of Astronautics, Harbin Institute of Technology. Her research interests include medical image processing, intelligent control of medical robots, and ultrasound imaging algorithms. Corresponding author of this paper

    LIU Zhuang Associate research fellow at the School of Astronautics, Harbin Institute of Technology. His research interests include adaptive control, sliding mode control, and robotic systems

    HU Yue Professor at the School of Electronics and Information Engineering, Harbin Institute of Technology. Her research interests include medical image processing, computational magnetic resonance imaging, and magnetic resonance fingerprinting

    LIU Jian-Xing Professor at the School of Astronautics, Harbin Institute of Technology. His research interests include sliding mode control, nonlinear control, and industrial electronics

    SHEN Yi Professor at the School of Astronautics, Harbin Institute of Technology. His research interests include instrumentation and measurement, ultrasound signal processing, and modern detection techniques

  • Abstract: To meet the need for ultrasound scanning of the human liver, this paper proposes a fully autonomous robotic-arm-assisted scanning method based on ensemble Bayesian interaction primitives and builds a corresponding experimental system. The method divides the scanning procedure into two sequentially executed stages: initial positioning and imitation learning. In the initial positioning stage, the system uses RGB-D images to guide the probe into contact with the patient, and determines from real-time ultrasound images when to switch to the imitation learning stage. In the imitation learning stage, the system encodes the scanning skill demonstrated by a physician as ultrasound images and probe motion trajectories, and learns and reproduces this skill through ensemble Bayesian interaction primitives, ultimately completing autonomous ultrasound scanning of the liver. Finally, the proposed method is validated experimentally on a human abdominal phantom. The results show that the method can complete the autonomous liver scanning task without human intervention, demonstrating promising prospects for clinical application.
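The imitation-learning stage described in the abstract builds on (ensemble) Bayesian interaction primitives, in which demonstrated trajectories are encoded as basis-function weights and then conditioned on live observations via an ensemble Kalman update. As a rough illustration of that conditioning step only, a minimal Python sketch, not the paper's implementation: the full enBIP formulation also infers the phase and multimodal ultrasound features jointly, which is omitted here, and all function names and parameters below are illustrative assumptions.

```python
import numpy as np

def rbf_features(phase, n_basis=15, width=0.02):
    """Normalized Gaussian basis functions evaluated at a scalar phase in [0, 1]."""
    centers = np.linspace(0.0, 1.0, n_basis)
    f = np.exp(-((phase - centers) ** 2) / (2.0 * width))
    return f / f.sum()

def fit_weights(traj, n_basis=15):
    """Least-squares basis weights for one demonstrated 1-D trajectory."""
    phases = np.linspace(0.0, 1.0, len(traj))
    Phi = np.vstack([rbf_features(p, n_basis) for p in phases])
    w, *_ = np.linalg.lstsq(Phi, traj, rcond=None)
    return w

def ensemble_condition(ensemble, obs_phase, obs_value, obs_noise=1e-4, rng=None):
    """Ensemble Kalman update of the weight ensemble given one observation."""
    rng = np.random.default_rng() if rng is None else rng
    phi = rbf_features(obs_phase, ensemble.shape[1])
    preds = ensemble @ phi                      # each member's predicted value
    W = ensemble - ensemble.mean(axis=0)        # weight anomalies
    d = preds - preds.mean()                    # prediction anomalies
    n = ensemble.shape[0]
    cov_wy = W.T @ d / (n - 1)                  # weight/observation cross-covariance
    var_y = d @ d / (n - 1) + obs_noise         # innovation variance
    gain = cov_wy / var_y                       # Kalman gain, one entry per basis
    perturbed = obs_value + rng.normal(0.0, np.sqrt(obs_noise), n)
    return ensemble + np.outer(perturbed - preds, gain)

# Demonstrations: sinusoidal "probe depth" profiles with varying amplitude.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 100)
demos = [rng.uniform(0.5, 1.5) * np.sin(2 * np.pi * t) for _ in range(30)]
ensemble = np.vstack([fit_weights(d) for d in demos])

# Condition on one observation at phase 0.25 and read back the posterior mean.
posterior = ensemble_condition(ensemble, 0.25, 1.2, rng=rng)
pred = posterior.mean(axis=0) @ rbf_features(0.25)
```

After conditioning, the posterior mean at the observed phase moves close to the observed value, while the rest of the trajectory shifts according to the correlations captured by the demonstration ensemble, which is the mechanism that lets partial observations drive full probe-trajectory prediction.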
  • Fig. 1 Robotic arm system for autonomous liver ultrasound scanning

    Fig. 2 Workflow of autonomous ultrasound scanning via the robotic arm

    Fig. 3 Definition of ultrasound image states

    Fig. 4 Process of demonstrated trajectory acquisition and latent state modeling

    Fig. 5 Segmentation results of UVM-UNet and UNet

    Fig. 6 Snapshots of the scanning process during the S2 stage

    Fig. 7 Comparison of enBIP-predicted probe $z$-axis position trajectories at different phases

    Fig. 8 System state trajectories under different observation dimensions

    Fig. 9 System augmented temporal state trajectories under different observation dimensions

    Fig. 10 Contact force and probe position variations during the entire scanning process

    Fig. 11 Snapshots of the autonomous scanning process under different phantom placements

    Table 1 Feature comparison between the proposed method and existing representative studies

    Reference | Autonomous initial positioning | Explicit phase estimation | Image observation feedback | Joint spatio-temporal inference | Online closed-loop validation
    Ref. [9]: × × ×
    Ref. [10]: ×$^*$ × ×
    Ref. [11]: × × × ×
    This work
    $^*$: In Ref. [10], the phase variable is generated by a time-driven canonical system rather than estimated in real time from state observations.

    Table 2 Segmentation performance metrics of deep networks

    Model | mIoU | Dice | Acc | Spe | Sen | Autonomous initial positioning
    UVM-UNet | 0.9529$\pm$0.0041 | 0.9759$\pm$0.0022 | 0.9935$\pm$0.0008 | 0.9963$\pm$0.0004 | 0.9755$\pm$0.0027 | 9.09$\pm$0.30
    UNet | 0.9674$\pm$0.0056 | 0.9834$\pm$0.0029 | 0.9955$\pm$0.0007 | 0.9973$\pm$0.0007 | 0.9846$\pm$0.0014 | 43.04$\pm$0.19

    Table 3 Comparison of evaluation metrics of different scanning strategies

    Strategy | $e_{\boldsymbol{p}}$ /m | $e_{\boldsymbol{q}}$ /rad | Number of inferences | Success rate
    A | 0.0024 | 0.0153 | 244$\pm$12 | 10/10
    B | 0.0012 | 0.0091 | 303$\pm$6 | 10/10
    C | 0.0046 | 0.0295 | 170$\pm$13 | 10/10
    D$^*$ | 0.0041 | 0.0255 | 175$\pm$26 | 8/10
    $^*$: Evaluation metrics for this strategy are computed over 8 samples.
  • [1] Bray F, Laversanne M, Sung H, Ferlay J, Siegel R L, Soerjomataram I, et al. Global cancer statistics 2022: GLOBOCAN estimates of incidence and mortality worldwide for 36 cancers in 185 countries. CA: A Cancer Journal for Clinicians, 2024, 74(3): 229−263 doi: 10.3410/f.739487650.793592245
    [2] Zhou J K, Tian H Z, Wang W, Huang Q H. Fully automated thyroid ultrasound screening utilizing multi-modality image and anatomical prior. Biomedical Signal Processing and Control, 2024, 87(A): Article No. 105430 doi: 10.1016/j.bspc.2023.105430
    [3] Huang Q H, Gao B, Wang M L. Robot-assisted autonomous ultrasound imaging for carotid artery. IEEE Transactions on Instrumentation and Measurement, 2024, 73: Article No. 4003009 doi: 10.1109/tim.2024.3353836
    [4] Priester A M, Natarajan S, Culjat M O. Robotic ultrasound systems in medicine. IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control, 2013, 60(3): 507−523 doi: 10.1109/TUFFC.2013.2593
    [5] Huang Q H, Zhou J K, Li Z J. Review of robot-assisted medical ultrasound imaging systems: Technology and clinical applications. Neurocomputing, 2023, 559: Article No. 126790 doi: 10.1016/j.neucom.2023.126790
    [6] Ma X H, Zeng M J, Hill J C, Hoffmann B, Zhang Z M, Zhang H C K. Guiding the last centimeter: Novel anatomy-aware probe servoing for standardized imaging plane navigation in robotic lung ultrasound. IEEE Transactions on Automation Science and Engineering, 2025, 22: 6569−6580 doi: 10.1109/TASE.2024.3448241
    [7] Ning G C, Zhang X R, Liao H G. Autonomic robotic ultrasound imaging system based on reinforcement learning. IEEE Transactions on Biomedical Engineering, 2021, 68(9): 2787−2797 doi: 10.1109/TBME.2021.3054413
    [8] Luo C W, Chen Y H, Cao H Z, Sibahee M A Al, Xu W T, Zhang J. Multi-modal autonomous ultrasound scanning for efficient human-machine fusion interaction. IEEE Transactions on Automation Science and Engineering, 2025, 22: 4712−4723 doi: 10.1109/TASE.2024.3370728
    [9] Hu Y, Tavakoli M. Autonomous ultrasound scanning towards standard plane using interval interaction probabilistic movement primitives. In: Proceedings of the 2023 IEEE/RSJ International Conference on Intelligent Robots and Systems. Detroit, USA: IEEE, 2023. 3719-3727
    [10] Wang Z H, Shi D H, Yang C G, Si W Y, Li Q C. Autonomous liver ultrasound examination based on imitation learning and stiffness estimation. In: Proceedings of the 2024 IEEE International Conference on Industrial Technology. Bristol, UK: IEEE, 2024. 1-6
    [11] Deng X T, Jiang J N, Cheng W, Yang C G, Li M. Learning freehand ultrasound through multimodal representation and skill adaptation. IEEE Transactions on Automation Science and Engineering, 2025, 22: 5117−5130 doi: 10.1109/TASE.2024.3416827
    [12] Campbell J, Stepputtis S, Amor H B. Probabilistic multimodal modeling for human-robot interaction tasks. In: Proceedings of the 15th Robotics: Science and Systems. Freiburg im Breisgau, Germany, 2019. 1-9
    [13] Campbell J, Hitzmann A, Stepputtis S, Ikemoto S, Hosoda K, Amor H B. Learning interactive behaviors for musculoskeletal robots using Bayesian Interaction Primitives. In: Proceedings of the 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems. Macau, China: IEEE, 2019. 5071-5078
    [14] Clark G, Amor H B. Learning ergonomic control in human-robot symbiotic walking. IEEE Transactions on Robotics, 2023, 39(1): 327−342 doi: 10.1109/TRO.2022.3192779
    [15] Huang Q H, Lan J L, Li X L. Robotic arm based automatic ultrasound scanning for three-dimensional imaging. IEEE Transactions on Industrial Informatics, 2019, 15(2): 1173−1182 doi: 10.1109/TII.2018.2871864
    [16] Wu R K, Liu Y H, Ning G C, Liang P C, Chang Q. UltraLight VM-UNet: Parallel Vision Mamba significantly reduces parameters for skin lesion segmentation. Patterns, 2025, 6(11): Article No. 101298 doi: 10.1016/j.patter.2025.101298
    [17] Mustafa A S B, Ishii T, Matsunaga Y, Nakadate R, Ishii H, Ogawa K, et al. Development of robotic system for autonomous liver screening using ultrasound scanning device. In: Proceedings of the 2013 IEEE International Conference on Robotics and Biomimetics. Shenzhen, China: IEEE, 2013. 804-809
    [18] Huang Y L, Abu-Dakka F J, Silvério J, Caldwell D G. Toward orientation learning and adaptation in cartesian space. IEEE Transactions on Robotics, 2021, 37(1): 82−98 doi: 10.1109/TRO.2020.3010633
    [19] Zeestraten M J A, Havoutis I, Silvério J, Calinon S, Caldwell D G. An approach for imitation learning on riemannian manifolds. IEEE Robotics and Automation Letters, 2017, 2(3): 1240−1247 doi: 10.1109/LRA.2017.2657001
    [20] Wang Z W, Zhao B L, Zhang P, Yao L, Wang Q, Li B, et al. Full-coverage path planning and stable interaction control for automated robotic breast ultrasound scanning. IEEE Transactions on Industrial Electronics, 2023, 70(7): 7051−7061 doi: 10.1109/TIE.2022.3204967
    [21] Jiang Z L, Grimm M, Zhou M C, Hu Y, Esteban J, Navab N. Automatic force-based probe positioning for precise robotic ultrasound acquisition. IEEE Transactions on Industrial Electronics, 2021, 68(11): 11200−11211 doi: 10.1109/TIE.2020.3036215
    [22] Zhang Li-Jian, Hu Rui-Qin, Yi Wang-Min. Research on force sensing for the end-load of industrial robot based on a 6-axis force/torque sensor. Acta Automatica Sinica, 2017, 43(3): 439−447 doi: 10.16383/j.aas.2017.c150753
    [23] Ronneberger O, Fischer P, Brox T. U-Net: convolutional networks for biomedical image segmentation. In: Proceedings of the 18th International Conference on Medical Image Computing and Computer-Assisted Intervention. Munich, Germany: Springer, 2015. 234-241
Publication History
  • Received: 2025-10-13
  • Accepted: 2026-01-30
  • Published online: 2026-03-30
