Target Pose Detection Method Based on Bidirectional Fusion of Texture and Depth Information
DOI:
Author:
Affiliation:

School of Optical-Electrical and Computer Engineering, University of Shanghai for Science and Technology

Author Bio:

Corresponding Author:

Fund Project:



Abstract:

    Aiming at the problem of how a depth camera can obtain accurate object pose information in unstructured scenes under limited hardware resources, this paper proposes a target pose detection method based on bidirectional fusion of texture and depth information. In the learning stage, the two networks adopt the Full-Flow Bidirectional fusion (FFB6D) module: the texture-extraction branch introduces the lightweight Ghost module to reduce the network's computational cost and adds the CBAM attention mechanism to enhance useful features, while the depth-extraction branch expands local features and fuses them across multiple levels to obtain a more comprehensive representation. In the output stage, to improve efficiency, instance semantic segmentation results are used to filter out background points before 3D keypoint detection, and the pose is finally obtained by a least-squares fitting algorithm. Validated on the LineMod, Occlusion LineMod, and YCB-Video public datasets, the method reaches accuracies of 99.8%, 66.3%, and 94%, respectively, and runs 30% faster than the previous network; the improved pose estimation method thus maintains accuracy while increasing speed.
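The final least-squares fitting step described above can be sketched as follows. This is an illustrative SVD-based (Kabsch) rigid-transform fit between matched 3D keypoints in the model frame and the detected keypoints in the camera frame, which is the standard way such a least-squares pose fit is realized; it is not the authors' exact implementation, and the function and variable names are assumptions.

```python
import numpy as np

def fit_pose_least_squares(model_pts, scene_pts):
    """Estimate rotation R and translation t minimizing
    sum_i || R @ model_pts[i] + t - scene_pts[i] ||^2
    over matched (N, 3) keypoint arrays, via SVD (Kabsch)."""
    # Center both point sets on their centroids
    mu_m = model_pts.mean(axis=0)
    mu_s = scene_pts.mean(axis=0)
    # Cross-covariance of the centered correspondences
    H = (model_pts - mu_m).T @ (scene_pts - mu_s)
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard: force det(R) = +1 so R is a proper rotation
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_s - R @ mu_m
    return R, t
```

With noise-free correspondences the fit recovers the ground-truth transform exactly; with noisy detected keypoints it returns the least-squares optimum, which is why it is a common final stage after keypoint voting.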


History
  • Received: 2023-08-22
  • Revised: 2024-04-04
  • Accepted: 2024-04-10
  • Published online: