An Improved Sensitivity Encoding Reconstruction Algorithm Based on Nonlocal Low-Rank Constraints

Authors: Pan Ting, Duan Jizhong

Affiliation: Faculty of Information Engineering and Automation, Kunming University of Science and Technology, Kunming 650500, China

Fund Project: National Natural Science Foundation of China (61861023)

Abstract:

Sensitivity encoding (SENSE) is a widely used parallel magnetic resonance imaging (MRI) reconstruction model. Although many improved SENSE models have been proposed, their reconstructed images still contain noticeable artifacts, and it is especially difficult to reconstruct clear images at higher acceleration factors. Therefore, based on nonlocal low-rank (NLR) constraints, this paper proposes an improved SENSE model, named NLR-SENSE, which can effectively improve the quality of parallel MRI reconstruction. The weighted nuclear norm is adopted as the rank surrogate function, and the alternating direction method of multipliers (ADMM) is used to solve the NLR-SENSE model. Simulation results show that, compared with several other parallel MRI reconstruction methods, NLR-SENSE performs better both in visual comparison and on three different objective metrics, effectively improving the quality of the reconstructed images.
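To make the low-rank step concrete: in an ADMM splitting, a weighted-nuclear-norm penalty applied to a matrix of grouped similar patches yields a subproblem with a well-known closed-form solution by weighted singular-value thresholding. The sketch below is only illustrative and is not the paper's implementation; the group size, the reweighting heuristic (a common WNNM-style choice), and the function name weighted_svt are assumptions.

```python
import numpy as np

def weighted_svt(G, weights):
    """Proximal step for a weighted nuclear norm penalty on a patch-group matrix G.

    Solves  argmin_X 0.5*||X - G||_F^2 + sum_i weights[i] * sigma_i(X),
    whose closed form is U diag(max(sigma - w, 0)) V^H when the weights
    are non-descending in the singular-value index.
    """
    U, s, Vh = np.linalg.svd(G, full_matrices=False)
    s_thr = np.maximum(s - weights[: s.size], 0.0)  # soft-threshold the singular values
    return (U * s_thr) @ Vh                         # == U @ diag(s_thr) @ Vh

# Hypothetical example: one group of similar complex-valued image patches
# (rows = vectorized patch, columns = similar patches found by block matching).
rng = np.random.default_rng(0)
G = rng.standard_normal((64, 40)) + 1j * rng.standard_normal((64, 40))
sigma = np.linalg.svd(G, compute_uv=False)
w = 0.8 / (sigma + 1e-6)   # reweighting heuristic: penalize small singular values more
X_lowrank = weighted_svt(G, w)
```

This covers only the low-rank half of one ADMM iteration; the other half is a SENSE data-fidelity update that keeps the reconstruction consistent with the coil-wise undersampled k-space measurements.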

Cite this article:

潘婷, 段继忠. 基于非局部低秩约束的改进灵敏度编码重建算法[J]. 数据采集与处理, 2023, 38(1): 193-208. (Pan Ting, Duan Jizhong. An improved sensitivity encoding reconstruction algorithm based on nonlocal low-rank constraints[J]. Journal of Data Acquisition and Processing, 2023, 38(1): 193-208.)

History:
  • Received: 2021-11-10
  • Revised: 2022-01-08
  • Accepted:
  • Published online: 2023-01-25