Liver tumor segmentation from CT images based on RA-Unet
CLC numbers: TP391.41; TH89

Fund projects: National Natural Science Foundation of China (62272161, 62076256), Natural Science Foundation of Hunan Province (2021JJ30275), and a research project funded by the Education Department of Hunan Province (20B239)


    Abstract:

    Liver tumor segmentation from CT images is an important prerequisite for the early diagnosis, tumor burden analysis, and radiotherapy of liver cancer. To segment tumors accurately and automatically, a deep U-shaped network that combines residual blocks with an attention mechanism is proposed. The network first introduces, in each skip connection, a residual path with deconvolution and activation operations together with a convolution module; this separates image features and produces a high-level representation of them, ensuring that the skip connections mainly transmit image edge information and the global information of small targets. An attention mechanism is then introduced in the decoding path: by assigning different weights to the features obtained from the skip connections and from deconvolution decoding, it further enhances tumor features and suppresses irrelevant information. The proposed method achieves a global Dice coefficient of 86.71% on the LiTS dataset, markedly higher than that of many existing methods, and it shows a clear advantage in segmenting tumors that are small, low in contrast, or blurred at the boundary.
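    The abstract highlights two concrete ingredients: attention-weighted fusion of skip-connection and decoder features, and evaluation with a global Dice coefficient. The following PyTorch sketch illustrates both ideas under stated assumptions; the additive gating formulation, the names AttentionGate and global_dice, and all channel sizes are hypothetical and are not taken from the authors' RA-Unet implementation.

```python
import torch
import torch.nn as nn


class AttentionGate(nn.Module):
    """Additive attention gate in the style of Attention U-Net.

    Hypothetical sketch: layer sizes, names, and the gating formulation are
    assumptions for illustration, not the authors' published RA-Unet code.
    """

    def __init__(self, skip_channels: int, gating_channels: int, inter_channels: int):
        super().__init__()
        # 1x1 convolutions project both feature maps into a common space.
        self.theta = nn.Conv2d(skip_channels, inter_channels, kernel_size=1, bias=False)
        self.phi = nn.Conv2d(gating_channels, inter_channels, kernel_size=1, bias=False)
        self.psi = nn.Conv2d(inter_channels, 1, kernel_size=1)
        self.relu = nn.ReLU(inplace=True)
        self.sigmoid = nn.Sigmoid()

    def forward(self, skip: torch.Tensor, gate: torch.Tensor) -> torch.Tensor:
        # skip: features carried by the skip connection (edges, small targets)
        # gate: features produced by deconvolution in the decoding path,
        #       assumed here to have the same spatial size as `skip`
        attn = self.relu(self.theta(skip) + self.phi(gate))
        attn = self.sigmoid(self.psi(attn))   # per-pixel weights in [0, 1]
        return skip * attn                    # re-weighted skip features


def global_dice(pred: torch.Tensor, target: torch.Tensor, eps: float = 1e-6) -> torch.Tensor:
    """Global Dice: voxels of all test cases are pooled before taking the ratio."""
    pred = pred.float().flatten()
    target = target.float().flatten()
    intersection = (pred * target).sum()
    return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)


if __name__ == "__main__":
    # Toy usage: re-weight the skip features, then concatenate with decoder features.
    gate = AttentionGate(skip_channels=64, gating_channels=64, inter_channels=32)
    skip_feat = torch.randn(1, 64, 128, 128)
    dec_feat = torch.randn(1, 64, 128, 128)
    fused = torch.cat([gate(skip_feat, dec_feat), dec_feat], dim=1)
    print(fused.shape)  # torch.Size([1, 128, 128, 128])
```

    Re-weighting the skip features before concatenation is what would let the decoder suppress irrelevant background responses while preserving the edge and small-target information that the skip path carries, which is the behavior the abstract attributes to the attention mechanism.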

Cite this article:

邸拴虎, 杨文瀚, 廖苗, 赵于前, 杨振. Liver tumor segmentation from CT images based on RA-Unet [J]. Chinese Journal of Scientific Instrument (仪器仪表学报), 2022, 43(8): 65-72.

Online publication date: 2023-02-06