TY - JOUR
T1 - Deep Attentive Features for Prostate Segmentation in 3D Transrectal Ultrasound
AU - Wang, Yi
AU - Ni, Dong
AU - Dou, Haoran
AU - Hu, Xiaowei
AU - Zhu, Lei
AU - Yang, Xin
AU - Xu, Ming
AU - Qin, Jing
AU - Heng, Pheng Ann
AU - Wang, Tianfu
N1 - Funding Information:
Manuscript received February 28, 2019; revised April 8, 2019 and April 16, 2019; accepted April 20, 2019. Date of publication April 25, 2019; date of current version November 26, 2019. This work was supported in part by the National Natural Science Foundation of China under Grant 61701312 and Grant 61571304, in part by the Natural Science Foundation of Shenzhen University under Grant 2018010, in part by the Shenzhen Peacock Plan under Grant KQTD2016053112051497, and in part by the grant from the Research Grants Council of Hong Kong Special Administrative Region under Grant 14225616. (Corresponding author: Dong Ni.) Y. Wang, H. Dou, T. Wang, and D. Ni are with the National-Regional Key Technology Engineering Laboratory for Medical Ultrasound, Guangdong Key Laboratory for Biomedical Measurements and Ultrasound Imaging, Health Science Center, School of Biomedical Engineering, Shenzhen University, Shenzhen 518060, China, and also with the Medical UltraSound Image Computing Lab, Shenzhen 518060, China (e-mail: [email protected]).
Publisher Copyright:
© 2019 IEEE.
PY - 2019/12
Y1 - 2019/12
AB - Automatic prostate segmentation in transrectal ultrasound (TRUS) images is essential for image-guided prostate interventions and treatment planning. However, developing such automatic solutions remains very challenging due to the missing/ambiguous boundary and inhomogeneous intensity distribution of the prostate in TRUS, as well as the large variability in prostate shapes. This paper develops a novel 3D deep neural network equipped with attention modules for better prostate segmentation in TRUS by fully exploiting the complementary information encoded in different layers of the convolutional neural network (CNN). Our attention module uses the attention mechanism to selectively leverage the multi-level features integrated from different layers to refine the features at each individual layer, suppressing non-prostate noise in the shallow layers of the CNN and incorporating more prostate detail into the features of the deep layers. Experimental results on challenging 3D TRUS volumes show that our method attains satisfactory segmentation performance. The proposed attention mechanism is a general strategy for aggregating multi-level deep features and has the potential to be used for other medical image segmentation tasks.
KW - 3D segmentation
KW - Attention mechanisms
KW - deep features
KW - feature pyramid network
KW - transrectal ultrasound
UR - http://www.scopus.com/inward/record.url?scp=85076123445&partnerID=8YFLogxK
DO - 10.1109/TMI.2019.2913184
M3 - Journal article
C2 - 31021793
AN - SCOPUS:85076123445
SN - 0278-0062
VL - 38
SP - 2768
EP - 2778
JO - IEEE Transactions on Medical Imaging
JF - IEEE Transactions on Medical Imaging
IS - 12
M1 - 8698868
ER -