Abstract
Accurate segmentation of prostate tumors from multimodal magnetic resonance (MR) images is crucial for the diagnosis and treatment of prostate cancer. However, the robustness of existing segmentation methods is limited, mainly because these methods 1) fail to adaptively assess subject-specific information of each MR modality for accurate tumor delineation, and 2) lack effective utilization of inter-slice information across thick slices in MR images to segment the tumor as a whole 3D volume. In this work, we propose a two-stage neighbor-aware multi-modal adaptive learning network (NaMa) for accurate prostate tumor segmentation from multimodal anisotropic MR images. In the first stage, we apply subject-specific multi-modal fusion in each slice by developing a novel modality-informativeness adaptive learning (MIAL) module that selects and adaptively fuses the informative representation of each modality based on inter-modality correlations. In the second stage, we exploit inter-slice feature correlations to derive volumetric tumor segmentation. Specifically, we first use a U-Net variant with sequence layers to coarsely capture slice relationships at a global scale and to generate an activation map for each slice. We then introduce an activation mapping guidance (AMG) module that refines the slice-wise representations (using information from adjacent slices) for consistent tumor segmentation across neighboring slices. In addition, during network training, we apply a random mask strategy to each MR modality to improve the efficiency of feature representation. Experiments on both in-house and public (PICAI) multi-modal prostate tumor datasets show that our proposed NaMa outperforms state-of-the-art methods.
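The abstract mentions a random mask strategy applied to each MR modality during training but does not spell it out. A minimal sketch of one plausible reading — randomly zeroing out whole modality channels so the network cannot over-rely on any single modality — might look as follows; the function name `random_modality_mask` and all parameters are hypothetical, not from the paper:

```python
import numpy as np

def random_modality_mask(volume, p=0.3, rng=None):
    """Randomly zero out entire MR modalities (channels) with probability p.

    volume: array of shape (num_modalities, depth, height, width),
            e.g. stacked T2w/ADC/DWI slices.
    Always keeps at least one modality so the input is never fully masked.
    """
    rng = np.random.default_rng() if rng is None else rng
    masked = volume.copy()
    num_modalities = volume.shape[0]
    drop = rng.random(num_modalities) < p
    if drop.all():
        # Guarantee at least one modality survives the mask.
        drop[rng.integers(num_modalities)] = False
    masked[drop] = 0.0
    return masked

# Example: 3 modalities, a small anisotropic volume (few thick slices)
x = np.random.rand(3, 8, 64, 64).astype(np.float32)
y = random_modality_mask(x, p=0.5, rng=np.random.default_rng(0))
```

This is only an illustrative sketch under stated assumptions; the paper's actual masking granularity (whole modality, patch, or slice) and schedule may differ.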
| Original language | English |
|---|---|
| Title of host publication | Proceedings of the 38th AAAI Conference on Artificial Intelligence |
| Editors | Michael Wooldridge, Jennifer Dy, Sriraam Natarajan |
| Publisher | Association for the Advancement of Artificial Intelligence |
| Pages | 4198-4206 |
| Number of pages | 9 |
| ISBN (Electronic) | 9781577358879, 1577358872 |
| DOIs | |
| Publication status | Published - 24 Mar 2024 |
| Externally published | Yes |
| Event | 38th AAAI Conference on Artificial Intelligence, AAAI 2024 - Vancouver, Canada Duration: 20 Feb 2024 → 27 Feb 2024 |
Publication series
| Name | Proceedings of the AAAI Conference on Artificial Intelligence |
|---|---|
| Number | 5 |
| Volume | 38 |
| ISSN (Print) | 2159-5399 |
| ISSN (Electronic) | 2374-3468 |
Conference
| Conference | 38th AAAI Conference on Artificial Intelligence, AAAI 2024 |
|---|---|
| Country/Territory | Canada |
| City | Vancouver |
| Period | 20/02/24 → 27/02/24 |
UN SDGs
This output contributes to the following UN Sustainable Development Goals (SDGs)
- SDG 3 Good Health and Well-being
ASJC Scopus subject areas
- Artificial Intelligence
Fingerprint
Dive into the research topics of 'NaMa: Neighbor-Aware Multi-Modal Adaptive Learning for Prostate Tumor Segmentation on Anisotropic MR Images'. Together they form a unique fingerprint.