Automated Melanoma Recognition in Dermoscopy Images via Very Deep Residual Networks

Lequan Yu, Hao Chen, Qi Dou, Jing Qin, Pheng Ann Heng

Research output: Journal article publication › Journal article › Academic research › peer-review

702 Citations (Scopus)

Abstract

Automated melanoma recognition in dermoscopy images is a very challenging task due to the low contrast of skin lesions, the huge intraclass variation of melanomas, the high degree of visual similarity between melanoma and non-melanoma lesions, and the presence of many artifacts in the images. To meet these challenges, we propose a novel method for melanoma recognition that leverages very deep convolutional neural networks (CNNs). Compared with existing methods employing either low-level hand-crafted features or CNNs with shallower architectures, our substantially deeper networks (more than 50 layers) can acquire richer and more discriminative features for more accurate recognition. To take full advantage of very deep networks, we propose a set of schemes to ensure effective training and learning under limited training data. First, we apply residual learning to cope with the degradation and overfitting problems that arise when a network goes deeper. This technique ensures that our networks benefit from the performance gains achieved by increasing network depth. Then, we construct a fully convolutional residual network (FCRN) for accurate skin lesion segmentation and further enhance its capability by incorporating a multi-scale contextual information integration scheme. Finally, we seamlessly integrate the proposed FCRN (for segmentation) and other very deep residual networks (for classification) into a two-stage framework. This framework enables the classification network to extract more representative and specific features from the segmented results instead of the whole dermoscopy images, further alleviating the insufficiency of training data. The proposed framework is extensively evaluated on the ISBI 2016 Skin Lesion Analysis Towards Melanoma Detection Challenge dataset. Experimental results demonstrate the significant performance gains of the proposed framework, which ranked first in classification among 25 teams and second in segmentation among 28 teams. This study corroborates that very deep CNNs with effective training mechanisms can be employed to solve complicated medical image analysis tasks, even with limited training data.
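The following is a minimal sketch, not the authors' released code, of the two ideas the abstract describes: a residual block (identity shortcut) and a two-stage pipeline in which a segmentation network localizes the lesion before the classifier sees it. The class and function names (`ResidualBlock`, `crop_to_lesion`, `two_stage_inference`), the layer sizes, and the 224x224 classifier input are illustrative assumptions, not details from the paper.

```python
# Hedged sketch of residual learning and the two-stage segment-then-classify
# framework described in the abstract. Names and sizes are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ResidualBlock(nn.Module):
    """y = F(x) + x: the identity shortcut eases training of very deep nets."""

    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn1 = nn.BatchNorm2d(channels)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1, bias=False)
        self.bn2 = nn.BatchNorm2d(channels)

    def forward(self, x):
        out = F.relu(self.bn1(self.conv1(x)))
        out = self.bn2(self.conv2(out))
        return F.relu(out + x)  # residual (skip) connection


def crop_to_lesion(image, mask, margin=16):
    """Crop a (1, C, H, W) image to the bounding box of the predicted mask.

    Stands in for the step where the classification network receives the
    segmented lesion region instead of the whole dermoscopy image.
    """
    ys, xs = torch.where(mask > 0.5)
    if ys.numel() == 0:  # no lesion detected: fall back to the full image
        return image
    h, w = mask.shape
    y0, y1 = max(0, ys.min().item() - margin), min(h, ys.max().item() + margin)
    x0, x1 = max(0, xs.min().item() - margin), min(w, xs.max().item() + margin)
    return image[:, :, y0:y1, x0:x1]


def two_stage_inference(image, segmenter, classifier):
    """Stage 1: segment the lesion. Stage 2: classify the cropped region."""
    with torch.no_grad():
        mask = torch.sigmoid(segmenter(image))[0, 0]      # (H, W) probability map
        lesion = crop_to_lesion(image, mask)
        lesion = F.interpolate(lesion, size=(224, 224),   # assumed classifier input size
                               mode="bilinear", align_corners=False)
        logits = classifier(lesion)
    return mask, logits
```

In this sketch `segmenter` plays the role of the FCRN and `classifier` the role of the very deep residual classification network; any fully convolutional segmenter and image classifier with compatible input shapes could be plugged in.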
Original language: English
Article number: 7792699
Pages (from-to): 994-1004
Number of pages: 11
Journal: IEEE Transactions on Medical Imaging
Volume: 36
Issue number: 4
DOIs
Publication status: Published - 1 Apr 2017

Keywords

  • Automated melanoma recognition
  • fully convolutional neural networks
  • residual learning
  • skin lesion analysis
  • very deep convolutional neural networks

ASJC Scopus subject areas

  • Software
  • Radiological and Ultrasound Technology
  • Computer Science Applications
  • Electrical and Electronic Engineering
