Automatic choroid layer segmentation using normalized graph cut

Saleha Masood, Bin Sheng, Ping Li, Ruimin Shen, Ruogu Fang, Qiang Wu

Research output: Journal article publication › Journal article › Academic research › peer-review

Abstract

Optical coherence tomography (OCT) is an imaging technique for depth analysis of retinal layers. Automatic choroid layer segmentation is a challenging task because of the low contrast of the input images. Existing methodologies carried out choroid layer segmentation manually or semi-automatically. The authors proposed an automated choroid layer segmentation based on the normalised cut algorithm, which extracts the global impression of an image and treats segmentation as a graph-partitioning problem. Owing to the structural complexity of the retinal and choroid layers, the authors employed a series of pre-processing steps to make the cut more deterministic and accurate. The proposed method divided the image into several patches and ran the normalised cut algorithm on each patch separately, with the aim of avoiding insignificant vertical cuts and focusing on horizontal cutting. After processing every patch, the authors obtained a global cut on the original image by combining all the patches. The authors then measured the choroidal thickness, which is highly useful in the diagnosis of several retinal diseases. The results were computed on a total of 525 images of 21 real patients. Experimental results showed that the mean relative error of the proposed method was around 0.4 when compared with the manual segmentation performed by experts.
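The paper itself does not include code, but the patch-wise normalised-cut idea described in the abstract can be sketched with scikit-image, which ships a normalised cut over a region adjacency graph. The sketch below is a minimal illustration under stated assumptions: the patch count, the SLIC over-segmentation settings, and the label-stitching scheme are placeholders chosen for the example, not the authors' actual pre-processing or parameters.

    # Sketch of patch-wise normalised cut on an OCT B-scan (assumptions:
    # patch count, SLIC settings, and stitching are illustrative only).
    import numpy as np
    from skimage import color, graph, segmentation

    def segment_patchwise(bscan, n_patches=8):
        """Run a normalised cut on each vertical strip of a grayscale
        B-scan and stitch the per-patch label maps back together."""
        h, w = bscan.shape
        rgb = color.gray2rgb(bscan)       # rag_mean_color expects colour input
        stitched = np.zeros((h, w), dtype=int)
        offset = 0
        for xs in np.array_split(np.arange(w), n_patches):
            patch = rgb[:, xs]
            # Over-segment the patch, then partition its region adjacency
            # graph with the normalised cut criterion.
            labels = segmentation.slic(patch, n_segments=60,
                                       compactness=10, start_label=0)
            rag = graph.rag_mean_color(patch, labels, mode='similarity')
            ncut = graph.cut_normalized(labels, rag)
            stitched[:, xs] = ncut + offset  # keep labels unique across patches
            offset = stitched[:, xs].max() + 1
        return stitched

Restricting each cut to a narrow vertical strip is one way to realise the abstract's goal of favouring horizontal boundaries between layers over spurious vertical splits. Choroidal thickness could then be estimated from the stitched label map by counting, per column, the rows assigned to the choroid region; identifying which region corresponds to the choroid is a separate step that the paper addresses through its pre-processing.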
Original language: English
Pages (from-to): 53-59
Number of pages: 7
Journal: IET Image Processing
Volume: 12
Issue number: 1
DOIs
Publication status: Published - Jan 2018
Externally published: Yes

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Electrical and Electronic Engineering
