A Comparative Study of Various Supervised Learning Approaches to Selective Omission in a Road Network

Qi Zhou, Zhilin Li

Research output: Journal article (peer-reviewed)

20 Citations (Scopus)


Selective omission is necessary for road network generalisation. This study investigates the use of supervised learning approaches for selective omission in a road network. Specifically, the properties that measure the importance of a road in the network are viewed as input attributes, and the decision of whether such a road is retained at a specific scale is viewed as an output class; a number of samples with known inputs and outputs are then used to train a classifier; finally, this classifier can be used to determine whether other roads should be retained. In this study, a total of nine supervised learning approaches, i.e. ID3, C4.5, CRT, Random Tree, support vector machine (SVM), naive Bayes (NB), K-nearest neighbour (KNN), multilayer perceptron (MP) and binary logistic regression (BLR), are applied to three road networks for selective omission. The performances of these approaches are evaluated by both quantitative assessment and visual inspection. Results show that: (1) in most cases, these approaches are effective, with classification accuracies between 70% and 90%; (2) most of these approaches have similar performances, with no statistically significant differences; (3) in some cases, however, ID3 and BLR perform significantly better than NB and SVM, and NB and KNN perform significantly worse than MP, SVM and BLR.
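The workflow described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the four importance attributes (length, degree, closeness, betweenness) and the synthetic labelling rule are assumptions for demonstration, and scikit-learn's DecisionTreeClassifier stands in for the decision-tree learners (ID3, C4.5, CRT) compared in the paper.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)
n_roads = 500

# Input attributes: hypothetical importance measures for each road,
# e.g. length, degree, closeness centrality, betweenness centrality.
X = rng.random((n_roads, 4))

# Output class: 1 = retained at the target scale, 0 = omitted.
# Here a synthetic rule (weighted importance above a threshold) stands
# in for expert generalisation decisions used as training labels.
y = (X @ np.array([0.4, 0.2, 0.2, 0.2]) > 0.5).astype(int)

# Train a classifier on labelled samples, then classify unseen roads.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = DecisionTreeClassifier(max_depth=4, random_state=0).fit(X_train, y_train)
acc = accuracy_score(y_test, clf.predict(X_test))
print(f"retain/omit classification accuracy: {acc:.2f}")
```

Any of the other eight learners in the study (SVM, NB, KNN, MP, BLR, etc.) could be swapped in for the tree with the same fit/predict interface.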
Original language: English
Pages (from-to): 254-264
Number of pages: 11
Journal: Cartographic Journal
Issue number: 3
Publication status: Published - 3 Jul 2017


Keywords

  • road network
  • selective omission
  • supervised learning

ASJC Scopus subject areas

  • Earth-Surface Processes


