Rank entropy-based decision trees for monotonic classification

Qinghua Hu, Xunjian Che, Lei Zhang, Dapeng Zhang, Maozu Guo, Daren Yu

Research output: Journal article, Academic research, peer-reviewed

176 Citations (Scopus)

Abstract

In many decision-making tasks, the values of both features and decisions are ordinal. Moreover, a monotonicity constraint requires that objects with better feature values should not be assigned to worse decision classes. Such problems are called ordinal classification with monotonicity constraints. Several learning algorithms have been developed in recent years to handle tasks of this kind. However, experiments show that these algorithms are sensitive to noisy samples and do not work well in real-world applications. In this work, we introduce a new measure of feature quality, called rank mutual information (RMI), which combines the robustness of Shannon's entropy with the ability of dominance rough sets to extract ordinal structures from monotonic data sets. We then design a decision tree algorithm (REMT) based on rank mutual information. Theoretical and experimental analysis shows that the proposed algorithm yields monotonically consistent decision trees if the training samples are monotonically consistent, and its performance remains good when data are contaminated with noise.
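The abstract only names the RMI measure, so the sketch below illustrates the idea behind it: Shannon-style entropy and mutual information where class probabilities are replaced by fractions of dominance sets (for each sample, the set of samples whose ordinal values are at least as good). This is a minimal sketch under that assumption; the function names, toy data, and exact normalization are illustrative and should be checked against the definitions in the paper, not taken as the authors' reference implementation.

```python
import numpy as np

def dominance_matrix(values):
    """Boolean matrix D with D[i, j] = True iff values[j] >= values[i],
    i.e. sample j weakly dominates sample i on this single ordinal criterion."""
    v = np.asarray(values).reshape(-1, 1)
    return v.T >= v

def rank_entropy(values):
    """Ascending rank entropy: -(1/n) * sum_i log(|[x_i]^<=| / n),
    where [x_i]^<= is the set of samples dominating x_i."""
    D = dominance_matrix(values)
    n = D.shape[0]
    return -np.mean(np.log(D.sum(axis=1) / n))

def rank_mutual_information(feature, decision):
    """Rank mutual information between one ordinal feature and the decision:
    -(1/n) * sum_i log(|[x_i]_f^<=| * |[x_i]_d^<=| / (n * |[x_i]_f^<= & [x_i]_d^<=|))."""
    Df = dominance_matrix(feature)
    Dd = dominance_matrix(decision)
    n = Df.shape[0]
    nf = Df.sum(axis=1)            # |[x_i]_f^<=|
    nd = Dd.sum(axis=1)            # |[x_i]_d^<=|
    nfd = (Df & Dd).sum(axis=1)    # joint dominance set size (>= 1, since x_i dominates itself)
    return -np.mean(np.log(nf * nd / (n * nfd)))

# Toy monotone data set: higher feature values never receive a worse label.
X = np.array([[1, 2], [2, 2], [2, 3], [3, 3], [4, 5]])  # two ordinal features
y = np.array([1, 1, 2, 2, 3])                            # ordinal decision

# A REMT-style split would choose the feature (and cut point) maximizing RMI.
scores = [rank_mutual_information(X[:, j], y) for j in range(X.shape[1])]
print("RMI per feature:", scores, "best feature:", int(np.argmax(scores)))
```

Because probabilities are replaced by dominance-set fractions, the score rewards features whose ordering agrees with the ordering of the decision, which is what allows a tree grown on this criterion to respect the monotonicity constraint while retaining the noise robustness of an entropy-based measure.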
Original language: English
Article number: 5936071
Pages (from-to): 2052-2064
Number of pages: 13
Journal: IEEE Transactions on Knowledge and Data Engineering
Volume: 24
Issue number: 11
DOIs
Publication status: Published - 5 Oct 2012

Keywords

  • Decision tree
  • Monotonic classification
  • Rank entropy
  • Rank mutual information

ASJC Scopus subject areas

  • Information Systems
  • Computer Science Applications
  • Computational Theory and Mathematics
