Abstract
In many decision-making tasks, the values of both features and decisions are ordinal. Moreover, there is a monotonicity constraint: objects with better feature values should not be assigned to a worse decision class. Such problems are called ordinal classification with monotonicity constraints. Several learning algorithms have been developed to handle this kind of task in recent years. However, experiments show that these algorithms are sensitive to noisy samples and do not work well in real-world applications. In this work, we introduce a new measure of feature quality, called rank mutual information (RMI), which combines the robustness of Shannon's entropy with the ability of dominance rough sets to extract ordinal structures from monotonic data sets. We then design a decision tree algorithm (REMT) based on rank mutual information. Theoretical and experimental analysis shows that the proposed algorithm produces monotonically consistent decision trees if the training samples are monotonically consistent, and that its performance remains good when the data are contaminated with noise.
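The abstract does not reproduce the formulas, but a minimal Python sketch may help illustrate the kind of ascending rank mutual information computation described above for a single ordinal feature and an ordinal decision. The function and variable names are assumptions for illustration, not the paper's code, and the per-sample dominating-set formulation is only a plausible reading of the RMI idea:

```python
import numpy as np

def rank_mutual_information(feature, decision):
    """Illustrative ascending rank mutual information between one ordinal
    feature and an ordinal decision (hypothetical names, sketch only).

    For each sample i, the dominating set on the feature is
    {j : feature[j] >= feature[i]}, and similarly for the decision.
    Each sample contributes -log2(|F_i| * |D_i| / (n * |F_i ∩ D_i|)),
    and the result is averaged over all n samples.
    """
    feature = np.asarray(feature)
    decision = np.asarray(decision)
    n = len(feature)
    total = 0.0
    for i in range(n):
        dom_feat = feature >= feature[i]    # samples at least as good on the feature
        dom_dec = decision >= decision[i]   # samples at least as good on the decision
        both = np.logical_and(dom_feat, dom_dec).sum()  # >= 1, since sample i is in both
        total += -np.log2(dom_feat.sum() * dom_dec.sum() / (n * both))
    return total / n

# Toy usage: a monotone feature/decision pair yields a relatively large RMI,
# which a tree-growing procedure could use to rank candidate split features.
f = [1, 2, 3, 4, 5]
d = [1, 1, 2, 2, 3]
print(rank_mutual_information(f, d))
```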
| Original language | English |
| --- | --- |
| Article number | 5936071 |
| Pages (from-to) | 2052-2064 |
| Number of pages | 13 |
| Journal | IEEE Transactions on Knowledge and Data Engineering |
| Volume | 24 |
| Issue number | 11 |
| DOIs | |
| Publication status | Published - 5 Oct 2012 |
Keywords
- decision tree
- Monotonic classification
- rank entropy
- rank mutual information
ASJC Scopus subject areas
- Information Systems
- Computer Science Applications
- Computational Theory and Mathematics