Multi-label learning with globAl densiTy fusiOn Mapping features

Yumeng Guo, Fu Lai Korris Chung, Guozheng Li

Research output: Chapter in book / Conference proceeding › Conference article published in proceeding or book › Academic research › peer-review

Abstract

Multi-label learning, in which each instance is assigned to multiple categories simultaneously, is a prevalent problem in data analysis. Previous approaches typically learn from multi-label data by employing the original feature space in the discrimination process of all class labels. However, this traditional strategy can be suboptimal, as the original feature space may contain irrelevant or redundant information that degrades classification performance. In this paper, we propose an alternative strategy for learning from multi-label data, in which a reconstructed feature space is exploited to improve classification performance. Accordingly, an intuitive yet effective algorithm named ATOM, i.e. multi-label learning with globAl densiTy fusiOn Mapping features, is proposed. ATOM first reconstructs a feature space specific to each label by conducting cluster analysis on the instances belonging to it, then utilizes density fusion to extract optimal centers from the union of cluster centers, and finally performs classification by querying the reconstructed feature spaces. Comprehensive experimental results on a total of 12 benchmark data sets clearly validate the superiority of ATOM over other competitive algorithms.
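The per-label pipeline sketched in the abstract (cluster the label's instances, fuse cluster centers by a density criterion, then map instances into a reconstructed feature space) can be illustrated as follows. This is a hypothetical sketch, not the paper's implementation: the function names, the k-means routine, the radius-based density score standing in for "density fusion", and the distance-to-center mapping are all assumptions made for illustration.

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Plain k-means; returns the k cluster centers (sketch, not the paper's method)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(X[:, None] - centers[None], axis=2)  # instance-to-center distances
        assign = d.argmin(axis=1)
        for j in range(k):
            if np.any(assign == j):
                centers[j] = X[assign == j].mean(axis=0)
    return centers

def density_fusion(center_union, X, radius=1.0, m=4):
    """Assumed stand-in for 'density fusion': score each candidate center by
    how many instances fall within `radius`, and keep the m densest centers."""
    d = np.linalg.norm(X[:, None] - center_union[None], axis=2)
    density = (d < radius).sum(axis=0)
    keep = np.argsort(density)[::-1][:m]
    return center_union[keep]

def map_features(X, centers):
    """Reconstructed feature space: each instance is re-represented by its
    distances to the selected centers (a center-based mapping)."""
    return np.linalg.norm(X[:, None] - centers[None], axis=2)

# Toy usage for a single label's relevance vector y.
rng = np.random.default_rng(1)
X = rng.normal(size=(60, 5))
y = rng.integers(0, 2, size=60)
pos, neg = X[y == 1], X[y == 0]
center_union = np.vstack([kmeans(pos, 3), kmeans(neg, 3)])   # cluster-center union
centers = density_fusion(center_union, X, radius=2.0, m=4)   # optimal centers
Z = map_features(X, centers)                                 # features for this label
```

In a full multi-label setting this mapping would be repeated per label, and a binary classifier trained on each label's reconstructed features `Z`.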
Original language: English
Title of host publication: 2016 23rd International Conference on Pattern Recognition, ICPR 2016
Publisher: IEEE
Pages: 462-467
Number of pages: 6
ISBN (Electronic): 9781509048472
DOIs
Publication status: Published - 13 Apr 2017
Event: 23rd International Conference on Pattern Recognition, ICPR 2016 - Cancun Center, Cancun, Mexico
Duration: 4 Dec 2016 - 8 Dec 2016

Conference

Conference: 23rd International Conference on Pattern Recognition, ICPR 2016
Country/Territory: Mexico
City: Cancun
Period: 4/12/16 - 8/12/16

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
