Feature subset selection for efficient AdaBoost training

Chensheng Sun, Jiwei Hu, Kin Man Lam

Research output: Chapter in book / Conference proceeding › Conference article published in proceeding or book › Academic research › peer-review

8 Citations (Scopus)

Abstract

Working with a very large feature set is a challenge in current machine learning research. In this paper, we address the feature-selection problem in the context of training AdaBoost classifiers. The AdaBoost algorithm embeds a feature-selection mechanism based on training a classifier for each feature. Learning these single-feature classifiers is the most time-consuming part of AdaBoost training, especially when a large number of features is available. To address this, we generate a working feature subset using a novel feature subset selection method based on partial least squares (PLS) regression, and then train and select weak classifiers from this subset only. The partial least squares method can handle high-dimensional and highly redundant feature sets. The experiments show that the proposed PLS-based feature-selection method generates sensible feature subsets for AdaBoost in a very efficient way.
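The abstract outlines a two-stage pipeline: score and select a working feature subset with PLS regression, then train AdaBoost only on that subset. The sketch below is a minimal illustration of that idea, not the authors' implementation; it assumes scikit-learn's PLSRegression and AdaBoostClassifier and uses the absolute PLS regression coefficients as the per-feature score. The subset size n_keep and the number of PLS components are illustrative choices, since the paper's exact selection criterion and parameters are not given in the abstract.

```python
# Minimal sketch of "PLS-based feature subset selection, then AdaBoost".
# NOT the authors' exact method: features are ranked by the magnitude of
# their PLS regression coefficients; n_keep and n_pls_components are
# illustrative assumptions.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.ensemble import AdaBoostClassifier
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Synthetic data standing in for a large, highly redundant feature set.
X, y = make_classification(n_samples=2000, n_features=500, n_informative=30,
                           n_redundant=100, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Fit PLS regression against the labels and score each feature by the
# absolute value of its coefficient in the fitted model.
n_pls_components = 10
pls = PLSRegression(n_components=n_pls_components)
pls.fit(X_train, y_train)
scores = np.abs(pls.coef_).ravel()

# Keep only the highest-scoring features as the working subset.
n_keep = 50
subset = np.argsort(scores)[-n_keep:]

# Train AdaBoost (decision-stump weak learners by default) on the subset
# only, which is far cheaper than learning single-feature classifiers over
# all 500 features at every boosting round.
ada = AdaBoostClassifier(n_estimators=100, random_state=0)
ada.fit(X_train[:, subset], y_train)
print("Test accuracy on selected subset:", ada.score(X_test[:, subset], y_test))
```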
Original language: English
Title of host publication: Electronic Proceedings of the 2011 IEEE International Conference on Multimedia and Expo, ICME 2011
DOIs
Publication status: Published - 7 Nov 2011
Event: 2011 12th IEEE International Conference on Multimedia and Expo, ICME 2011 - Barcelona, Spain
Duration: 11 Jul 2011 - 15 Jul 2011

Conference

Conference: 2011 12th IEEE International Conference on Multimedia and Expo, ICME 2011
Country/Territory: Spain
City: Barcelona
Period: 11/07/11 - 15/07/11

Keywords

  • AdaBoost
  • Feature selection
  • Partial Least Squares

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Computer Science Applications
