Scaling up minimum enclosing ball with total soft margin for training on large datasets

Wenjun Hu, Fu Lai Korris Chung, Shitong Wang, Wenhao Ying

Research output: Journal article publication › Journal article › Academic research › peer-review

4 Citations (Scopus)

Abstract

Recent research indicates that the standard Minimum Enclosing Ball (MEB) or the center-constrained MEB can be used for effective training on large datasets by employing the core vector machine (CVM) or generalized CVM (GCVM). However, for another extensively used MEB variant, the MEB with total soft margin (T-MEB for brevity), the CVM or GCVM cannot be employed directly to achieve fast training on large datasets, because the involved inequality constraint is violated. In this paper, a fast learning algorithm called FL-TMEB for scaling up T-MEB is presented. First, FL-TMEB slightly relaxes the constraints in T-MEB so that it becomes equivalent to the corresponding center-constrained MEB, which can be solved with the corresponding Core Set (CS) by CVM. Then, with the help of the sub-optimal solution theorem for T-MEB, FL-TMEB obtains an extended core set (ECS) by adding the neighbors of some samples in the CS to the ECS. Finally, FL-TMEB takes the optimal weights of the ECS as the approximate solution of T-MEB. Experimental results on the UCI and USPS datasets demonstrate that the proposed method is effective.
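The core-set machinery that CVM-style training rests on can be illustrated with the classic Bădoiu–Clarkson (1+ε)-approximation for the Minimum Enclosing Ball, which maintains a small core set of far points. This is a generic sketch of that standard subroutine, not the FL-TMEB algorithm itself; the function name and interface are illustrative assumptions.

```python
import numpy as np

def meb_core_set(points, eps=0.1):
    """Badoiu-Clarkson (1+eps)-approximate Minimum Enclosing Ball.

    Illustrative sketch only (not the paper's FL-TMEB): iteratively
    pulls the center toward the current farthest point; the points
    selected along the way form the core set.
    Returns (center, radius, sorted core-set indices).
    """
    points = np.asarray(points, dtype=float)
    c = points[0].copy()          # start the center at an arbitrary point
    core = {0}
    n_iter = int(np.ceil(1.0 / eps ** 2))  # O(1/eps^2) iterations suffice
    for i in range(1, n_iter + 1):
        # find the point farthest from the current center
        d = np.linalg.norm(points - c, axis=1)
        far = int(np.argmax(d))
        core.add(far)
        # move the center a 1/(i+1) step toward that farthest point
        c += (points[far] - c) / (i + 1)
    radius = np.linalg.norm(points - c, axis=1).max()
    return c, radius, sorted(core)
```

On the four corners of a square, for example, the returned ball closely approximates the true enclosing ball centered at the square's midpoint. CVM-type methods exploit exactly this property: only the small core set, rather than the full dataset, determines the (approximate) ball.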
Original language: English
Pages (from-to): 120-128
Number of pages: 9
Journal: Neural Networks
Volume: 36
DOIs
Publication status: Published - 1 Dec 2012

Keywords

  • Core set
  • Core vector machine
  • Extended core set
  • Large datasets
  • Minimum Enclosing Ball (MEB)
  • Soft margin

ASJC Scopus subject areas

  • Cognitive Neuroscience
  • Artificial Intelligence
