Sparsity and error analysis of empirical feature-based regularization schemes

Xin Guo, Jun Fan, Ding-Xuan Zhou

Research output: Journal article › Academic research › peer-review

3 Citations (Scopus)

Abstract

©2016 Xin Guo, Jun Fan and Ding-Xuan Zhou. We consider a learning algorithm generated by a regularization scheme with a concave regularizer, aimed at achieving both sparsity and good learning rates in a least squares regression setting. The regularization is imposed on linear combinations of empirical features, constructed from kernels and samples as in the literature on kernel principal component analysis and kernel projection machines. Beyond the separability of the optimization problem induced by the empirical features, we carry out a sparsity and error analysis, giving bounds in the norm of the reproducing kernel Hilbert space under a priori conditions that require no sparsity assumptions with respect to any basis or system. In particular, we show that as the concave exponent q of the concave regularizer increases to 1, the learning ability of the algorithm improves. Numerical simulations on both artificial data and real MHC-peptide binding data, involving the ℓq regularizer and the SCAD penalty, are presented to illustrate the sparsity and error analysis.
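The separability mentioned in the abstract means that, once the regression target is projected onto the orthonormal system built from the kernel matrix, the ℓq-regularized least squares problem splits into independent one-dimensional subproblems. The sketch below illustrates this in a simplified form: it uses the eigenvectors of a Gaussian kernel matrix as a stand-in for the paper's empirical features, and solves each scalar subproblem min_c (c − d)² + λ|c|^q by a coarse grid search. The function and parameter names (`lq_prox_1d`, `lq_empirical_feature_regression`, `sigma`, `lam`) are illustrative assumptions, not the paper's notation, and the construction omits the eigenvalue scaling used in the actual empirical features.

```python
import numpy as np

def gaussian_kernel(X, sigma=1.0):
    # Pairwise Gaussian kernel matrix K_ij = exp(-||x_i - x_j||^2 / (2 sigma^2)).
    sq = np.sum(X**2, axis=1)
    D = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-D / (2.0 * sigma**2))

def lq_prox_1d(d, lam, q):
    # Solve the scalar subproblem min_c (c - d)^2 + lam * |c|^q
    # by grid search (a hypothetical helper; closed forms exist for
    # q = 1, 1/2, 2/3). The grid is symmetric, so 0 is a candidate,
    # which is what produces exact sparsity for q <= 1.
    grid = np.linspace(-abs(d) - 1.0, abs(d) + 1.0, 2001)
    obj = (grid - d)**2 + lam * np.abs(grid)**q
    return grid[np.argmin(obj)]

def lq_empirical_feature_regression(X, y, lam=0.1, q=0.5, sigma=1.0):
    # Orthonormal eigenvectors of K play the role of empirical features
    # (simplified: no eigenvalue scaling, no RKHS normalization).
    K = gaussian_kernel(X, sigma)
    _, evecs = np.linalg.eigh(K)
    # Orthonormality makes the least squares term diagonal, so the
    # regularized problem separates coordinate-wise.
    d = evecs.T @ y
    c = np.array([lq_prox_1d(di, lam, q) for di in d])
    coef = evecs @ c  # coefficients back in the sample basis
    return c, coef, evecs
```

On a smooth target most projections onto high-frequency eigenvectors are small and get thresholded to exactly zero, which is the sparsity phenomenon the paper quantifies; smaller q thresholds more aggressively, while larger q (approaching 1) retains more coefficients.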
Original language: English
Pages (from-to): 1-34
Number of pages: 34
Journal: Journal of Machine Learning Research
Volume: 17
Publication status: Published - 1 Jun 2016

Keywords

  • Concave regularizer
  • Regularization with empirical features
  • Reproducing kernel Hilbert space
  • SCAD penalty
  • Sparsity
  • ℓq-penalty

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Statistics and Probability
  • Artificial Intelligence
