Oracle inequalities for sparse additive quantile regression in reproducing kernel Hilbert space

Shaogao Lv, Huazhen Lin, Heng Lian, Jian Huang

Research output: Journal article › Academic research › peer-review

20 Citations (Scopus)

Abstract

This paper considers the estimation of sparse additive quantile regression (SAQR) in high-dimensional settings. Given the nonsmooth nature of the quantile loss function and the nonparametric complexity of estimating the component functions, it is challenging to analyze the theoretical properties of ultrahigh-dimensional SAQR. We propose a regularized learning approach with a two-fold Lasso-type regularization in a reproducing kernel Hilbert space (RKHS) for SAQR. We establish nonasymptotic oracle inequalities for the excess risk of the proposed estimator without any coherent conditions. If additional assumptions, including an extension of the restricted eigenvalue condition, are satisfied, the proposed method enjoys sharp oracle rates without a light-tail requirement. In particular, the proposed estimator achieves the minimax lower bounds established for sparse additive mean regression. As a by-product, we also establish a concentration inequality for estimating the population mean when a general Lipschitz loss is involved. The practical effectiveness of the new method is demonstrated by competitive numerical results.
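The nonsmooth loss underlying quantile regression is the check (pinball) loss, which weights positive and negative residuals asymmetrically by the quantile level. A minimal sketch of this loss (the names `pinball_loss` and `tau` are illustrative, not from the paper, and this is only the unpenalized empirical-risk ingredient, not the authors' full two-fold regularized RKHS estimator):

```python
import numpy as np

def pinball_loss(residual, tau):
    """Check (pinball) loss at quantile level tau in (0, 1):
    tau * r for r >= 0, and (tau - 1) * r for r < 0."""
    residual = np.asarray(residual, dtype=float)
    return np.where(residual >= 0, tau * residual, (tau - 1.0) * residual)

# Toy residuals y - f(x); at tau = 0.5 this is half the absolute loss.
r = np.array([-1.0, 0.5, 2.0])
print(pinball_loss(r, tau=0.5))  # [0.5  0.25 1.  ]
```

Minimizing the average pinball loss over a function class yields an estimate of the conditional tau-quantile; the paper's estimator adds two Lasso-type penalties on the additive components on top of this empirical risk.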

Original language: English
Pages (from-to): 781-813
Number of pages: 33
Journal: Annals of Statistics
Volume: 46
Issue number: 2
DOIs
Publication status: Published - 1 Apr 2018
Externally published: Yes

Keywords

  • Additive models
  • Quantile regression
  • Regularization methods
  • Reproducing kernel Hilbert space
  • Sparsity

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty
