Regularisation Parameter Selection Via Bootstrapping

Zhen Pang, Bingqing Lin, Jiming Jiang

Research output: Journal article › Academic research › peer-review

3 Citations (Scopus)


Published by John Wiley & Sons Australia Pty Ltd.

Penalised likelihood methods, such as the least absolute shrinkage and selection operator (Lasso) and the smoothly clipped absolute deviation (SCAD) penalty, have become widely used for variable selection in recent years. These methods impose penalties on regression coefficients to shrink a subset of them towards zero, achieving parameter estimation and model selection simultaneously. The amount of shrinkage is controlled by the regularisation parameter. Popular approaches for choosing the regularisation parameter include cross-validation, various information criteria and bootstrapping methods based on mean square error. In this paper, a new data-driven method for choosing the regularisation parameter is proposed and its consistency is established. The result holds not only for the usual fixed-dimensional case but also for the divergent setting. Simulation results show that the new method outperforms other popular approaches. An application of the proposed method to motif discovery in gene expression analysis is included.
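The abstract mentions bootstrapping methods based on mean square error as one popular way to choose the regularisation parameter. As a rough illustration of that generic idea (not the authors' proposed criterion, which is developed in the paper itself), the following sketch fits a Lasso by coordinate descent over a grid of regularisation values and picks the value minimising the average out-of-bootstrap prediction error; all function names and the grid are illustrative assumptions.

```python
import numpy as np

def soft_threshold(z, g):
    """Soft-thresholding operator used in the Lasso coordinate-descent update."""
    return np.sign(z) * max(abs(z) - g, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Fit a Lasso by cyclic coordinate descent (columns of X roughly standardised)."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual with feature j's current contribution added back.
            r = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r / n
            beta[j] = soft_threshold(rho, lam) / (X[:, j] @ X[:, j] / n)
    return beta

def select_lambda_bootstrap(X, y, grid, B=50, rng=None):
    """Illustrative MSE-based bootstrap selection of the regularisation parameter:
    refit on each bootstrap resample and score on the out-of-bootstrap rows."""
    rng = np.random.default_rng(rng)
    n = len(y)
    errors = np.zeros(len(grid))
    for _ in range(B):
        idx = rng.integers(0, n, n)               # bootstrap resample (with replacement)
        oob = np.setdiff1d(np.arange(n), idx)     # rows left out of this resample
        if oob.size == 0:
            continue
        for k, lam in enumerate(grid):
            beta = lasso_cd(X[idx], y[idx], lam)
            errors[k] += np.mean((y[oob] - X[oob] @ beta) ** 2) / B
    return grid[int(np.argmin(errors))]
```

For example, on data generated as `y = 2*X[:, 0] + noise`, `select_lambda_bootstrap(X, y, [0.01, 0.1, 5.0])` would return one of the grid values; the paper's contribution is a consistent data-driven rule in place of this naive MSE criterion.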
Original language: English
Pages (from-to): 335-356
Number of pages: 22
Journal: Australian and New Zealand Journal of Statistics
Issue number: 3
Publication status: Published - 1 Sept 2016


Keywords

  • Lasso
  • likelihood
  • SCAD

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty


