Theoretically optimal parameter choices for support vector regression machines with noisy input

Shitong Wang, Jiagang Zhu, Fu Lai Korris Chung, Qing Lin, Dewen Hu

Research output: Journal article (peer-reviewed)

15 Citations (Scopus)


Within the evidence framework, the regularized linear regression model is interpreted in this paper as the corresponding MAP (maximum a posteriori) problem, and the general dependency relationships that the optimal parameters of this model should follow under noisy input are then derived. The support vector regression machines Huber-SVR and Norm-r r-SVR are two typical examples of this model, and particular attention is paid to their optimal parameter choices. It turns out that, in the presence of typical Gaussian input noise, the parameter μ in Huber-SVR depends linearly on the input noise, while the parameter r in the r-SVR is inversely proportional to the input noise. These theoretical results will be helpful for applying kernel-based regression techniques effectively in practical applications.
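The abstract's two scaling rules can be sketched in a few lines. The following is a minimal illustration, not the paper's derivation: the Huber loss is the standard robust loss (quadratic inside a threshold μ, linear outside), and the helper functions encode the claimed dependencies — μ proportional to the Gaussian noise level σ, and r inversely proportional to it. The proportionality constants `c1` and `c2` are hypothetical placeholders, not values from the paper.

```python
import math

def huber_loss(residual, mu):
    """Huber loss: quadratic for |residual| <= mu, linear beyond.

    This is the loss used by Huber-SVR; mu controls where the
    quadratic region ends and the outlier-robust linear region begins.
    """
    r = abs(residual)
    if r <= mu:
        return 0.5 * r * r
    return mu * (r - 0.5 * mu)

def choose_huber_mu(sigma, c1=1.0):
    """Illustrative rule: mu in Huber-SVR scales linearly with the
    input-noise standard deviation sigma (c1 is a placeholder)."""
    return c1 * sigma

def choose_norm_r(sigma, c2=1.0):
    """Illustrative rule: r in Norm-r r-SVR scales inversely with the
    input-noise standard deviation sigma (c2 is a placeholder)."""
    return c2 / sigma

# Example: doubling the noise level doubles mu and halves r.
print(choose_huber_mu(0.5), choose_norm_r(0.5))
print(choose_huber_mu(1.0), choose_norm_r(1.0))
```

The practical takeaway mirrored here is that if one can estimate the input-noise level, the loss-function parameter need not be tuned blindly by cross-validation but can be set from that estimate.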
Original language: English
Pages (from-to): 732-741
Number of pages: 10
Journal: Soft Computing
Issue number: 10
Publication status: Published - 1 Oct 2005


Keywords
  • Huber loss functions
  • Norm-r loss functions
  • Regularized linear regression
  • Support vectors

ASJC Scopus subject areas

  • Software
  • Theoretical Computer Science
  • Geometry and Topology
