Abstract
In this paper, the evidence framework is used to interpret the regularized linear regression model as a corresponding MAP (maximum a posteriori) problem, and the general dependency relationships that the optimal parameters of this model should follow in the presence of noisy input are then derived. The support vector regression machines Huber-SVR and norm-r r-SVR are two typical examples of this model, and particular attention is paid to their optimal parameter choices. It turns out that, for typical Gaussian input noise, the parameter μ in Huber-SVR depends linearly on the input noise, while the parameter r in r-SVR is inversely proportional to the input noise. These theoretical results should help practitioners apply kernel-based regression techniques effectively in practical applications.
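To make the abstract's claims concrete, the following is a minimal sketch of the Huber loss and the two noise-driven parameter rules it describes. The proportionality constants `c` are hypothetical placeholders; the paper's exact dependency formulas are not reproduced here.

```python
import numpy as np

def huber_loss(residual, mu):
    """Huber loss: quadratic for |residual| <= mu, linear beyond."""
    r = np.abs(residual)
    return np.where(r <= mu, 0.5 * r**2, mu * r - 0.5 * mu**2)

def choose_mu(sigma, c=1.0):
    # Hypothetical constant c: mu scales linearly with the
    # input-noise level sigma, as the abstract states.
    return c * sigma

def choose_r(sigma, c=1.0):
    # Hypothetical constant c: r scales inversely with the
    # input-noise level sigma, as the abstract states.
    return c / sigma
```

For example, with `mu = 1.0`, a residual of 0.5 falls in the quadratic region (loss 0.125), while a residual of 2.0 falls in the linear region (loss 1.5).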
Original language | English |
---|---|
Pages (from-to) | 732-741 |
Number of pages | 10 |
Journal | Soft Computing |
Volume | 9 |
Issue number | 10 |
DOIs | |
Publication status | Published - 1 Oct 2005 |
Keywords
- Huber loss functions
- Norm-r loss functions
- Regularized linear regression
- Support vectors
ASJC Scopus subject areas
- Software
- Theoretical Computer Science
- Geometry and Topology