Abstract
The problem of learning from data consisting of both function values and gradients is considered in the framework of least-squares regularized regression in reproducing kernel Hilbert spaces. The algorithm is implemented by solving a linear system whose coefficient matrix involves block matrices of the kind used to generate graph Laplacians and Hessians. The additional gradient data improve the learning performance of the algorithm. Error analysis is carried out by means of sampling operators for the sample error and integral operators in Sobolev spaces for the approximation error.
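The linear system described in the abstract can be sketched for the one-dimensional case. The sketch below is an illustration under stated assumptions, not the paper's exact algorithm: it assumes a Gaussian kernel, fits an expansion in the kernel and its second-argument derivative to value/gradient data, and solves the regularized system `(M + λI) c = [y; g]`, where `M` stacks the kernel Gram block alongside its gradient and Hessian blocks. All function names are illustrative.

```python
import numpy as np

def gaussian_kernel_blocks(x, s):
    """Block Gram matrix for the 1-D Gaussian kernel k(u, v) = exp(-(u-v)^2 / (2 s^2)).

    Returns the symmetric matrix M = [[K, K2], [K1, K12]], where
      K[i, j]   = k(x_i, x_j),
      K1[i, j]  = d/du k(u, x_j) at u = x_i   (gradient block),
      K2[i, j]  = d/dv k(x_i, v) at v = x_j   (= K1 transposed),
      K12[i, j] = d^2/(du dv) k               (Hessian block).
    """
    d = x[:, None] - x[None, :]
    K = np.exp(-d**2 / (2 * s**2))
    K1 = -(d / s**2) * K                      # derivative in the first argument
    K2 = (d / s**2) * K                       # derivative in the second argument
    K12 = (1.0 / s**2 - d**2 / s**4) * K      # mixed second derivative
    return np.block([[K, K2], [K1, K12]])

def hermite_fit(x, y, g, s=1.0, lam=1e-6):
    """Coefficients c = [a; b] minimizing ||M c - t||^2 + lam * c^T M c.

    Since M is symmetric, setting the gradient to zero gives
    M (M c - t + lam * c) = 0, solved by (M + lam * I) c = t with t = [y; g].
    """
    M = gaussian_kernel_blocks(x, s)
    t = np.concatenate([y, g])
    return np.linalg.solve(M + lam * np.eye(M.shape[0]), t)

def hermite_predict(c, x, xq, s=1.0):
    """Evaluate f(xq) = sum_j a_j k(xq, x_j) + sum_j b_j d/dv k(xq, x_j)."""
    n = len(x)
    a, b = c[:n], c[n:]
    d = xq[:, None] - x[None, :]
    K = np.exp(-d**2 / (2 * s**2))
    K2 = (d / s**2) * K
    return K @ a + K2 @ b
```

As a usage check, fitting `sin` from a handful of value/derivative pairs and querying between the sample points illustrates how the extra gradient data constrain the fit:

```python
x = np.linspace(0.0, 3.0, 8)
c = hermite_fit(x, np.sin(x), np.cos(x))
pred = hermite_predict(c, x, np.array([1.5]))  # should lie close to sin(1.5)
```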
Original language | English |
---|---|
Pages (from-to) | 3046-3059 |
Number of pages | 14 |
Journal | Journal of Computational and Applied Mathematics |
Volume | 233 |
Issue number | 11 |
DOIs | |
Publication status | Published - 1 Apr 2010 |
Externally published | Yes |
Keywords
- Hermite learning
- Integral operator
- Learning theory
- Representer theorem
- Reproducing kernel Hilbert spaces
- Sampling operator
ASJC Scopus subject areas
- Computational Mathematics
- Applied Mathematics