Abstract
We propose the use of coordinate kernel polynomials in kernel regression. This new approach, called coordinate kernel polynomial regression, can simultaneously identify active variables and effective interactive components. A reparametrization refinement is found to be critical for improving modeling accuracy and predictive power, and a post-training component selection step allows one to identify the effective interactive components. Generalization error bounds explain the effectiveness of the algorithm from a learning-theory perspective, and simulation studies demonstrate its empirical effectiveness.
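The abstract does not spell out the paper's exact construction, but the general idea of combining coordinate-wise (one-dimensional) kernels into a polynomial kernel, with per-coordinate weights that flag active variables, can be illustrated with standard kernel ridge regression. Everything below is a hypothetical sketch: the kernel form `(1 + sum_j theta_j * k(x_j, z_j))^degree`, the Gaussian base kernel, and all function names are assumptions for illustration, not the authors' method.

```python
import numpy as np

def coordinate_kernel(X, Z, theta, degree=2):
    # Hypothetical coordinate kernel polynomial: a polynomial of a
    # weighted sum of one-dimensional (coordinate-wise) base kernels.
    # The base kernel here is a Gaussian on each single coordinate;
    # the weight theta[j] scales coordinate j, so a small theta[j]
    # would mark variable j as inactive. This form is an assumption,
    # not the paper's exact construction.
    K = np.zeros((X.shape[0], Z.shape[0]))
    for j in range(X.shape[1]):
        diff = X[:, [j]] - Z[:, [j]].T          # pairwise differences in coord j
        K += theta[j] * np.exp(-diff ** 2)       # coordinate-wise Gaussian kernel
    return (1.0 + K) ** degree                   # polynomial combination

def kernel_ridge_fit(X, y, theta, lam=1e-2, degree=2):
    # Standard kernel ridge regression with the kernel above.
    K = coordinate_kernel(X, X, theta, degree)
    return np.linalg.solve(K + lam * np.eye(len(y)), y)

def kernel_ridge_predict(X_train, X_new, alpha, theta, degree=2):
    return coordinate_kernel(X_new, X_train, theta, degree) @ alpha

# Toy data: variables 0 and 1 are active (including an interaction),
# variable 2 is inert.
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 3))
y = np.sin(X[:, 0]) + X[:, 0] * X[:, 1]
theta = np.array([1.0, 1.0, 1.0])
alpha = kernel_ridge_fit(X, y, theta)
pred = kernel_ridge_predict(X, X, alpha, theta)
```

Because the polynomial expands into products of coordinate-wise features, each monomial corresponds to an interaction among a subset of variables; selecting among such components after training is the role the abstract assigns to post-training component selection.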
Original language | English |
---|---|
Pages (from-to) | 263-277 |
Number of pages | 15 |
Journal | Mathematical Foundations of Computing |
Volume | 3 |
Issue number | 4 |
DOIs | |
Publication status | Published - Nov 2020 |
Keywords
- coordinate kernel polynomial model
- generalization
- information criterion
- interactive component
- kernel method
- reparametrization
ASJC Scopus subject areas
- Artificial Intelligence
- Computational Theory and Mathematics
- Computational Mathematics
- Theoretical Computer Science