Penalized Partial Least Squares with Applications to B-Spline Transformations and Functional Data
Disclaimer: The code links provided for this paper are external. Science Nest takes no responsibility for the accuracy, legality, or content of these links. By downloading the code, you agree to comply with the terms of use set out by its author(s).
Authors | Nicole Krämer, Anne-Laure Boulesteix, Gerhard Tutz
Journal/Conference Name | Chemometrics and Intelligent Laboratory Systems |
Paper Category | Other |
Paper Abstract | We propose a novel framework that combines penalization techniques with Partial Least Squares (PLS). We focus on two important applications. (1) We combine PLS with a roughness penalty to estimate high-dimensional regression problems with functional predictors and scalar response. (2) Starting with an additive model, we expand each variable in terms of a generous number of B-spline basis functions. To prevent overfitting, we estimate the model by applying a penalized version of PLS. We gain additional model flexibility by incorporating a sparsity penalty. Both applications can be formulated in terms of a unified algorithm called Penalized Partial Least Squares, which can be computed virtually as fast as PLS using the kernel trick. Furthermore, we prove a close connection of penalized PLS to preconditioned linear systems. In experiments, we show the benefits of our method to noisy functional data and to sparse nonlinear regression models. |
Date of publication | 2008 |
Code Programming Language | R |
Comment |
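
The abstract describes penalized PLS as ordinary PLS with a penalty on the weight vectors, computable essentially as fast as PLS via the kernel trick. The following is a minimal, illustrative base-R sketch of that idea, not the authors' published R code: it runs a simplified NIPALS-style PLS in which each weight vector is premultiplied by M = (I + lambda * P)^(-1), with P a second-difference roughness penalty on B-spline coefficients. The function name `penalized_pls` and all parameter values are hypothetical.

```r
# Minimal penalized PLS sketch (illustrative only; see the authors' code for the
# kernel-trick implementation described in the abstract).
library(splines)  # bs() for the B-spline expansion; shipped with base R

penalized_pls <- function(X, y, P, lambda = 1, ncomp = 3) {
  X <- scale(X, center = TRUE, scale = FALSE)   # center predictors
  y <- y - mean(y)                              # center response
  p <- ncol(X)
  M <- solve(diag(p) + lambda * P)              # penalized transformation of the weights
  W <- matrix(0, p, ncomp)                      # weight vectors
  S <- matrix(0, nrow(X), ncomp)                # latent components (scores)
  Xd <- X
  for (k in seq_len(ncomp)) {
    w <- M %*% crossprod(Xd, y)                 # penalized covariance direction
    s <- Xd %*% w
    s <- s / sqrt(sum(s^2))                     # orthonormal score
    W[, k] <- w
    S[, k] <- s
    Xd <- Xd - s %*% crossprod(s, Xd)           # deflate X
  }
  q    <- crossprod(S, y)                       # regress y on the scores
  beta <- W %*% solve(crossprod(S, X %*% W), q) # standard PLS back-transform to predictor space
  list(beta = drop(beta), scores = S, xmeans = attr(X, "scaled:center"))
}

## Hypothetical example: one noisy nonlinear predictor, expanded in a generous
## B-spline basis and fitted with a second-difference roughness penalty.
set.seed(1)
n <- 100
x <- sort(runif(n))
y <- sin(2 * pi * x) + rnorm(n, sd = 0.2)
B <- bs(x, df = 30)                             # generous B-spline expansion
D <- diff(diag(ncol(B)), differences = 2)
P <- crossprod(D)                               # roughness penalty matrix
fit  <- penalized_pls(B, y, P, lambda = 10, ncomp = 3)
yhat <- mean(y) + sweep(B, 2, fit$xmeans) %*% fit$beta
```

Relative to ordinary PLS, the only change in this sketch is the premultiplication by M; per the abstract, the same computation can be carried out through the kernel trick, which keeps the cost comparable to standard PLS.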