Regression models for functional responses and scalar predictors are often fitted by means of basis functions, with quadratic roughness penalties applied to avoid overfitting. The fitting approach described by Ramsay and Silverman in the 1990s amounts to a penalized ordinary least squares (P-OLS) estimator of the coefficient functions. We recast this estimator as a generalized ridge regression estimator, and present a penalized generalized least squares (P-GLS) alternative. We describe algorithms by which both estimators can be implemented, with automatic selection of optimal smoothing parameters, in a more computationally efficient manner than has heretofore been available. We discuss pointwise confidence intervals for the coefficient functions, simultaneous inference by permutation tests, and model selection, including a novel notion of pointwise model selection. P-OLS and P-GLS are compared in a simulation study. Our methods are illustrated with an analysis of age effects in a functional magnetic resonance imaging data set, as well as a reanalysis of a now-classic Canadian weather data set. An R package implementing the methods is publicly available.
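The abstract's central idea, recasting the penalized OLS estimator of the coefficient functions as a generalized ridge regression, can be illustrated with a minimal sketch. The basis (Gaussian bumps), second-difference roughness penalty, and smoothing parameter below are illustrative assumptions, not the paper's actual implementation or its R package.

```python
import numpy as np

# Sketch of P-OLS for function-on-scalar regression as generalized ridge.
# Model: Y (n x T) ~ X (n x p) B (p x K) Theta' (K x T), with a quadratic
# roughness penalty on each row of B. All settings below are illustrative.

rng = np.random.default_rng(0)
n, T, K = 50, 100, 10                     # curves, grid points, basis functions
t = np.linspace(0, 1, T)

# Illustrative basis: Gaussian bumps at equally spaced knots (assumption;
# the paper uses spline bases)
knots = np.linspace(0, 1, K)
Theta = np.exp(-0.5 * ((t[:, None] - knots[None, :]) / 0.08) ** 2)  # T x K

# Scalar predictors: intercept plus one covariate
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])      # n x 2

# Simulated functional responses with known coefficient functions
beta0, beta1 = np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)
Y = np.outer(np.ones(n), beta0) + np.outer(x, beta1) \
    + 0.3 * rng.normal(size=(n, T))

# Generalized ridge form: with Z = kron(X, Theta) and penalty
# P = kron(I_p, D'D), the P-OLS solution is (Z'Z + lam * P)^{-1} Z' vec(Y).
# Kronecker identities avoid forming Z explicitly.
D = np.diff(np.eye(K), n=2, axis=0)       # second-difference penalty matrix
lam = 1.0                                  # smoothing parameter (illustrative)
A = np.kron(X.T @ X, Theta.T @ Theta) + lam * np.kron(np.eye(2), D.T @ D)
b = np.linalg.solve(A, (X.T @ Y @ Theta).reshape(-1))

B_hat = b.reshape(2, K)
beta_hat = B_hat @ Theta.T                # 2 x T estimated coefficient functions
```

The P-GLS variant discussed in the abstract would replace the identity error covariance implicit above with an estimated within-curve covariance; the Kronecker structure of the normal equations is what makes both estimators computationally cheap.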
KEYWORDS: cross-validation, functional data analysis, functional connectivity, functional linear model, smoothing parameters, varying-coefficient model

Author Notes: The authors are grateful to the referees for very helpful feedback; to Mike Milham, Eva Petkova, Thad Tarpey, and Simon Wood for highly informative discussions; and to Giles Hooker for advice on using the R package fda.

Funding Information: The first author's research is supported in part by National Science Foundation grant DMS-0907017.