Penalized Nonparametric Scalar-on-Function Regression via Principal Coordinates

Philip T. Reiss, David L. Miller, Pei Shien Wu, Wen Yu Hua

Research output: Contribution to journal › Article › peer-review


A number of classical approaches to nonparametric regression have recently been extended to the case of functional predictors. This article introduces a new method of this type, which extends intermediate-rank penalized smoothing to scalar-on-function regression. In the proposed method, which we call principal coordinate ridge regression, one regresses the response on leading principal coordinates defined by a relevant distance among the functional predictors, while applying a ridge penalty. Our publicly available implementation, based on generalized additive modeling software, allows for fast optimal tuning parameter selection and for extensions to multiple functional predictors, exponential family-valued responses, and mixed-effects models. In an application to signature verification data, principal coordinate ridge regression, with dynamic time warping distance used to define the principal coordinates, is shown to outperform a functional generalized linear model. Supplementary materials for this article are available online.
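To make the idea concrete, here is a minimal NumPy sketch of the principal coordinate ridge regression recipe the abstract describes: compute a distance matrix among the functional predictors, extract leading principal coordinates via classical multidimensional scaling, and ridge-regress the response on them. All variable names and the simulated data are illustrative assumptions, and plain Euclidean distance stands in for the dynamic time warping distance used in the paper; the paper's implementation additionally tunes the ridge penalty via generalized additive modeling software, which this sketch omits.

```python
import numpy as np

def principal_coordinates(D, k):
    """Classical MDS: leading k principal coordinates from a distance matrix D."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # double-centered Gram matrix
    vals, vecs = np.linalg.eigh(B)
    order = np.argsort(vals)[::-1][:k]       # largest eigenvalues first
    lam = np.clip(vals[order], 0.0, None)    # guard against small negatives
    return vecs[:, order] * np.sqrt(lam)     # n x k coordinate matrix

def ridge_fit(X, y, penalty):
    """Ridge-regression coefficients for a fixed penalty (intercept omitted for brevity)."""
    k = X.shape[1]
    return np.linalg.solve(X.T @ X + penalty * np.eye(k), X.T @ y)

# Simulated functional predictors: 40 curves observed on a 100-point grid
rng = np.random.default_rng(0)
curves = rng.standard_normal((40, 100))
y = curves[:, :10].mean(axis=1) + 0.1 * rng.standard_normal(40)

# Euclidean distance between curves stands in for any relevant distance (e.g. DTW)
D = np.sqrt(((curves[:, None, :] - curves[None, :, :]) ** 2).sum(axis=2))

X = principal_coordinates(D, k=5)   # regress on 5 leading principal coordinates
beta = ridge_fit(X, y, penalty=1.0)
yhat = X @ beta
```

In practice the number of coordinates and the ridge penalty would be chosen by criteria such as REML or GCV, as the authors' publicly available implementation does.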

Original language: English
Pages (from-to): 569-578
Number of pages: 10
Journal: Journal of Computational and Graphical Statistics
Issue number: 3
State: Published - 3 Jul 2017

Bibliographical note

Publisher Copyright:
© 2017 American Statistical Association, Institute of Mathematical Statistics, and Interface Foundation of North America.


Keywords

  • Dynamic time warping
  • Functional regression
  • Generalized additive model
  • Kernel ridge regression
  • Multidimensional scaling

ASJC Scopus subject areas

  • Discrete Mathematics and Combinatorics
  • Statistics and Probability
  • Statistics, Probability and Uncertainty

