Tensor product splines and functional principal components

Research output: Contribution to journal › Article › peer-review


Functional principal component analysis for sparse longitudinal data usually proceeds by first smoothing the covariance surface, and then obtaining an eigendecomposition of the associated covariance operator. Here we consider the use of penalized tensor product splines for the initial smoothing step. Drawing on a result regarding finite-rank symmetric integral operators, we derive an explicit spline representation of the estimated eigenfunctions, and show that the effect of penalization can be notably disparate for alternative approaches to tensor product smoothing. The latter phenomenon is illustrated with two data sets derived from magnetic resonance imaging of the human brain.
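The two-step procedure summarized in the abstract — smooth the covariance surface with a penalized tensor-product spline, then eigendecompose the resulting covariance operator — can be sketched as follows. This is a minimal illustration only: the simulated rank-two process, the smoothing level, and the use of SciPy's `RectBivariateSpline` as the tensor-product smoother are assumptions for demonstration, not the authors' implementation.

```python
import numpy as np
from scipy.interpolate import RectBivariateSpline

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 50)           # common observation grid
n = 200                             # number of curves

# Simulate a rank-2 functional process plus measurement noise
# (illustrative data, not from the paper).
phi1 = np.sqrt(2) * np.sin(2 * np.pi * t)
phi2 = np.sqrt(2) * np.cos(2 * np.pi * t)
scores = rng.normal(size=(n, 2)) * np.array([2.0, 1.0])
X = (scores[:, [0]] * phi1 + scores[:, [1]] * phi2
     + 0.3 * rng.normal(size=(n, t.size)))

# Step 1: raw sample covariance, smoothed by a tensor-product spline.
C_raw = np.cov(X, rowvar=False)
spline = RectBivariateSpline(t, t, C_raw, s=1.0)  # s: smoothing level (assumed)
C_hat = spline(t, t)
C_hat = (C_hat + C_hat.T) / 2       # enforce symmetry of the estimate

# Step 2: eigendecomposition of the discretized covariance operator.
h = t[1] - t[0]
vals, vecs = np.linalg.eigh(C_hat * h)
order = np.argsort(vals)[::-1]
eigvals = vals[order]               # estimated operator eigenvalues
eigfuns = vecs[:, order] / np.sqrt(h)  # approximately L2-normalized eigenfunctions
```

Dividing the eigenvectors by the square root of the grid spacing rescales them so that each estimated eigenfunction has (approximately) unit L2 norm on [0, 1]; with the variances above, the leading two eigenvalues should come out near 4 and 1.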

Original language: English
Pages (from-to): 1-12
Number of pages: 12
Journal: Journal of Statistical Planning and Inference
State: Published - Sep 2020

Bibliographical note

Publisher Copyright:
© 2019 The Authors


Keywords

  • Bivariate smoothing
  • Covariance function
  • Covariance operator
  • Eigenfunction
  • Roughness penalty

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty
  • Applied Mathematics

