Asymptotically minimax regret procedures in regression model selection and the magnitude of the dimension penalty

Alexander Goldenshluger, Eitan Greenshtein

Research output: Contribution to journal › Article › peer-review

Abstract

This paper addresses the topic of model selection in regression. We emphasize the case of two models, testing which model provides a better prediction based on n observations. Within a family of selection rules based on maximizing a penalized log-likelihood under a normal model, we search for asymptotically minimax rules over a class 𝒢 of possible joint distributions of the explanatory and response variables. For the class 𝒢 of multivariate normal joint distributions, it is shown that asymptotically minimax selection rules are close to the AIC selection rule when the models' dimension difference is large. It is further proved that under fairly mild assumptions on 𝒢, any asymptotically minimax sequence of procedures satisfies the condition that the difference in their dimension penalties is bounded as the number of observations approaches infinity. The results are then extended to the case of more than two competing models.
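The family of selection rules studied here compares models by a penalized Gaussian log-likelihood, with AIC corresponding to a penalty of one per parameter on the log-likelihood scale. As a minimal sketch (not the paper's actual procedure; the data, model matrices, and penalty value below are illustrative assumptions), choosing between two nested linear models might look like:

```python
import numpy as np

rng = np.random.default_rng(0)

def penalized_loglik(y, X, penalty):
    """Gaussian profile log-likelihood of an OLS fit, minus penalty * dimension."""
    n, d = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    sigma2 = resid @ resid / n  # MLE of the error variance
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    return loglik - penalty * d

# Simulated data: the response depends on x1 only; x2 is irrelevant noise.
n = 200
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 + rng.normal(size=n)

X_small = np.column_stack([np.ones(n), x1])      # dimension 2
X_big = np.column_stack([np.ones(n), x1, x2])    # dimension 3

# AIC-style rule: penalty of 1 per parameter on the log-likelihood scale.
choose_big = penalized_loglik(y, X_big, penalty=1.0) > penalized_loglik(y, X_small, penalty=1.0)
print("select larger model:", choose_big)
```

The choice of penalty is exactly what the paper's minimax analysis is about: with no penalty the larger nested model can never lose (its maximized likelihood is at least as high), while a large penalty forces the smaller model.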

Original language: English
Pages (from-to): 1620-1637
Number of pages: 18
Journal: Annals of Statistics
Volume: 28
Issue number: 6
State: Published - Dec 2000

Keywords

  • Minimax procedures
  • Model selection
  • Regression

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty
