Abstract
This paper addresses the topic of model selection in regression. We emphasize the case of two models, testing which model provides a better prediction based on n observations. Within a family of selection rules based on maximizing a penalized log-likelihood under a normal model, we search for asymptotically minimax rules over a class 𝒢 of possible joint distributions of the explanatory and response variables. For the class 𝒢 of multivariate normal joint distributions it is shown that asymptotically minimax selection rules are close to the AIC selection rule when the models' dimension difference is large. It is further proved that under fairly mild assumptions on 𝒢, any asymptotically minimax sequence of procedures satisfies the condition that the difference in their dimension penalties is bounded as the number of observations approaches infinity. The results are then extended to the case of more than two competing models.
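As a rough illustration of the family of selection rules described in the abstract, the sketch below compares two nested normal linear models by maximizing a penalized Gaussian log-likelihood; setting the per-dimension penalty to 1 gives an AIC-type rule. The function and parameter names (`penalized_loglik_select`, `penalty_per_dim`) and the simulated data are illustrative assumptions, not notation or results from the paper.

```python
import numpy as np

def penalized_loglik_select(y, X_small, X_big, penalty_per_dim=1.0):
    """Choose between two nested normal linear models by maximizing a
    penalized Gaussian log-likelihood.  penalty_per_dim = 1.0 mimics an
    AIC-style penalty; larger values are more conservative.
    (Illustrative sketch only; the paper's rules and notation differ.)"""
    n = len(y)

    def profile_loglik(X):
        # Maximized normal log-likelihood with sigma^2 profiled out:
        # -n/2 * (log(2*pi*sigma_hat^2) + 1), where sigma_hat^2 = RSS / n.
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = np.sum((y - X @ beta) ** 2)
        return -0.5 * n * (np.log(2 * np.pi * rss / n) + 1.0)

    scores = {
        "small": profile_loglik(X_small) - penalty_per_dim * X_small.shape[1],
        "big":   profile_loglik(X_big)   - penalty_per_dim * X_big.shape[1],
    }
    return max(scores, key=scores.get), scores

# Toy example: the larger model adds one irrelevant column.
rng = np.random.default_rng(0)
n = 200
X_small = np.column_stack([np.ones(n), rng.normal(size=n)])
X_big = np.column_stack([X_small, rng.normal(size=n)])
y = X_small @ np.array([1.0, 2.0]) + rng.normal(size=n)
print(penalized_loglik_select(y, X_small, X_big))
```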
| Original language | English |
|---|---|
| Pages (from-to) | 1620-1637 |
| Number of pages | 18 |
| Journal | Annals of Statistics |
| Volume | 28 |
| Issue number | 6 |
| DOIs | |
| State | Published - Dec 2000 |
Keywords
- Minimax procedures
- Model selection
- Regression
ASJC Scopus subject areas
- Statistics and Probability
- Statistics, Probability and Uncertainty