Abstract
Information criteria (IC) are widely used to choose between competing models. When these models have the same number of parameters, the choice simplifies to selecting the model with the largest maximized log-likelihood. By studying the problem of selecting either first-order autoregressive or first-order moving average disturbances in the linear regression model, we present clear evidence that a particular model can be unfairly favoured because of the shape or functional form of its log-likelihood. We also find that the presence of nuisance parameters can adversely affect the probabilities of correct selection. The use of Monte Carlo methods to find more appropriate penalties and the application of IC procedures to marginal likelihoods rather than conventional likelihoods are found to result in improved selection probabilities in small samples.
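As a minimal illustrative sketch (not part of the paper), the Python code below shows the kind of comparison described above for a simulated linear regression with first-order autoregressive disturbances, assuming the statsmodels implementation of regression with ARMA errors. The sample size, coefficients, and data-generating process are hypothetical, chosen only to illustrate that, with equal parameter counts, IC selection reduces to comparing maximized log-likelihoods.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.arima.model import ARIMA

# Simulate a linear regression y = 1 + 2*x + u with AR(1) disturbances
# (illustrative data only; parameters are hypothetical).
rng = np.random.default_rng(0)
n = 100
x = rng.normal(size=n)
u = np.zeros(n)
for t in range(1, n):
    u[t] = 0.5 * u[t - 1] + rng.normal()
y = 1.0 + 2.0 * x + u

# Fit the regression with AR(1) errors and with MA(1) errors.
# The constant is included in the exogenous regressors, so trend='n'.
X = sm.add_constant(x)
fit_ar1 = ARIMA(y, exog=X, order=(1, 0, 0), trend="n").fit()
fit_ma1 = ARIMA(y, exog=X, order=(0, 0, 1), trend="n").fit()

# Both candidate models have the same number of parameters, so the IC
# penalty terms cancel and selection reduces to the larger maximized
# log-likelihood (equivalently, the smaller AIC).
print("AR(1) log-likelihood:", fit_ar1.llf, " AIC:", fit_ar1.aic)
print("MA(1) log-likelihood:", fit_ma1.llf, " AIC:", fit_ma1.aic)
chosen = "AR(1)" if fit_ar1.llf > fit_ma1.llf else "MA(1)"
print("Selected disturbance model:", chosen)
```

In this sketch both fits use the same penalty, so the decision rests entirely on the shape of each model's likelihood at its maximum, which is precisely the source of the selection bias the paper investigates.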