Abstract

Generalized linear mixed models or multilevel regression models have become increasingly popular. Several methods have been proposed for estimating such models. However, to date there is no single method that can be assumed to work well in all circumstances in terms of both parameter recovery and computational efficiency. Stata’s xt commands for two-level generalized linear mixed models (e.g., xtlogit) employ Gauss–Hermite quadrature to evaluate and maximize the marginal log likelihood. The method generally works very well, and often better than common contenders such as MQL and PQL, but there are cases where quadrature performs poorly. Adaptive quadrature has been suggested to overcome these problems in the two-level case. We have recently implemented a multilevel version of this method in gllamm, a program that fits a large class of multilevel latent variable models including multilevel generalized linear mixed models. As far as we know, this is the first time that adaptive quadrature has been proposed for multilevel models. We show that adaptive quadrature works well in problems where ordinary quadrature fails. Furthermore, even when ordinary quadrature works, adaptive quadrature is often computationally more efficient since it requires fewer quadrature points to achieve the same precision.
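To illustrate the contrast the abstract draws, below is a minimal numerical sketch (not the gllamm or xtlogit implementation) of ordinary versus adaptive Gauss–Hermite quadrature for one cluster's marginal likelihood in a random-intercept logistic model. The model is assumed to have no covariates, and the data and variance parameter are made up for illustration; the adaptive rule recenters and rescales the nodes at the mode and curvature of the integrand, which is the idea behind adaptive quadrature needing fewer points.

```python
import numpy as np
from numpy.polynomial.hermite import hermgauss
from scipy.optimize import minimize_scalar

def cond_loglik(u, y):
    # log p(y | u) for a logistic cluster whose linear predictor is just the
    # random intercept u (a simplifying assumption; no fixed covariates).
    return np.sum(y * u - np.log1p(np.exp(u)))

def ordinary_ghq(y, tau, n):
    # Ordinary Gauss-Hermite: substitute u = sqrt(2)*tau*x so the N(0, tau^2)
    # weight matches the exp(-x^2) kernel of the Hermite rule.
    x, w = hermgauss(n)
    u = np.sqrt(2.0) * tau * x
    return np.sum(w / np.sqrt(np.pi) * np.exp([cond_loglik(ui, y) for ui in u]))

def adaptive_ghq(y, tau, n):
    # Adaptive Gauss-Hermite: center and scale the nodes at the mode and
    # curvature of the integrand h(u) = p(y | u) * N(u; 0, tau^2).
    def neg_log_h(u):
        return -(cond_loglik(u, y)
                 - 0.5 * (u / tau) ** 2
                 - np.log(tau * np.sqrt(2.0 * np.pi)))
    mu = minimize_scalar(neg_log_h).x          # posterior mode
    step = 1e-4                                # central-difference curvature
    d2 = (neg_log_h(mu + step) - 2 * neg_log_h(mu)
          + neg_log_h(mu - step)) / step ** 2
    sigma = 1.0 / np.sqrt(d2)                  # Laplace-style posterior sd
    x, w = hermgauss(n)
    nodes = mu + np.sqrt(2.0) * sigma * x
    # integral of h(u) du with nodes shifted/scaled to the posterior:
    return np.sqrt(2.0) * sigma * np.sum(
        w * np.exp(x ** 2) * np.exp([-neg_log_h(ui) for ui in nodes]))

if __name__ == "__main__":
    y = np.array([1.0, 1.0, 0.0, 1.0, 1.0, 0.0, 1.0, 1.0])  # toy cluster
    tau = 1.5
    print("ordinary, 60 nodes:", ordinary_ghq(y, tau, 60))
    print("adaptive,  7 nodes:", adaptive_ghq(y, tau, 7))
```

In this toy setup the adaptive rule with a handful of nodes agrees closely with a dense ordinary rule, mirroring the abstract's claim that adaptive quadrature often achieves the same precision with fewer quadrature points.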
