The Gauss–Markov theorem specifies the conditions under which the ordinary least squares (OLS) estimator is also the best linear unbiased (BLU) estimator. Because the Gauss–Markov theorem guarantees these BLU properties under general conditions that are often met in practice, ordinary least squares has become what George Stigler describes as the “automobile of modern statistical analysis.” Furthermore, many of the most important advances in regression analysis have been direct generalizations of ordinary least squares under the Gauss–Markov theorem to even more general conditions. For example, weighted least squares, generalized least squares, finite distributed lag models, first-differenced estimators, and fixed-effect panel models all extend the finite-sample results of the Gauss–Markov theorem to conditions beyond the classical linear regression model. After a brief discussion of ...
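As a minimal illustrative sketch (not taken from the article itself), the OLS estimator discussed above has the closed form β̂ = (X′X)⁻¹X′y; under the classical assumptions (linearity, exogeneity, homoskedastic and uncorrelated errors), the Gauss–Markov theorem says this estimator has the smallest variance among all linear unbiased estimators. The data-generating process below is hypothetical, chosen only to satisfy those assumptions:

```python
import numpy as np

# Simulated data satisfying the classical assumptions:
# homoskedastic, uncorrelated errors with mean zero.
rng = np.random.default_rng(0)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + one regressor
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + rng.normal(size=n)

# OLS closed form: beta_hat = (X'X)^{-1} X'y
# (solve is preferred over an explicit inverse for numerical stability)
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
print(beta_hat)
```

The same estimate can be obtained with `np.linalg.lstsq(X, y)`; the closed form is shown here only because it is the object the theorem is stated about.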