The Gauss–Markov theorem specifies the conditions under which the ordinary least squares (OLS) estimator is also the best linear unbiased (BLU) estimator. Because the Gauss–Markov theorem guarantees these BLU properties under general conditions that are frequently encountered in practice, ordinary least squares has become what Stephen Stigler describes as the “automobile of modern statistical analysis.” Furthermore, many of the most important advances in regression analysis have been direct generalizations of ordinary least squares under the Gauss–Markov theorem to even more general conditions. For example, weighted least squares, generalized least squares, finite distributed lag models, first-differenced estimators, and fixed-effects panel models all extend the finite-sample results of the Gauss–Markov theorem to conditions beyond the classical linear regression model. After a brief discussion of ...
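The theorem itself is not restated in this excerpt, so the following is only a minimal sketch, in standard textbook notation, of the setting and conclusion the abstract refers to; the symbols below are conventional and not drawn from the entry.

```latex
% Classical linear model (standard notation; a sketch, not the entry's own statement):
\[
  y = X\beta + \varepsilon, \qquad
  \mathbb{E}[\varepsilon \mid X] = 0, \qquad
  \operatorname{Var}(\varepsilon \mid X) = \sigma^2 I_n .
\]
% Under these conditions, the OLS estimator
\[
  \hat{\beta}_{\mathrm{OLS}} = (X^\top X)^{-1} X^\top y
\]
% is BLU: for any other linear unbiased estimator \tilde{\beta},
\[
  \operatorname{Var}(\tilde{\beta} \mid X) - \operatorname{Var}(\hat{\beta}_{\mathrm{OLS}} \mid X)
  \ \text{is positive semidefinite.}
\]
```

The extensions named in the abstract (weighted and generalized least squares, first-differencing, fixed effects) can be read as relaxing the homoskedastic, uncorrelated-error condition Var(ε | X) = σ²Iₙ while preserving an analogous best-linear-unbiased result.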
