Gauss–Markov Theorem
The Gauss–Markov theorem specifies the conditions under which the ordinary least squares (OLS) estimator is also the best linear unbiased (BLU) estimator. Because the theorem guarantees these BLU properties under general conditions that are often met in practice, ordinary least squares has become what George Stigler describes as the “automobile of modern statistical analysis.” Moreover, many of the most important advances in regression analysis are direct generalizations of ordinary least squares under the Gauss–Markov theorem to still more general conditions. For example, weighted least squares, generalized least squares, finite distributed lag models, first-differenced estimators, and fixed-effects panel models all extend the finite-sample results of the Gauss–Markov theorem to conditions beyond the classical linear regression model. After a brief discussion of ...
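For context, a sketch of the standard textbook formulation of the theorem is given below; the notation (y as the n×1 response, X as the n×k regressor matrix, β as the k×1 coefficient vector, ε as the n×1 error vector) is conventional and is not taken from the full entry. In the classical linear model,

\[
y = X\beta + \varepsilon, \qquad
E[\varepsilon \mid X] = 0, \qquad
\operatorname{Var}(\varepsilon \mid X) = \sigma^{2} I_{n}, \qquad
\operatorname{rank}(X) = k,
\]

the OLS estimator

\[
\hat{\beta}_{\mathrm{OLS}} = (X'X)^{-1} X'y
\]

is unbiased, and for any other linear unbiased estimator \( \tilde{\beta} = Ay \) the matrix \( \operatorname{Var}(\tilde{\beta} \mid X) - \operatorname{Var}(\hat{\beta}_{\mathrm{OLS}} \mid X) \) is positive semidefinite; that is, OLS is best linear unbiased. Violations of the homoskedasticity or no-serial-correlation part of \( \operatorname{Var}(\varepsilon \mid X) = \sigma^{2} I_{n} \) are what motivate the weighted and generalized least squares extensions mentioned above.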