Sums of Squares
In mathematics, a sum of squares can refer to the result of squaring each number in a set and then adding the squared values together (i.e., $\sum X^2$). In the statistical analysis of quantitative data, however, the sum of squared differences, or deviations (normally the difference between a score and the mean), is of particular interest. This formulation is usually referred to in research by the term sums of squares. This type of sum of squared values is extremely important in the analysis of the variability in data and in understanding the relationship between variables.
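As a minimal sketch of the standard definition (the notation and the numbers below are illustrative, not taken from the original entry), the sum of squared deviations for scores $X_1, X_2, \ldots, X_n$ with mean $\bar{X}$ is

$$SS = \sum_{i=1}^{n} \left( X_i - \bar{X} \right)^2 .$$

For example, for the hypothetical scores 2, 4, and 6, the mean is 4, so $SS = (2-4)^2 + (4-4)^2 + (6-4)^2 = 8$.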
If there is no variability in a set of numbers, then they are all the same. However, this is unlikely to be the case in any research. Indeed, researchers actively investigate changes in their data resulting ...