Sum of squares (SS) is shorthand for the “sum of squared deviations of values from their mean.” As such, the SS is a descriptive measure of variation and the central component in calculating the variance of numerical values. In general, variance (or mean squares) is defined as the SS divided by its degrees of freedom. In the descriptive case, that is, when only the variance of the given values is of interest (and no inferences about values outside the given data set are needed), the SS’s degrees of freedom equal the number of values. Descriptive variance is thus the SS divided by the number of values. Decomposing sum of squares ...
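The definitions above can be sketched as a short Python example (the data values here are purely illustrative):

```python
def sum_of_squares(values):
    """Sum of squared deviations of the values from their mean."""
    mean = sum(values) / len(values)
    return sum((x - mean) ** 2 for x in values)

def descriptive_variance(values):
    """Descriptive variance: SS divided by its degrees of freedom,
    which in the descriptive case is simply the number of values."""
    return sum_of_squares(values) / len(values)

data = [2, 4, 4, 4, 5, 5, 7, 9]  # mean is 5
print(sum_of_squares(data))        # 32.0
print(descriptive_variance(data))  # 4.0
```

Note that this is the descriptive (population) variance; when inferring the variance of a larger population from a sample, the degrees of freedom would instead be the number of values minus one.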