Decomposing Sums of Squares

The term sum of squares (SS) is shorthand for “sum of squared deviations of values from their mean.” As such, the SS is a descriptive measure of variation and a central component in calculating the variance of numerical values. In general, variance (or mean square) is defined as the SS divided by its degrees of freedom. In the descriptive case, that is, when only the variance of the given values is of interest (and no inferences about the variance of values outside the given data set are needed), the degrees of freedom are simply the number of values. Thus, the descriptive variance is the SS divided by the number of values. Decomposing sums of squares ...
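Stated in formula form, a minimal restatement of the definitions above (the symbols x_1, …, x_n for the observed values and x̄ for their mean are introduced here only for illustration):

```latex
% Sum of squares: squared deviations of the values from their mean
\[
  \mathrm{SS} \;=\; \sum_{i=1}^{n} \left( x_i - \bar{x} \right)^{2}
\]

% Variance (mean square): SS divided by its degrees of freedom df
\[
  \text{variance} \;=\; \frac{\mathrm{SS}}{df}
\]

% Descriptive case: the degrees of freedom equal the number of values n
\[
  \text{descriptive variance} \;=\; \frac{\mathrm{SS}}{n}
\]
```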
