In an increasingly data-driven world, it is more important than ever for students as well as professionals to better understand basic statistical concepts. 100 Questions (and Answers) About Statistics addresses the essential questions that students ask about statistics in a concise and accessible way. It is perfect for instructors, students, and practitioners as a supplement to more comprehensive materials, or as a desk reference with quick answers to the most frequently asked questions.

What Is the Standard Deviation, and How Is It Computed?

The standard deviation (represented by a lowercase s) is a measure of how much, on average, each score in a set of scores varies from the average (usually the mean) of that set of scores. One of several measures of variability, it is used to assess how much variability or diversity there is in any one set of scores.

It is computed by finding the average amount that each score deviates from the average of all the scores in the data set.

The formula is

$$s = \sqrt{\frac{\sum (X - \bar{X})^2}{n - 1}}$$

where $X$ is each individual score, $\bar{X}$ is the mean of all the scores, and $n$ is the number of scores.
Here’s an example using a very simple set of scores, which represent the number of correct words on a 20-item spelling test.

  • 16
  • 14
  • 10
  • 15
  • 14
  • 12
  • 19
  • 15
  • 8
  • 7

And here are the steps for computing the standard deviation:

  • List each score and compute the mean of all the scores.
  • Subtract the mean from each score to get the deviation.
  • Square each deviation.
  • Sum all the squared deviations.
  • Divide that sum by the number of scores minus 1.
  • Take the square root of the result, which is the standard deviation.
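The steps above can be sketched in a few lines of Python, using the ten spelling-test scores from the example (this is an illustrative sketch, not code from the book):

```python
import math

# The ten spelling-test scores from the example.
scores = [16, 14, 10, 15, 14, 12, 19, 15, 8, 7]

# Compute the mean of all the scores.
mean = sum(scores) / len(scores)

# Subtract the mean from each score and square the deviation.
squared_deviations = [(x - mean) ** 2 for x in scores]

# Sum the squared deviations and divide by n - 1.
variance = sum(squared_deviations) / (len(scores) - 1)

# Take the square root to get the standard deviation.
s = math.sqrt(variance)
print(round(s, 2))  # 3.74
```

Here the mean is 13.0, the sum of squared deviations is 126, and dividing by n − 1 = 9 gives 14.0, so the standard deviation is √14 ≈ 3.74. Python's `statistics.stdev` function computes the same quantity directly.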