
Standard Deviation (SD)

`SD = sqrt((sum_(i=1)^N (X_i - barX)^2)/N)`

The standard deviation (SD, or `sigma`) is the most common measure of variability in a population or data set. It quantifies the amount of variation or dispersion of a set of data values.

What does the standard deviation tell us?

A low standard deviation indicates that the data points tend to be close to the mean.  

A high standard deviation indicates that the data is spread over a wider range of values.
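As a quick illustration, here is a minimal sketch in Python using the standard library's `statistics` module (the data values are made up for illustration). Both lists have the same mean of 10, but very different spreads:

```python
import statistics

tight = [9, 10, 10, 11, 10]   # values clustered near the mean of 10
spread = [1, 5, 10, 15, 19]   # values far from the mean of 10

# pstdev computes the population standard deviation
print(statistics.pstdev(tight))   # ~0.632: points sit close to the mean
print(statistics.pstdev(spread))  # ~6.512: points are widely dispersed
```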

The Math of the Standard Deviation

The standard deviation of a data set or statistical population is the square root of its variance. The standard deviation, unlike the variance, is expressed in the same units as the data.

The standard deviation (`sigma`) is expressed here as the population SD:

`sigma = sqrt((sum_(i=1)^N (X_i - barX)^2)/N)`

This means the SD is calculated over the whole data set, i.e. the entire population.
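As a sketch, the formula above translates directly into Python. This is a hand-rolled version rather than a library call, so each step of the formula stays visible; the function name is my own:

```python
import math

def population_sd(values):
    """Population standard deviation: square root of the mean squared deviation."""
    n = len(values)
    mean = sum(values) / n                                # barX
    variance = sum((x - mean) ** 2 for x in values) / n   # sigma^2
    return math.sqrt(variance)                            # sigma

print(population_sd([1, 5, 10, 15, 19]))  # ~6.512
```

Note that for a sample rather than the full population, the sum of squared deviations is usually divided by N - 1 instead of N (Bessel's correction).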
