Definition

The standard deviation is one of the most common measures of spread, or variation, in data. Informally, it is the 'typical' distance between each value of a given variable and the mean of those values.


Computing the standard deviation

The formula for calculating the standard deviation of a given set of values is shown below. For a sample of n values x_1, x_2, ..., x_n with mean x̄, the sample standard deviation is

  s = sqrt( Σ(x_i - x̄)² / (n - 1) )

The quantity summed in the numerator is the sum of the squared deviations. The deviations are squared because the raw deviations from the mean always sum to zero.
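
As a minimal sketch of this calculation in Python (the values are made up, and the same result is available from the standard library's statistics.stdev):

  import math

  def sample_std(values):
      # Mean of the values
      n = len(values)
      mean = sum(values) / n
      # Sum of squared deviations, divided by n - 1, then square-rooted
      squared_deviations = [(x - mean) ** 2 for x in values]
      return math.sqrt(sum(squared_deviations) / (n - 1))

  data = [2, 4, 4, 4, 6]      # made-up example values
  print(sample_std(data))     # ~1.414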


Interpreting the standard deviation

When comparing the standard deviations of two or more sets of values, a smaller standard deviation indicates less variation, or spread, among the values, and a larger standard deviation indicates more.

Approximating the standard deviation 

For many datasets, the standard deviation is approximately a quarter of the range. This rule is only approximate: the standard deviation can be more than a quarter of the range for a distribution with longer tails or outliers.
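
As a quick numerical check of this rule of thumb, here is a sketch in Python with simulated data (the distribution and sample size are arbitrary):

  import random
  import statistics

  random.seed(42)
  # Made-up sample: 30 values drawn from a roughly normal distribution
  data = [random.gauss(100, 15) for _ in range(30)]

  actual_sd = statistics.stdev(data)
  approx_sd = (max(data) - min(data)) / 4   # range divided by four

  print(actual_sd)   # sample standard deviation
  print(approx_sd)   # for a sample of this size, typically close to actual_sd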


A more accurate method of approximating the standard deviation is the 70-95-100 rule of thumb, which states that:

  1. About 70% of observations lie within 1 standard deviation of the mean
  2. About 95% of observations lie within 2 standard deviations of the mean
  3. Nearly all observations lie within 3 standard deviations of the mean

This rule holds well for datasets with an approximately symmetric, bell-shaped distribution. For skewed distributions or datasets with outliers, it is less accurate.
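
The rule can be checked empirically. Below is a sketch in Python using simulated, approximately normal data (the sample size and parameters are arbitrary):

  import random
  import statistics

  random.seed(0)
  data = [random.gauss(0, 1) for _ in range(10000)]   # simulated symmetric data

  mean = statistics.mean(data)
  sd = statistics.stdev(data)

  for k in (1, 2, 3):
      within = sum(1 for x in data if abs(x - mean) <= k * sd)
      # Expect roughly 68%, 95%, and 99.7% -- close to the 70-95-100 rule
      print(f"within {k} sd: {within / len(data):.1%}")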

Example and Solution
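
As an illustration (the values here are made up), consider the five values 2, 4, 4, 4, 6.

  1. The mean is (2 + 4 + 4 + 4 + 6) / 5 = 4.
  2. The deviations from the mean are -2, 0, 0, 0, 2, and their squares are 4, 0, 0, 0, 4.
  3. The sum of the squared deviations is 8; dividing by n - 1 = 4 gives 2.
  4. The standard deviation is the square root: s = sqrt(2) ≈ 1.41.

So a 'typical' value in this dataset lies about 1.4 units from the mean of 4.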


Contextual Application

In the analysis of marine heat waves, using the standard deviation to define the threshold sea surface temperature for heat wave events is not preferred, because a standard-deviation threshold implicitly depends on the distribution of the anomalies, which may be skewed. Anomalies here are simply the deviations in temperature, that is, the distance between each observed value and the mean.
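
As a rough sketch of this idea in Python (the temperatures and the mean-plus-two-standard-deviations threshold below are invented for illustration and are not the definition used in marine heat wave studies):

  import statistics

  # Made-up daily sea surface temperatures (degrees Celsius)
  sst = [18.2, 18.5, 18.1, 19.0, 18.4, 21.3, 18.6, 18.3, 18.8, 18.5]

  mean_sst = statistics.mean(sst)
  anomalies = [t - mean_sst for t in sst]   # each day's deviation from the mean
  sd = statistics.stdev(sst)

  # A hypothetical threshold based on the standard deviation; how well it
  # flags events depends on how the anomalies are distributed
  threshold = mean_sst + 2 * sd
  heat_days = [t for t in sst if t > threshold]
  print(heat_days)   # here only the one unusually warm day exceeds the threshold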
