
Chebyshev's Inequality(2)

vCalc Reviewed
Last modified on Jul 24, 2020, 6:28:07 PM
Created on Jul 17, 2014, 2:19:59 PM
Pr(|X − μ| ≥ kσ) ≤ 1/k²
k: any real number multiple of the standard deviation
UUID
6f21962a-0dbd-11e4-b7aa-bc764e2038f2

In probability theory, Chebyshev's inequality (also spelled Tchebysheff's inequality) guarantees that in any probability distribution, nearly all values are close to the mean. The precise statement is that no more than 1/k² of the distribution's values can be more than k standard deviations away from the mean (or, equivalently, at least 1 − 1/k² of the distribution's values are within k standard deviations of the mean). In statistics, this rule about the range of standard deviations around the mean is often called Chebyshev's theorem. The inequality has great utility because it applies to completely arbitrary distributions (unknown except for their mean and variance); for example, it can be used to prove the weak law of large numbers.

P(|X − μ| ≥ kσ) ≤ 1/k²

So this calculation bounds the probability that a data value, X, falls farther from the mean than the input value, k, times the standard deviation. As we expect, the probability that a value lies farther from the mean than some multiple of the standard deviation decreases as that multiple increases. That is, the probability that a value falls more than six σ from the mean is lower than the probability of it falling more than one σ from the mean.
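As an illustrative sketch (not part of the vCalc calculator itself; the function name `chebyshev_bound` is my own), the bound above can be computed directly, and a quick simulation shows that the empirical tail probability of a sample distribution never exceeds it:

```python
import random

def chebyshev_bound(k: float) -> float:
    """Upper bound on P(|X - mu| >= k*sigma), valid for any
    distribution with finite mean and variance (requires k > 0)."""
    if k <= 0:
        raise ValueError("k must be positive")
    # A probability can never exceed 1, so cap the bound there.
    return min(1.0, 1.0 / (k * k))

# Sanity check against simulated standard normal data (mu=0, sigma=1):
# the observed tail fraction stays below the Chebyshev bound.
random.seed(0)
samples = [random.gauss(0, 1) for _ in range(100_000)]
for k in (1.5, 2, 3):
    empirical = sum(abs(x) >= k for x in samples) / len(samples)
    print(f"k={k}: empirical={empirical:.4f}  bound={chebyshev_bound(k):.4f}")
```

Note that the bound is loose for well-behaved distributions like the normal; its value is that it holds for every distribution with a finite mean and variance.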

Notes

There are two equally useful variants of the formula; this page presents the second, as indicated by the "(2)" in the equation's name. The variants differ in how the distance from the mean is expressed: one uses an absolute distance, while this one measures the distance in multiples k of the standard deviation.
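As a sketch of how the two variants relate (using a for the absolute distance in the first variant): substituting a = kσ into the absolute-distance form recovers the form used on this page.

```latex
% Variant (1): bound in terms of an absolute distance a from the mean
\Pr(|X - \mu| \ge a) \le \frac{\sigma^2}{a^2}

% Substituting a = k\sigma yields variant (2), the form on this page:
\Pr(|X - \mu| \ge k\sigma) \le \frac{\sigma^2}{(k\sigma)^2} = \frac{1}{k^2}
```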

