
Chebyshev's Inequality(1)

vCalc Reviewed
Last modified on Jul 24, 2020, 6:28:07 PM
Created on Jul 17, 2014, 2:17:26 PM
P = σ²/k²
k — Multiple of σ From Mean
σ — Standard Deviation
Tags
UUID
13fd6acb-0dbd-11e4-b7aa-bc764e2038f2

In probability theory, Chebyshev's inequality (also spelled Tchebysheff's inequality) guarantees that in any probability distribution, nearly all values are close to the mean. The precise statement is that no more than 1/k² of the distribution's values can be more than k standard deviations away from the mean (or equivalently, at least 1 − 1/k² of the distribution's values are within k standard deviations of the mean). In statistics, the rule is often called Chebyshev's theorem.

The inequality has great utility because it applies to completely arbitrary distributions (nothing need be known beyond the mean and variance); for example, it can be used to prove the weak law of large numbers.

The formula, P(|X − μ| ≥ k) ≤ σ²/k², shows that the ratio of the variance to k squared is always greater than or equal to the probability that a measurement differs from the mean by at least k.
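The bound can be checked empirically. The sketch below (an illustration, not vCalc's implementation) samples from an exponential distribution with rate 1, so μ = 1 and σ² = 1, and compares the observed fraction of samples at least k away from the mean against the Chebyshev bound σ²/k²:

```python
import random

# Empirically check Chebyshev's inequality P(|X - mu| >= k) <= sigma^2 / k^2
# on an exponential distribution with rate 1 (so mu = 1 and sigma^2 = 1).
random.seed(0)
mu, var = 1.0, 1.0
samples = [random.expovariate(1.0) for _ in range(100_000)]

for k in (1.5, 2.0, 3.0):
    empirical = sum(abs(x - mu) >= k for x in samples) / len(samples)
    bound = var / k**2
    print(f"k={k}: P(|X - mu| >= k) ≈ {empirical:.4f} <= bound {bound:.4f}")
```

The empirical probabilities come out well below the bound, which is typical: Chebyshev's inequality is loose for most distributions precisely because it must hold for all of them.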

Notes

There are two equally useful variants of the formula; this page gives the first, as indicated by the (1) in the equation's name.

The variants differ in how k is defined: here k is the absolute distance from the mean, giving the bound σ²/k², whereas in the other variant k counts multiples of the standard deviation, giving the bound 1/k².
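The two variants agree once k in the absolute-distance form is written as m·σ, where m is the number of standard deviations: σ²/(mσ)² = 1/m². A minimal sketch, with σ and m chosen as example values:

```python
# Sketch: the absolute-distance bound sigma^2 / k^2 with k = m * sigma
# reduces to the multiples-of-sigma bound 1 / m^2.
sigma = 2.5   # assumed example standard deviation
m = 2.0       # two standard deviations from the mean

bound_absolute = sigma**2 / (m * sigma)**2   # sigma^2 / k^2 with k = m * sigma
bound_multiples = 1.0 / m**2                 # 1 / m^2

print(bound_absolute, bound_multiples)  # → 0.25 0.25
```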

