In probability theory, Chebyshev's inequality (also spelled Tchebysheff's inequality) guarantees that in any probability distribution with a finite mean and variance, nearly all values are close to the mean. The precise statement is that no more than 1/k² of the distribution's values can be more than k standard deviations away from the mean (or equivalently, at least 1 − 1/k² of the distribution's values are within k standard deviations of the mean). In statistics, the result is often called Chebyshev's theorem, describing how much of a distribution must lie within a given number of standard deviations of the mean.
The inequality has great utility because it applies to completely arbitrary distributions (nothing need be known beyond the mean and variance); for example, it can be used to prove the weak law of large numbers.
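To sketch the connection to the weak law of large numbers: the sample mean of n independent draws has variance σ²/n, so Chebyshev's inequality gives P(|X̄ − μ| ≥ ε) ≤ σ²/(nε²), which tends to 0 as n grows. A minimal numerical illustration of this bound, using simulated fair-die rolls (an assumed example, not from the original text):

```python
import random

def mean_deviation_prob(n, eps, trials=2000):
    """Estimate P(|sample mean - mu| >= eps) over many trials of n fair-die rolls."""
    mu = 3.5  # mean of a fair die
    hits = 0
    for _ in range(trials):
        xbar = sum(random.randint(1, 6) for _ in range(n)) / n
        if abs(xbar - mu) >= eps:
            hits += 1
    return hits / trials

random.seed(42)
var = 35 / 12  # variance of a fair die
for n in (10, 100, 1000):
    p = mean_deviation_prob(n, eps=0.5)
    bound = var / (n * 0.5 ** 2)  # Chebyshev: sigma^2 / (n * eps^2)
    print(f"n={n}: empirical {p:.3f} <= bound {min(bound, 1.0):.3f}")
```

As n increases, the Chebyshev bound σ²/(nε²) shrinks toward 0, forcing the probability of a large deviation of the sample mean down with it.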
The formula, P(|X − μ| ≥ k) ≤ σ²/k², states that the probability that a measurement differs from the mean by at least k is bounded above by the ratio of the variance to k².
There are two equivalent forms of the inequality, differing in how k is interpreted. In the form above, k is an absolute distance from the mean. Substituting k = cσ gives the second form, P(|X − μ| ≥ cσ) ≤ 1/c², in which c counts standard deviations, as in the opening statement.
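Both forms can be checked numerically. Below is a sketch (an assumed example using Python's standard random module) that draws exponentially distributed samples, a deliberately skewed distribution, since the inequality uses nothing beyond the mean and variance:

```python
import random

random.seed(0)
xs = [random.expovariate(1.0) for _ in range(100_000)]
n = len(xs)
mu = sum(xs) / n
var = sum((x - mu) ** 2 for x in xs) / n
sigma = var ** 0.5

# Absolute-distance form: P(|X - mu| >= k) <= sigma^2 / k^2
k = 2.0
p_abs = sum(1 for x in xs if abs(x - mu) >= k) / n
print(f"P(|X - mu| >= {k}) = {p_abs:.4f} <= {var / k ** 2:.4f}")

# Standard-deviation form: P(|X - mu| >= c*sigma) <= 1 / c^2
c = 2.0
p_sd = sum(1 for x in xs if abs(x - mu) >= c * sigma) / n
print(f"P(|X - mu| >= {c}*sigma) = {p_sd:.4f} <= {1 / c ** 2:.4f}")
```

For the exponential distribution the true tail probability is far below the Chebyshev bound, which illustrates that the bound is loose in general but holds for any distribution.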