In probability theory, Chebyshev's inequality (also spelled Tchebysheff's inequality) guarantees that in any probability distribution, nearly all values are close to the mean. The precise statement is that no more than 1/k² of the distribution's values can be more than k standard deviations away from the mean (or equivalently, at least 1 − 1/k² of the distribution's values are within k standard deviations of the mean). In statistics, the rule is often called Chebyshev's theorem, concerning the range of standard deviations around the mean. The inequality has great utility because it can be applied to completely arbitrary distributions (unknown except for mean and variance); for example, it can be used to prove the weak law of large numbers.
So, this calculation tells you the probability that a data value x lies further from the mean than k standard deviations. As we would expect, this probability decreases as the multiple k increases: the probability that a value falls more than six standard deviations from the mean is lower than the probability that it falls more than one standard deviation from the mean.
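As a hedged illustration (the sample distribution and seed here are arbitrary choices, not from the original text), the sketch below draws a skewed sample and checks that the empirical tail probability never exceeds the Chebyshev bound 1/k²:

```python
import random
import statistics

# Illustrative sample: exponential with rate 1 (mean 1, standard deviation 1).
# Any distribution with a finite mean and variance would work here.
random.seed(42)
data = [random.expovariate(1.0) for _ in range(100_000)]

mu = statistics.fmean(data)
sigma = statistics.pstdev(data)

for k in (1, 2, 3, 6):
    # Fraction of values more than k standard deviations from the mean.
    tail = sum(abs(x - mu) > k * sigma for x in data) / len(data)
    bound = 1 / k**2  # Chebyshev's bound holds for any distribution
    print(f"k={k}: empirical tail = {tail:.4f}, Chebyshev bound = {bound:.4f}")
```

Note that for k = 1 the bound is trivial (probabilities never exceed 1); the inequality only becomes informative for k > 1, and it is deliberately loose, since it must hold for every possible distribution.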
There are two equally useful variants of the formula; they differ in how the distance from the mean is expressed. One states the bound in terms of a multiple k of the standard deviation, the other in terms of an absolute distance from the mean.
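Assuming the two variants referred to are the standard ones (this reading is an interpretation, since the original does not write them out), they can be stated for a random variable X with mean μ and standard deviation σ as:

```latex
% Variant 1: distance measured in standard deviations
P\bigl(\lvert X - \mu \rvert \ge k\sigma\bigr) \le \frac{1}{k^{2}}

% Variant 2: distance measured as an absolute value \varepsilon
P\bigl(\lvert X - \mu \rvert \ge \varepsilon\bigr) \le \frac{\sigma^{2}}{\varepsilon^{2}}
```

The two forms are equivalent: substituting ε = kσ into the second recovers the first.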