Tags | |
UUID | efce4d60-e1db-11e3-b7aa-bc764e2038f2 |
[Mathematics | Probability | Statistics | Distribution] Chebyshev's inequality (also known as Tchebysheff's inequality) states that in any distribution almost all of the values are close to the mean. More precisely, it states that no more than 1/k² of the distribution's values can be more than k standard deviations away from the mean.
The inequality has great utility because it applies to completely arbitrary distributions (nothing need be known beyond a finite mean and variance); for example, it can be used to prove the weak law of large numbers.
The formula, P(|X − μ| ≥ k) ≤ σ²/k², shows that the probability that a test measurement X differs from the mean μ by at least k is no greater than the ratio of the variance σ² to k squared.
Variables:
- X: a random variable (the test measurement)
- μ: the mean of the distribution
- σ²: the variance of the distribution
- k: a positive real number (the deviation threshold)
There are two variants of the formula that are equally valuable; the one above is the first. The difference is how k is used: in the first, k is an absolute deviation, P(|X − μ| ≥ k) ≤ σ²/k²; in the second, k counts standard deviations, P(|X − μ| ≥ kσ) ≤ 1/k².
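As a quick sanity check of the standard-deviation form of the bound, the following sketch (my own illustration, not from the note) samples from an exponential distribution and compares the empirical tail probability P(|X − μ| ≥ kσ) against 1/k² for a few values of k:

```python
import random
import statistics

# Draw a large sample from a skewed distribution (exponential, rate 1).
random.seed(42)
samples = [random.expovariate(1.0) for _ in range(100_000)]
mu = statistics.mean(samples)
sigma = statistics.stdev(samples)

for k in (1.5, 2, 3):
    # Empirical fraction of values at least k standard deviations from the mean.
    tail = sum(abs(x - mu) >= k * sigma for x in samples) / len(samples)
    bound = 1 / k**2
    print(f"k={k}: empirical tail {tail:.4f} <= Chebyshev bound {bound:.4f}")
```

The bound is typically far from tight (for the exponential the true tails decay exponentially), which reflects its generality: it must hold for every distribution with the given mean and variance.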
No comments |