The Weber Fraction calculator computes the Weber Fraction (k), the ratio of the just-noticeable difference (ΔI) to the intensity of a stimulus (I).
INSTRUCTIONS: Enter the following:
- (ΔI) Just-Noticeable Difference for Intensity
- (I) Base Level of Stimulus Intensity
Weber Fraction (k): The calculator returns the ratio as a real number. However, this can be automatically converted to a percent via the pull-down menu.
The Math / Science
The Weber Fraction expresses the observation that the ratio of the just-noticeable difference (JND) to the intensity of a stimulus remains roughly constant across a wide range of intensities. The formula for the Weber Fraction is:
k = ΔI / I
where:
- k = Weber Fraction
- ΔI = just-noticeable difference in stimulus intensity
- I = base level of stimulus intensity
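For readers who want to check the arithmetic, a minimal Python sketch of this calculation is shown below; the function name weber_fraction is just an illustrative placeholder, not the calculator's own code.

```python
def weber_fraction(delta_i, i):
    """Return the Weber Fraction k = ΔI / I for a given just-noticeable
    difference (delta_i) and base stimulus intensity (i)."""
    if i <= 0:
        raise ValueError("Base stimulus intensity must be positive.")
    return delta_i / i
```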
For example, imagine a weight of 100 kg. The JND for this weight is 5 kg, meaning that someone lifting the weight can tell that 105 kg is slightly heavier than 100 kg. The constant is found by dividing the JND by the intensity level of the stimulus, so 5/100 = 0.05.
This value of k = 0.05 can be used for other weights, as long as they are not at the extreme ends of the range. For example, doubling the weight from 100 kg to 200 kg should also double the JND, because the constant k remains the same: 0.05 * 200 = 10, so someone lifting the weight should be able to tell the difference between 210 kg and 200 kg.
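Using the weber_fraction sketch above, the worked example reads:

```python
k = weber_fraction(5, 100)    # 5 kg JND on a 100 kg weight -> k = 0.05
predicted_jnd = k * 200       # predicted JND for a 200 kg weight
print(k, predicted_jnd)       # 0.05 10.0
```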
The constant k should generally remain the same within a modality (light, sound, tactile pressure, etc.) but may differ across modalities. Humans should be able to reliably and consistently detect a change if it is proportionally equal to the Weber Fraction.
The Psychology and Statistics Calculator contains useful tools for psychology students. The psychology statistics functions include the following:
- Wilcoxon Signed Rank Test: Enter two sets, whether it is a one- or two-tailed test, and an alpha value to see the Wilcoxon statistic and the critical value.
- Bayes' Theorem for Disease Testing: Enter a base rate probability, the probability of false positives, and the probability of correct positives to see the ratio of people with the disease, the approximate numbers of false and true positives, and the theorem's percent likelihood of having the disease if tested positive.
- Chi-square Test: Enter a 3x2 matrix to see the expected values matrix with row and column totals, the degrees of freedom, and the chi-square value.
- Rescorla-Wagner Formula (alpha and beta version): Enter the salience of the conditioned stimulus, the rate associated with the unconditioned stimulus, the maximum conditioning possible for the unconditioned stimulus, and the total associative strength of all stimuli present to see the change in associative strength between the conditioned and unconditioned stimuli.
- Rescorla-Wagner Formula (k version): Enter the maximum conditioning possible for the unconditioned stimulus, the total associative strength of all stimuli present, the combined salience of the conditioned and unconditioned stimuli, and the number of trials to see the change in associative strength across the trials.
- Ricco's Law: Enter the area of the visually unresolved target and the constant for background luminance when the eyes are adapted to see the Ricco's Law factor.
- Ricco's Law (K variable): Enter the scotopic vision constant, the background luminance, and the photopic vision constant.
- Stevens' Power Law: Enter the proportionality constant, the magnitude of stimulation, and the exponent for the type of stimulation to see the magnitude of sensation (a short sketch of this law follows the list below).
- Weber Fraction: Enter the just-noticeable difference for intensity and the stimulus intensity to see the Weber Fraction.
- Weber-Fechner's Law: Enter the just-noticeable difference for intensity, the instantaneous stimulus, the stimulus intensity, and the threshold to see the factor.
- Random Integer: This provides a random number (integer) between a lower and upper bound.
- Observational Statistics (aka Simple Stats): Observational statistics on a set including: count, min, max, mean, median, mode, mid-point, range, population and sample variance and standard deviation, mean absolute deviation, standard deviation of mean, sum of values, sum of squared values, square of the sum, and the sorted set.
- Frequency Distribution: Frequency distribution of a set of observations in uniformly sized bins between a minimum and maximum.
- Least-squares Trend Line (aka Linear Regression): Linear regression line on a set of paired numbers, showing (r) the correlation coefficient, (n) the number of observations, (μX) the mean of the X values, (μY) the mean of the Y values, (ΣX) the sum of the X values, (ΣY) the sum of the Y values, (Σ(X⋅Y)) the sum of the X*Y product values, (ΣX²) the sum of the X² values, (ΣY²) the sum of the Y² values, (a) the y-intercept of the regression line, and (b) the slope of the regression line.
- Single-Sample t-test: Enter the t-test parameters, including the alpha level, the population mean, and whether it is one- or two-tailed, to see the degrees of freedom, the critical t-value, the t-score, and the standard error.
- Paired Sample t-test: Test two sets of values with an alpha level and whether it is one- or two-tailed to see the number of observations, the mean and standard deviation of both sets, the degrees of freedom, the critical t-value, the t-score, and the standard error.
- Effect Size (r-squared): Enter a t-test result and the degrees of freedom to see r².
- Effect Size (Cohen's d): Enter the means of two groups and the estimated standard deviation to see the effect size.
- Analysis of Variance (one way): ANOVA for numeric observations of three groups. Computes the F score, the numerator (between-groups) degrees of freedom, the denominator (within-groups) degrees of freedom, the mean of each group, the grand mean, the total sum of squares, the sums of squares within and between groups, and the variances within and between groups.
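As a companion to the Weber Fraction sketch above, here is a minimal Python sketch of the Stevens' Power Law entry. The formula S = k * I^a is the standard textbook form of the law; the function name stevens_power_law and the example values are illustrative assumptions, not the calculator's own implementation.

```python
def stevens_power_law(k, intensity, exponent):
    """Return the magnitude of sensation S = k * I**a, where k is the
    proportionality constant, I the magnitude of stimulation, and a the
    exponent associated with the type of stimulation."""
    return k * intensity ** exponent

# Illustrative values only: an exponent of roughly 0.33 is often cited for brightness.
print(stevens_power_law(1.0, 100.0, 0.33))  # ≈ 4.57
```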