The Rescorla-Wagner Formula (alpha and beta version) calculator computes the change, on a single trial, in the strength of the association between the conditioned stimulus and the unconditioned stimulus.
INSTRUCTIONS: Enter the following:
- (α) Salience of conditioned stimuli.
- (β) Rate parameter for unconditioned stimuli.
- (λ) Maximum conditioning possible for unconditioned stimuli.
- (V) Total associative strength of all stimuli present.
Change in Strength (ΔV): The calculator returns the change in strength (ΔV), on a single trial, of the association between the conditioned stimulus and the unconditioned stimulus.
Note: The parameters α and β are bounded by zero and one; they cannot be less than zero or greater than one.
The Math / Science
The Rescorla-Wagner formula is:
ΔV = α⋅β⋅(λ − V)
where:
- ΔV = Change in strength
- α = Salience for the Conditional Stimulus
- β = Rate parameter for the Unconditional Stimulus
- λ = Maximum possible associative strength (the most conditioning the unconditional stimulus can support)
- V = Current (total) associative strength of all stimuli present
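For example, with illustrative values α = 0.5, β = 0.4, λ = 1.0, and a current strength V = 0.2, the formula gives ΔV = 0.5⋅0.4⋅(1.0 − 0.2) = 0.16.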
The Rescorla–Wagner model describes the changing strength of the relationship between a conditional stimulus (CS) and an unconditional stimulus (US) over a series of trials on which the stimuli are paired. In the model, λ represents the actual occurrence of the US and V is the expectation based on the existing associations between the US and the stimuli that precede it. Consequently, the difference between λ and V represents the amount of surprise an organism presumably experiences when the US occurs on a particular trial. The constants α and β can vary from 0 to 1 and represent the salience of the CS and US, respectively. The product of these constants determines the rate at which learning can occur. ΔV is the amount of change in associative strength that occurs on a trial. This calculator reports the value of ΔV for a particular trial. Adding ΔV to the starting value of V gives the new value of V at the end of the trial.
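To make the trial-by-trial arithmetic concrete, here is a minimal Python sketch of the same update rule; the function name, bounds check, and numeric values are illustrative assumptions rather than part of the calculator itself.

```python
def rescorla_wagner_delta(alpha, beta, lam, v):
    """Change in associative strength on one trial: dV = alpha * beta * (lambda - V)."""
    # alpha and beta are bounded by zero and one, as noted above.
    if not (0.0 <= alpha <= 1.0 and 0.0 <= beta <= 1.0):
        raise ValueError("alpha and beta must be between 0 and 1")
    return alpha * beta * (lam - v)

# Illustrative values: salience 0.5, rate 0.4, asymptote 1.0, no prior association.
alpha, beta, lam = 0.5, 0.4, 1.0
v = 0.0
for trial in range(1, 6):
    delta_v = rescorla_wagner_delta(alpha, beta, lam, v)
    v += delta_v  # new V = old V + dV
    print(f"Trial {trial}: dV = {delta_v:.4f}, V = {v:.4f}")
```

Each pass through the loop adds ΔV to V, so V approaches λ and the per-trial change shrinks as the US becomes less surprising.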
Source
- Rescorla, R. A., & Wagner, A. R. (1972). A theory of Pavlovian conditioning: Variations in the effectiveness of reinforcement and nonreinforcement. In A. H. Black & W. F. Prokasy (Eds.), Classical conditioning II: Current research and theory (pp. 64-99). New York: Appleton-Century-Crofts.
See also
There is additional information presented on the Rescorla-Wagner Model HERE.
A simplified version of the Rescorla-Wagner Model is provided HERE.
The Psychology and Statistics Calculator contains useful tools for psychology students. The psychology statistics functions include the following:
- Wilcoxon Signed Rank Test: Enter two sets, whether it's a one- or two-tailed test, and an alpha value to see the Wilcoxon statistic and the critical value.
- Bayes' Theorem for Disease Testing: Enter a base rate probability, the probability of false positives, and the probability of correct positives to see a ratio of people with the disease, the approximate number of false and true positives, and the theorem's percent likelihood of having the disease if tested positive.
- Chi-square Test: Enter a 3x2 matrix to see the expected values matrix with row and column totals, degrees of freedom, and the chi-square value.
- Rescorla-Wagner Formula (alpha and beta version): Enter salience for conditional stimuli, rate of unconditional stimuli, maximum conditioning for unconditioned stimuli and the total associative strength of all stimuli present to see the change in strength between conditional and unconditional stimuli.
- Rescorla-Wagner Formula (k version): Enter the maximum conditioning possible for the unconditioned stimuli, the total associative strength of all stimuli present, the combined salience of the conditioned and unconditioned stimuli, and the number of trials to see the change in strength associated with the trials.
- Ricco's Law: Enter the area of the visually unresolved target and the constant for the background luminance to which the eyes are adapted to see the Ricco's Law factor.
- Ricco's Law (K variable): Enter the scotopic vision constant, background luminance, and photopic vision constant to see the constant K.
- Stevens' Power Law: Enter the proportionality constant, the magnitude of stimulation, and the type-of-stimulation exponent to see the magnitude of sensation.
- Weber Fraction: Enter the just-noticeable difference for intensity and the stimulus intensity to see the Weber fraction.
- Weber-Fechner Law: Enter the just-noticeable difference for intensity, the instantaneous stimulus, the stimulus intensity, and the threshold to see the factor.
- Random Integer: This provides a random number (integer) between a lower and upper bound.
- Observational Statistics (aka Simple Stats): Observational statistics on a set including: count, min, max, mean, median, mode, mid-point, range, population and sample variance and standard deviation, mean absolute deviation, standard deviation of mean, sum of values, sum of squared values, square of the sum, and the sorted set.
- Frequency Distribution: Frequency distribution of a set of observations in uniformly sized bins between a minimum and maximum.
- Least-squares Trend Line (aka Linear Regression): Linear regression line on a set of paired numbers and see (r) the correlation coefficient, (n) number of observations, (μX) mean of the X values, (μY) mean of the Y values, (ΣX) sum of the X values, (ΣY) sum of the Y values, (Σ(X⋅Y)) sum of the X⋅Y product values, (ΣX²) sum of X² values, (ΣY²) sum of Y² values, (a) y-intercept of the regression line, and (b) slope of the regression line.
- Single-Sample t-test: Enter t-test parameters including the alpha level, the population mean, and whether it's a one- or two-tailed test to see the degrees of freedom, critical t-value, t-score, and the standard error.
- Paired Sample t-test: Enter two sets of values with an alpha level and whether it's a one- or two-tailed test to see the number of observations, the mean and standard deviation for both sets, the degrees of freedom, critical t-value, t-score, and the standard error value.
- Effect Size (r-squared): Enter a t-test result and the degrees of freedom to see r².
- Effect Size (Cohen's d): Enter the means from two groups and the estimated standard deviation to see the effect size (both effect-size formulas are sketched in code after this list).
- Analysis of Variance (one way): ANOVA for numeric observations of three groups. Computes the F score, the numerator (between-groups) degrees of freedom, the denominator (within-groups) degrees of freedom, the mean of each group, the grand mean, the total sum of squares, the sums of squares within and between, and the variances within and between.
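As a small illustration of the two effect-size entries above, the sketch below uses the standard textbook formulas r² = t²/(t² + df) and d = (M₁ − M₂)/SD; the function names and sample numbers are assumptions for demonstration, not the site's own implementation.

```python
def r_squared_from_t(t, df):
    """Effect size r-squared from a t statistic and its degrees of freedom."""
    return t**2 / (t**2 + df)

def cohens_d(mean1, mean2, sd):
    """Cohen's d from two group means and an estimated standard deviation."""
    return (mean1 - mean2) / sd

# Illustrative numbers only.
print(round(r_squared_from_t(2.5, 18), 3))  # 0.258
print(round(cohens_d(10.0, 8.5, 2.0), 3))   # 0.75
```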