
What Is a Variable-Ratio Schedule?

[Image: Slot machines operate on a variable-ratio schedule of reinforcement. Image by Jacqueline Munoz]
Definition:

In operant conditioning, a variable-ratio schedule is a schedule of reinforcement in which a response is reinforced after an unpredictable number of responses. This schedule produces a high, steady rate of responding. Gambling and lottery games are good examples of rewards delivered on a variable-ratio schedule.
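To make the mechanism concrete, here is a minimal Python sketch (not from the original article) that simulates a hypothetical VR-4 schedule: each reinforcement arrives after an unpredictable number of responses that averages four. The function name, parameters, and the way the next requirement is drawn are illustrative assumptions, not a standard implementation.

```python
import random

def simulate_variable_ratio(mean_ratio=4, n_responses=1000, seed=0):
    """Simulate responding on a variable-ratio (VR) schedule.

    Reinforcement is delivered after an unpredictable number of
    responses; here the required count is drawn at random so that it
    averages `mean_ratio` (e.g. VR-4). Illustrative sketch only.
    """
    rng = random.Random(seed)
    reinforcements = 0
    # Responses still needed before the next reinforcement (unpredictable).
    needed = rng.randint(1, 2 * mean_ratio - 1)
    for _ in range(n_responses):
        needed -= 1
        if needed == 0:
            reinforcements += 1  # reward delivered
            # Draw a new, unpredictable requirement for the next reward.
            needed = rng.randint(1, 2 * mean_ratio - 1)
    return reinforcements

if __name__ == "__main__":
    wins = simulate_variable_ratio(mean_ratio=4, n_responses=1000)
    print(f"1000 responses on a VR-4 schedule -> roughly {wins} reinforcements")
```

Because the responder never knows how many more responses the current requirement demands, there is no "safe" point to pause, which is one way to see why this schedule sustains a high, steady response rate.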

Characteristics

  • Leads to a high, steady response rate
  • Results in only a brief pause after reinforcement

Examples

  • Slot machines: Players have no way of knowing how many times they will have to play before they win. All they know is that eventually a play will pay off. This is why slot machines are so effective and why players are often reluctant to quit: there is always the possibility that the next coin they put in will be the winning one.

  • Sales bonuses: Call centers often offer random bonuses to employees. Workers never know how many calls they need to make in order to receive the bonus, but they know that they increase their chances the more calls or sales they make.
