The purpose of using a variable ratio schedule of reinforcement is to produce a steady, high rate of responding.

In operant conditioning, a variable-ratio schedule is a partial schedule of reinforcement in which a response is reinforced after an unpredictable number of responses. This schedule creates a steady, high rate of response. Gambling and lottery games are good examples of a reward based on a variable-ratio schedule.

What Is a Variable-Ratio Schedule?

The American Psychological Association defines a variable-ratio schedule as "a type of intermittent reinforcement in which a response is reinforced after a variable number of responses."

Schedules of reinforcement play a central role in the operant conditioning process. The frequency with which a behavior is reinforced can help determine how quickly a response is learned as well as how strong the response might be. Each schedule of reinforcement has its own unique set of characteristics.

Illustration by Brianna Gilmartin, Verywell

Characteristics of Variable-Ratio Schedules 

There are three common characteristics of a variable-ratio schedule. They are:

  • Rewards are provided after an unpredictable number of responses: There is no predictability as to when a reward will be received. It might be after the first response or the fifth, or another number entirely.
  • Leads to a high, steady response rate: When the subject doesn't know when the reward will be given, they will continue to respond each time in the hopes that it will be the one response that results in a reward.
  • Results in only a brief pause after reinforcement: After the reinforcement is received in a variable-ratio schedule, there is just a minor pause in response. This is similar to a variable-interval schedule, in which the post-reinforcement pause is also brief.

How to Identify a Variable-Ratio Schedule

When identifying different schedules of reinforcement, it can be helpful to start with the name of the schedule itself. In the case of variable-ratio schedules, "variable" indicates that reinforcement is delivered after an unpredictable number of responses, while "ratio" indicates that reinforcement depends on the number of responses the subject makes rather than on the passage of time. Together, the terms mean that reinforcement is delivered after a varying number of responses.

It might also be helpful to contrast the variable-ratio schedule of reinforcement with the fixed-ratio schedule of reinforcement. In a fixed-ratio schedule, reinforcement is provided after a set number of responses.

For example, on a VR 5 schedule, an animal might receive a reward after every five responses, on average. One time the reward might come after three responses, the next time after seven, then after five, and so on. The schedule averages out to one reward for every five responses, but the actual delivery remains unpredictable.

In a fixed-ratio schedule, on the other hand, the reinforcement schedule might be set at FR 5. This means that a reward is presented after every five responses. Where the variable-ratio schedule is unpredictable, the fixed-ratio schedule is set and predictable.
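To make the arithmetic of the two schedules concrete, here is a minimal Python sketch, added for this comparison rather than taken from the original article. The function names and the choice of a uniform 1-to-9 response requirement for VR 5 are illustrative assumptions; any distribution that averages out to five responses per reward would behave the same way.

```python
import random

def fixed_ratio_schedule(ratio, num_responses):
    """Mark which responses (1-indexed) earn a reward on a fixed-ratio schedule."""
    # On FR 5, every 5th response is reinforced: responses 5, 10, 15, ...
    return [n % ratio == 0 for n in range(1, num_responses + 1)]

def variable_ratio_schedule(mean_ratio, num_responses, rng=random):
    """Mark which responses earn a reward on a variable-ratio schedule.

    The number of responses required for each reward is drawn at random
    (here uniformly from 1 to 2 * mean_ratio - 1, an assumed distribution),
    so it stays unpredictable while averaging out to mean_ratio.
    """
    rewarded = []
    required = rng.randint(1, 2 * mean_ratio - 1)  # responses needed for the next reward
    count = 0
    for _ in range(num_responses):
        count += 1
        if count == required:
            rewarded.append(True)
            count = 0
            required = rng.randint(1, 2 * mean_ratio - 1)  # pick the next unpredictable requirement
        else:
            rewarded.append(False)
    return rewarded

# Example: compare 30 responses under FR 5 vs. VR 5.
fr = fixed_ratio_schedule(5, 30)
vr = variable_ratio_schedule(5, 30)
print("FR 5 rewarded responses:", [i + 1 for i, r in enumerate(fr) if r])  # always 5, 10, 15, ...
print("VR 5 rewarded responses:", [i + 1 for i, r in enumerate(vr) if r])  # varies run to run, ~5 apart on average
```

Running the sketch a few times shows that the FR 5 payoffs always land on responses 5, 10, 15, and so on, while the VR 5 payoffs shift from run to run even though they average about five responses apart.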

Variable-Ratio Schedule

  • Reinforcement provided after a varying number of responses

  • Delivery schedule unpredictable

  • Examples include slot machines, door-to-door sales, video games

Fixed-Ratio Schedule

  • Reinforcement provided after a set number of responses

  • Delivery schedule predictable

  • Examples include production line work, grade card rewards, sales commissions

Variable-Ratio Schedule Examples

What does variable-ratio reinforcement look like in a real-world setting? Here are a few examples to consider.

  • Classroom learning: A variable-ratio schedule can be used in the classroom to help students learn. Because students won't know exactly when they will be rewarded for doing their homework, for instance, they may be more inclined to turn in all of the required assignments.
  • Slot machines: Players have no way of knowing how many times they must play before they win. All they know is that, eventually, a play will win. This is why slot machines are so effective and players are often reluctant to quit. There is always the possibility that the next coin they put in will be the winning one.
  • Social media: A variable-ratio schedule shows up in social media in two ways. First, when you open your social media accounts, you never know whether you'll find any notifications, comments, or likes, yet you keep going back to check whether anything has shown up. Along similar lines, you never know what is going to appear in your news feed, but you keep scrolling to find posts you like.
  • Sales bonuses: Call centers often offer random bonuses to employees. Workers never know how many calls they need to make to receive the bonus, but they know that every additional call or sale increases their chances.
  • Door-to-door sales: In this variable-ratio example, the salesperson travels from house to house but never knows when they will find an interested buyer. It could be the next house, or it might take multiple stops to find a new customer.
  • Video games: In some games, players collect tokens or other items in order to receive a reward or reach the next level. The player may not know how many tokens they need to receive a reward or even what that reward will be.


By Kendra Cherry
Kendra Cherry, MS, is an author and educational consultant focused on helping students learn about psychology.



What is the main purpose of a schedule of reinforcement?

Schedules of reinforcement are the rules that control the timing and frequency of reinforcer delivery, with the goal of increasing the likelihood that a target behavior will occur again, grow stronger, or continue.


What is a variable-interval schedule?

Variable Interval (VI) Schedule. Interval schedules involve reinforcing a target behavior after an interval of time has passed rather than after a number of responses. In a variable-interval schedule, the length of the interval is not always the same but centers around some average.
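For contrast with the ratio sketch earlier, here is a similarly hedged Python illustration of a variable-interval schedule; the function name and the uniform distribution of waiting times are assumptions made for the example. It shows reward availability depending on unpredictable amounts of elapsed time that average out to a target interval, rather than on the number of responses made.

```python
import random

def variable_interval_reward_times(mean_interval_s, session_length_s, rng=random):
    """Illustrative sketch: times at which a reward becomes available on a VI schedule.

    Each waiting interval is drawn at random (here uniformly between 0 and
    2 * mean_interval_s, an assumed distribution), so intervals vary but
    average out to mean_interval_s. The first response made after a reward
    becomes available would be the one that is reinforced.
    """
    reward_times = []
    t = 0.0
    while True:
        t += rng.uniform(0, 2 * mean_interval_s)  # unpredictable wait, averaging mean_interval_s
        if t > session_length_s:
            break
        reward_times.append(round(t, 1))
    return reward_times

# Example: a VI 30-second schedule over a 5-minute session.
print(variable_interval_reward_times(30, 300))  # e.g. [12.4, 55.0, 81.3, ...] -- varies each run
```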