365 Ways To Persuade And Motivate: #7 Give Rewards Unpredictably To Sustain A Behavior

In the last blog post I wrote about using a continuous reinforcement schedule when you want to establish a new behavior. And I hinted that you should change that schedule after the behavior is established.

One of the reward “schedules” that B.F. Skinner researched is called a variable ratio schedule. It’s called “variable” because you don’t reward the behavior every time; you vary how often the person gets a reward for doing the target behavior. And it’s called “ratio” because the reward is based on the number of times the person has done the behavior, rather than on elapsed time (for example, giving a reward the first time the person does the behavior after 5 minutes have passed).

In a variable ratio schedule you might decide to reward the behavior, on average, every 5 times the person does it, but you vary it: sometimes you give the reward the 3rd time they do the behavior, sometimes the 7th time, sometimes the 2nd time, and so on. It averages out to every 5 times.
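If it helps to see that arithmetic spelled out, here is a minimal sketch in Python of one way you might pick which occurrences of a behavior get rewarded on a variable ratio schedule averaging 5. The function name, the uniform 1-to-9 draw, and the numbers are illustrative assumptions for this post, not Skinner’s procedure or anything from the research.

```python
import random

def variable_ratio_schedule(num_behaviors, average_ratio=5, seed=None):
    """Decide which occurrences of a behavior get rewarded.

    After each reward, the number of behaviors required before the next
    reward is drawn at random (here, uniformly between 1 and
    2 * average_ratio - 1), so any single reward is unpredictable,
    but rewards average out to roughly one per `average_ratio` behaviors.
    """
    rng = random.Random(seed)
    rewarded_occurrences = []
    next_reward_in = rng.randint(1, 2 * average_ratio - 1)
    for occurrence in range(1, num_behaviors + 1):
        next_reward_in -= 1
        if next_reward_in == 0:
            rewarded_occurrences.append(occurrence)
            next_reward_in = rng.randint(1, 2 * average_ratio - 1)
    return rewarded_occurrences

if __name__ == "__main__":
    # Which of the first 30 on-time expense reports would earn a reward?
    print(variable_ratio_schedule(30, average_ratio=5, seed=1))
```

Run it a few times with different seeds and you’ll see the point: the gap between one reward and the next keeps changing, but over enough behaviors the rewards settle out to about one in every 5.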

Let’s take the example of trying to get your employee to turn in expense reports on time. At first you would reward them every time they turn in the expense report on time (as we discussed in the previous blog post on continuous reinforcement).

Once the behavior is established, however, you would switch to rewarding them only every 3 or 5 or 7 times, on average. This is the variable ratio schedule.

Skinner found that variable ratio schedules have two benefits:

a) they result in more instances of the behavior than any of the other schedules (i.e., people will keep handing in the expense report on time), and

b) they result in behaviors that Skinner said were “hard to extinguish”, which is psychology-speak for the idea that the behavior persists over time, even when rewards are no longer being given.

If you want to see another example of a variable ratio schedule, go to a casino. Slot machines are a very effective example of a variable ratio schedule. Casinos have studied the science of rewards, and they use it to get people to play and keep playing.

Can you think of any more variable ratio schedule examples that you’ve experienced or tried?

To learn more, check out our 1-day seminar on The Science of Persuasion.
