If you studied psychology years ago, you may remember B.F. Skinner and his 20th-century work on operant conditioning. Skinner studied whether a behavior increased or decreased based on how often, and in what manner, you provide a “reinforcement” (reward).
What the casinos know — Let’s say you put a rat in a cage with a bar. If the rat presses the bar, he gets a food pellet. The food pellet is called the reinforcement. But what if you set it up so that the rat does not get a food pellet every time he presses the bar? Skinner tested various scenarios and found that how often you give the food pellet, and whether you give it based on time or on bar presses, affected how often the rat would press the bar. Here’s a synopsis of the different schedules:
Interval Schedules – You provide a food pellet after a certain interval of time has passed, for example, 5 minutes. The first time the rat presses the bar after the 5 minutes are up, he gets a food pellet.
Ratio Schedules – Instead of basing the reinforcement on time, you base it on the number of bar presses. For example, you provide a food pellet after every 10 bar presses.
There’s another twist — Each of the above can be fixed or variable. If it’s a fixed schedule, you keep the same interval or ratio, for example, every 5 minutes or every 10 presses. If it’s variable, you vary the time or the number of presses, but it averages out: sometimes you provide the reinforcement after 2 minutes, sometimes after 8 minutes, but it averages out to 5 minutes.
So altogether there are four possible schedules:
- Fixed Interval – Reinforcement is based on time, and the interval is always the same.
- Variable Interval – Reinforcement is based on time; the amount of time varies, but it averages to a particular interval.
- Fixed Ratio – Reinforcement is based on the number of bar presses, and the number is always the same.
- Variable Ratio – Reinforcement is based on the number of bar presses; the number varies, but it averages to a particular ratio.
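If you like seeing the mechanics spelled out, the four schedules can be sketched as a few small functions. This is a minimal illustration, not anything from Skinner’s actual lab protocols; the function names and the 5-minute (300-second) and 10-press numbers are just the examples from above:

```python
import random

def fixed_interval(press_times, interval=300):
    """Reward the first press after each `interval` seconds (e.g. 5 minutes)."""
    rewards, next_available = [], interval
    for t in press_times:  # press_times: sorted timestamps in seconds
        if t >= next_available:
            rewards.append(t)
            next_available = t + interval
    return rewards

def variable_interval(press_times, mean_interval=300):
    """Same idea, but each waiting period is random, averaging `mean_interval`."""
    rewards, next_available = [], random.uniform(0, 2 * mean_interval)
    for t in press_times:
        if t >= next_available:
            rewards.append(t)
            next_available = t + random.uniform(0, 2 * mean_interval)
    return rewards

def fixed_ratio(num_presses, ratio=10):
    """Reward every `ratio`-th press (e.g. every 10 presses)."""
    return [p for p in range(1, num_presses + 1) if p % ratio == 0]

def variable_ratio(num_presses, mean_ratio=10):
    """Reward after a random number of presses that averages `mean_ratio`."""
    rewards, target = [], random.randint(1, 2 * mean_ratio - 1)
    for p in range(1, num_presses + 1):
        if p >= target:
            rewards.append(p)
            target = p + random.randint(1, 2 * mean_ratio - 1)
    return rewards
```

Running `fixed_ratio(100)` rewards presses 10, 20, 30, and so on, perfectly predictably; `variable_ratio(100)` hands out about the same number of rewards, but at presses the rat can never anticipate.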
It turns out that rats (and people too) will behave in predictable ways based on which schedule you are using.
You can predict — how often someone will engage in a certain behavior based on the way they are being reinforced or rewarded. If you want someone to engage in a behavior as often as possible, you would use a variable ratio schedule.
If you’ve ever been to Las Vegas — then chances are you’ve seen a variable ratio schedule in operation. You put your money in the slot machine and press the button. You don’t know how often you’ll win. It’s not based on time; it’s based on the number of times you play. And it’s not fixed; it’s a variable schedule. It’s not predictable. You aren’t sure when you’re going to win, but you know your odds of having won increase the more times you play. So it results in you playing the most, and the casino making the most money.
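As a rough sketch (the 1-in-10 win odds are made up for illustration, not real casino numbers), a slot machine can approximate a variable ratio schedule by giving every play the same small, independent chance of paying out:

```python
import random

def play_slot_machine(num_plays, win_probability=0.1, seed=42):
    """Each play wins independently with probability `win_probability`,
    so wins arrive unpredictably but average about 1 in 10 plays."""
    rng = random.Random(seed)
    return [p for p in range(1, num_plays + 1) if rng.random() < win_probability]

wins = play_slot_machine(10_000)
print(len(wins))  # roughly 1,000 wins out of 10,000 plays
```

The player can never tell which press will pay, yet the long-run payout rate is exactly what the house set — which is precisely why this schedule keeps people pressing the button.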
What do you think? How have you used these ideas of operant conditioning (whether you knew what to call them or not)?