100 Things You Should Know About People: #75 — The More Difficult Something Is To Attain, The More People Like It

[Image: man climbing up a rock face. Photo by T. Voekler]

You’ve heard about fraternities with difficult initiation rituals. The idea is that if an organization is hard to get into, then the people who make it in like it even more than if entry had been easy.

More difficult = more liking — The first research on this initiation effect was done by Elliot Aronson at Stanford University in 1959. Aronson set up three initiation scenarios (severe, medium, and mild, although the severe one was not really that severe) and randomly assigned people to the conditions. He did indeed find that the more difficult the initiation, the more people liked the group.

Cognitive dissonance theory — Leon Festinger was the social psychologist who developed cognitive dissonance theory, and Aronson used it to explain why people like groups they had to endure hardship to join. People go through a painful experience only to find themselves part of a group that is not all that exciting or interesting. That sets up a conflict (dissonance) in their thinking: if the group is boring and uninteresting, why did I submit myself to pain and hardship? To reduce the dissonance, you decide that the group really is important and worthwhile. Then it makes sense that you were willing to go through all of that pain.

Scarcity and exclusivity — In addition to cognitive dissonance, I think scarcity also comes into play. If the group is difficult to join, then not very many people can get in, which makes membership scarce and exclusive. I might not make it in, and then I would lose out. So if I did go through a lot of pain to join, the group must be good.

What do you think? Do you find you like things better if they are difficult to attain? Does this mean we should design products that are hard to use so that people will decide in the end it was worth it? (I hope not!)

And for those of you who like to read OLD research:

Aronson, E., & Mills, J. (1959). The effect of severity of initiation on liking for a group. Journal of Abnormal and Social Psychology, 59, 177–181.

Festinger, L., Riecken, H. W., & Schachter, S. (1956). When prophecy fails. Minneapolis: University of Minnesota Press.

100 Things You Should Know About People: #46 — The more uncertain you are, the more you dig in and defend your ideas

[Image: Mac vs. PC ad]

I’m one of those staunch Apple converts. From the time PCs first existed, I was a Windows/PC person. (Realize that I go all the way back to when PCs first came out. I used to sell a marvelous “portable” PC that ran the CP/M operating system and had TWO, count ’em, TWO 360 KB (yes, I said KB) “floppy” disk drives, which is to say NO hard drive.) I was a PC person, NOT an Apple person. Apples were for teachers and then later for artsy people. That was not me.

Fast forward to today: I’ll be talking on my iPhone, charging my iPod nano for my afternoon exercise, and transferring a movie from my MacBook Pro to my iPad. What the heck happened here?! (But that’s another story altogether.)

Don’t show me the Android phone — So you might be able to guess what happened when I went to dinner with a colleague who was showing me his Android phone. He loves his new Android phone and wanted to show me all the ways it was as good as, or better than, my iPhone. I was totally uninterested in hearing about it. I didn’t even want to look at it. Basically, I didn’t want to allow into my brain any information suggesting that anything besides an iPhone was even a possibility. I was showing classic symptoms of cognitive dissonance denial.

Alter your beliefs or deny the information? — In 1956 Leon Festinger (with Henry Riecken and Stanley Schachter) wrote a book called When Prophecy Fails. In it he describes the idea of cognitive dissonance. Cognitive dissonance is the uncomfortable feeling we get when we hold two ideas that conflict with each other. We don’t like the feeling, so we try to get rid of the dissonance. There are two main ways to do that: change our belief, or deny one of the ideas.

When forced, you’ll change your belief — In the original research on cognitive dissonance, people were forced to defend an opinion that they did not believe in. The result was that people tended to change their beliefs to fit the position they had argued.

Watching cognitive dissonance via an fMRI scan — In newer research by van Veen and colleagues, researchers had people “argue” that the fMRI scan experience was pleasant (it’s not). When “forced” to make statements that the experience was pleasant, certain parts of the brain would light up (the dorsal anterior cingulate cortex and the anterior insular cortex). The more these regions were activated, the more the participants would later claim that they really did think the fMRI was pleasant.

When not forced, you’ll dig in — But there’s another reaction that sometimes occurs. If you are not forced to espouse a belief you don’t hold, but are simply presented with information that opposes your existing beliefs, then the tendency is to deny the new information rather than change your belief to fit it.

When made to feel uncertain, you will argue harder — Gal and Rucker recently conducted research in which they used framing techniques to make people feel uncertain. (For example, they told one group to remember a time when they were full of certainty, and the other group to remember a time when they were full of doubt.) They then asked the participants whether they were meat-eaters, vegetarians, vegans, etc., how important this was to them, and how confident they were in their opinions. People who were asked to remember times when they were uncertain were less confident in their eating choices. However, when asked to write up their beliefs to persuade someone else to eat the way they did, they wrote more, and stronger, arguments than the group that was certain of its choice. The researchers repeated the study with different topics (for example, the Mac/PC choice) and found similar results. When people were less certain, they would dig in and argue even harder.

I’m still trying to digest this latest research. What does it mean? If we want someone to be loyal and to be an advocate, should we actually give them a reason to be uncertain about the product? What do you think?

And for those of you who like to read the research:

Festinger, L., Riecken, H.W., & Schachter, S. (1956). When prophecy fails. Minneapolis: University of Minnesota Press.

Gal, D., & Rucker, D. D. (2010). When in doubt, shout! Paradoxical influences of doubt on proselytizing. Psychological Science, 21(11), 1701–1707.

Van Veen, V., Krug, M. K., Schooler, J. W., & Carter, C. S. (2009). Neural activity predicts attitude change in cognitive dissonance. Nature Neuroscience, 12(11), 1469–1474.
