I’m one of those staunch Apple converts. For as long as there have been PCs, I was a Windows/PC person. (Realize that I go all the way back to when PCs first came out. I used to sell a marvelous “portable” PC that ran the CP/M operating system and had TWO (count ’em) TWO 360 KB (yes, I said KB) “floppy” disk drives; in other words, NO hard drive.) I was a PC person, NOT an Apple person. Apples were for teachers and, later, for artsy people. That was not me.
Fast forward to today, and I’ll be talking on my iPhone, charging my iPod nano for my afternoon exercise, and transferring a movie to my iPad from my MacBook Pro. What the heck happened here?! (That’s another story altogether.)
Don’t show me the Android phone — So you might be able to guess what happened when I went to dinner with a colleague who was showing me his Android phone. He loves his new Android phone and wanted to show me all the great ways it was as good as, or better than, my iPhone. I was totally uninterested in hearing about it. I didn’t even want to look at it. Basically, I didn’t want to allow into my brain any information that would conflict with my opinion that anything besides an iPhone was even a possibility. I was showing classical symptoms of cognitive dissonance denial.
Alter your beliefs or deny the information? — In 1956 Leon Festinger co-wrote a book called When Prophecy Fails. In it he describes the idea of cognitive dissonance: the uncomfortable feeling we get when we hold two ideas that conflict with each other. We don’t like the feeling, so we try to get rid of the dissonance. There are two main ways to do that: change our belief, or deny one of the ideas.
When forced you’ll change your belief — In the original research on cognitive dissonance, people were forced to defend an opinion they did not believe in. The result was that people tended to change their belief to fit the new idea.
Watching cognitive dissonance via an fMRI scan — In recent research by van Veen and colleagues, participants had to “argue” that the fMRI scan experience was pleasant (it’s not). When “forced” to make statements that the experience was pleasant, certain parts of the brain would light up (the dorsal anterior cingulate cortex and the anterior insular cortex). The more these regions were activated, the more the participant would claim that they really did think the fMRI was pleasant.
When not forced you’ll dig in — But there’s another reaction that sometimes occurs. If you are not forced to espouse a belief you don’t hold, but instead are simply presented with information that opposes your beliefs, the tendency is to deny the new information rather than change your belief to fit it.
When made to feel uncertain, you will argue harder — Gal and Rucker recently conducted research in which they used framing techniques to make people feel uncertain. (For example, they told one group to remember a time when they were full of certainty, and the other group to remember a time when they were full of doubt.) They then asked the participants whether they were meat-eaters, vegetarians, vegans, etc., how important this was to them, and how confident they were in their opinions. People who were asked to remember times when they were uncertain were less confident of their eating choices. However, when asked to write up their beliefs to persuade someone else to eat the way they did, they wrote more, and stronger, arguments than the group that was certain of its choice. The researchers repeated the study with different topics (for example, the Mac/PC preference) and found similar results. When people were less certain, they would dig in and argue even harder.
I’m still trying to digest this latest research. What does it mean? If we want someone to be loyal and to be an advocate, should we actually give them a reason to be uncertain about the product? What do you think?
And for those of you who like to read the research:
Festinger, L., Riecken, H.W., & Schachter, S. (1956). When prophecy fails. Minneapolis: University of Minnesota Press.
Gal, D., & Rucker, D. D. (2010). When in doubt, shout. Psychological Science (published online October 13, 2010).
Van Veen, V., Krug, M.K., Schooler, J.W., & Carter, C.S. (2009). Neural activity predicts attitude change in cognitive dissonance. Nature Neuroscience, 12(11), 1469–1474.
15 Replies to “100 Things You Should Know About People: #46: The more uncertain you are, the more you dig in and defend your ideas”
Could it be that you are conscious of being uncertain and therefore you are making double efforts to convince other people and yourself at the same time?
I’m thinking of impulse buyers as potential “victims” of that behaviour, as they may try to rationalize their deeds afterwards. What do you think?
(I’m currently reading neuro web design, it’s really fascinating)
A great article, as was your OMS presentation today. Those cookies really tasted better due to scarcity? So human…
Thank you, Gabriel
I agree with the first comment that people argue harder because they are trying to convince both themselves and the person they are talking to. In addition, I think this is compounded by a fear of not knowing their own mind, which is triggered when the doubt kicks in.
Nice overview of the under-rated effects of cognitive dissonance on human behaviour in general, and thus on internet users’ behaviour.
I definitely go along with Régis’s hypothesis: the more uncertain you are, the more effort you’ll have to put in to convince yourself.
Especially by digging into “rational” arguments. This rationality is believed to be capable of convincing any normal person (except anyone having two brain hemispheres…).
Anyhow, this would explain the observed “strength” of the arguments delivered to other people we want to convince.
So, then, how would one apply the results of this research to “hire advocates for your products”? Well… don’t!
Emotion is stronger than rationality (see how cute your iPhone is?).
Focus on effective arguments, and thus on emotional stimuli.
This reminds me of the state people get into with religious arguments (whether literal or metaphorical). If I can prove I am right, then I don’t need to argue hard; I can just show some facts or research and others will listen. But where my argument stems from faith, I cannot prove I am right and others cannot prove I am wrong, so the ‘winner’ is whoever argues hardest.
I have to agree. Even though people may argue harder for a belief when feeling uncertain, this could just be the rational brain fighting the unconscious brain. As you pointed out, the unconscious brain is surprisingly powerful and does not want to lose. But the belief is probably already in danger of being abandoned, because it doesn’t “feel” as right as it did before, even though this isn’t obvious amid all the rationalizing going on.
So giving people a reason to feel uncertain about a product is not the right way to go.
Resistance to change is one of the most common human reactions… it makes us feel “not OK” and we’ll do almost anything to avoid that. It really shows up in politics and religion, where certitude makes us feel the most OK. It’s why, in advertising, our job is to find out what they want and make it their idea to get it. A list of features and benefits is just about worthless. Don’t bother telling them how great your product is…show how great THEY will be with it.
I wonder what the biological basis for this is. Why would we feel compelled to fight harder for something we are less certain about than for something we are more certain about? Maybe, as @Gabriel suggests, it has something to do with scarcity.
The possibility seems to be overlooked that an uncertain person may argue strenuously in favor of a particular position in hopes of getting an equally strenuous counter-argument. People who are uncertain want to be certain. Their desire for a convincing counter-argument may be stronger than their desire to be right in their initial position.
Very interesting post. I’ve even watched this effect happen with professional market researchers, who strive to be objective and rational. In the past, I’ve presented research findings that were counter-intuitive, so many people fought against them (the dotcom bust, many people dropping landlines, Apples being used and loved by anyone older than 24, high-income people moving to prepaid wireless, etc.). There seems to be some fear about being wrong or embarrassed.
Your post is a great reminder about how to encourage openness and listening instead of resistance and denial.
That is quite true.
There are more than two ways to deal with the dissonance: you may also gather more information about the conflicting beliefs and discover that they do not in fact conflict. That would dissipate the dissonance. Not all conflict must be either/or. The apparent dissonance may actually be lovely harmony once there is more understanding.
Lack of understanding may be the real source of the conflict.
Do you think that how much a person feels invested in the thing, and what that investment means, contributes to this behavior?
In the way of material things, items such as iPhones and Androids can be fairly expensive for some. If I’ve spent (what I consider) a lot of money on one of these items, I might feel the need to maintain that I have not made a stupid decision, and I don’t want anyone to see me as lacking in knowledge or judgment.
In more deep-seated issues like religion or politics, much of what we do in and with our lives is directed by our beliefs. Therefore, from an inner standpoint, any uncertainty I feel might imply that I am wrong, that my BEING is wrong, and that I let myself be duped. Will people now see me as gullible?
Belief is an area that is far too wide-ranging and deep-seated to be overturned easily. You might sell your iPhone on eBay and buy a Droid, but religious beliefs and politics are part of your cultural identity. When you change that, you rip some social fabric.
Could this be why religion is so contentious in an egalitarian democratic society?