The Backfire Effect - And why it isn't as common as people think


I like the You Are Not So Smart podcast. It's one of those ones you feel smarter after listening to. Then you forget most of it.

First podcast:



We don’t treat all of our beliefs the same.

The research shows that when a strong-yet-erroneous belief is challenged, yes, you might experience some temporary weakening of your convictions, some softening of your certainty, but most people rebound from that and not only reassert their original belief at its original strength, but go beyond that and dig in their heels, deepening their resolve over the long run.

Psychologists call this the backfire effect, and this episode is the first of three shows exploring this well-documented and much-studied psychological phenomenon, one that you’ve likely encountered quite a bit lately.

In this episode, we explore its neurological underpinning as two neuroscientists at the University of Southern California’s Brain and Creativity Institute explain how their latest research sheds new light on how the brain reacts when its deepest beliefs are challenged.

Second podcast:



If you try to correct someone who you know is wrong, you run the risk of alerting their brain to a sort of existential, epistemic threat, and if you do that, when that person expends effortful thinking to escape it, that effort can strengthen their beliefs instead of weakening them.

In this episode you'll hear from three experts who explain why trying to correct misinformation can end up causing more harm than good.

Third podcast:



If dumping evidence into people’s laps often just makes their beliefs stronger, would we just be better off trying some other tactic, or does the truth ever win?

Do people ever come around, or are we causing more harm than good by leaning on facts instead of some other technique?

In this episode we learn from two scientists how to combat the backfire effect. One used an ingenious research method to identify the breaking point at which people stop resisting and begin accepting the fact that they might be wrong. The other literally wrote the instruction manual for avoiding the backfire effect and debunking myths using the latest psychological research into effective persuasive techniques.


So then what happened?

The backfire effect seemed to disappear when researchers tried to repeat the experiments.

Its popularity as an explanation might even have been almost entirely the result of a complex research question that journalists then simplified.


Fourth podcast:



Last year on this show, we did three episodes about the backfire effect, and by far, those episodes were the most popular we’ve ever done.

In fact, the famous web comic The Oatmeal turned them into a sort of special feature, and that comic of those episodes was shared on Facebook a gazillion times, which led to stories about the comic in popular media, and then more people listened to the shows, and on and on it went. You can go see it at The Oatmeal right now at the top of their page. It’s titled “You’re not going to believe what I’m about to tell you.”

The popularity of the backfire effect extends into academia. The original paper has been cited hundreds of times, and there have been more than 300 articles written about it since it first came out.

The backfire effect has a special allure to it, because, on the surface, it seems to explain something we’ve all experienced -- when we argue with people who believe differently than us, who see the world through a different ideological lens -- they often resist our views, refuse to accept our way of seeing things, and it often seems like we do more harm than good, because they walk away seemingly more entrenched in their beliefs than before the argument began.

But…since those shows last year, researchers have produced a series of new studies into the backfire effect that complicate things. Yes, we are observing something here, and yes we are calling it the backfire effect, but everything is not exactly as it seems, and so I thought we should invite these new researchers on the show and add a fourth episode to the backfire effect series based on what they’ve found. And this is that episode.


SO?

It seems that we tend not to change our attitudes, and there is a potential for backfire in some rare cases.

New information can be absorbed. Corrections to factual inaccuracies can change our factual beliefs.

But will these new beliefs change our behaviour? Not in the way they would if we were perfectly rational beings.

Man as the rationalising animal.

The roadmap for development of our attitudes, it seems, goes a little like:

Regional cultural norms
--(through normative influence)--> tell us what we should or should not be doing
--(which informs)--> our values, attitudes, and opinions
--(which we protect and justify with)--> factual beliefs
--(sustained by)--> cherry-picking facts and evidence

Educating a person about specific, erroneous factual beliefs does not usually have enough force to travel back up that chain and change values, attitudes, and behaviour.

So, people can change their false factual beliefs. But persuasion is hard, so "the backfire effect" is blamed for failed persuasion.
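As a rough illustration of that chain (my own toy sketch, not anything from the podcast or the studies), here's the idea in code: correcting a single factual belief just swaps out one justification for another, while the upstream attitude stays put.

```python
# Toy model of the chain above: the attitude sits upstream, factual beliefs
# are cheap downstream justifications, so correcting one fact rarely
# propagates back up to change the attitude.

class Person:
    def __init__(self, attitude, supporting_facts):
        self.attitude = attitude                        # identity-level value, e.g. "supports X"
        self.supporting_facts = set(supporting_facts)   # expendable justifications

    def correct_fact(self, false_fact, replacement_facts):
        """Accept a correction to a factual belief without touching the attitude."""
        if false_fact in self.supporting_facts:
            self.supporting_facts.discard(false_fact)        # the correction is absorbed...
            self.supporting_facts.update(replacement_facts)  # ...but another fact is cherry-picked
        return self.attitude  # unchanged: the correction didn't travel back up the chain


voter = Person("supports the candidate", {"unemployment is rising"})
attitude_after = voter.correct_fact(
    "unemployment is rising",
    {"the economy was worse under the last lot"},
)
print(attitude_after)          # still "supports the candidate"
print(voter.supporting_facts)  # a new justification has replaced the corrected one
```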

Values and opinions relating to identity, self, and tribe are particularly stubborn to shift.

Trump supporters are capable of agreeing that Trump lied about a specific fact, such as unemployment in some states (he said it was rising when it was falling), but won't change their attitude toward Trump, instead finding other factual beliefs to protect and justify the attitude.

The anti-vaxxer will still tend not to vaccinate their kids, even though they know there is no connection to autism. In fact, in the vaccination study the backfire wasn't about beliefs; it was about attitudes. The backfire still happened: intention to vaccinate dropped from 70% to 45%.


Factual beliefs, it seems, are cheap and expendable when protecting our opinions, especially when that opinion is formed out of a regional, self-identity value.


Or at least that is what I got out of this series.
 
Comedy is probably the most helpful or effective strategy, or at least a way to start opening the mind. If you can laugh, you can learn.
 

This is the reason I have few friends. I’ve also found it’s not endearing to call another person a pleb while pointing out gross errors in their thinking and beliefs, beliefs they have invested much energy and many years of their life in. For some reason they 1) take it personally and 2) hold even faster to their convictions.
 


The same thing will happen to me to be sure. Suppose a person comes up to me and sez “Catholic priests aren’t just old farts who wear dresses and pretend they are expressing the wishes of a supernatural contradictory being but are actually expressing the wishes of a supernatural contradictory being” I’d likely still hold fast to my heathen beliefs.
 
Excessive self-regard: once under the spell, self-regard makes one think, “I’m a pretty smart person, and I’m unlikely to be fooled, so more likely this group really is correct.”
^^Your typical antifa hoodlum.

Haven't listened to the podcast, but I regularly run through the Charlie Munger cognitive biases list. To keep my brain in good nick, I often take positions contrary to personal belief. Online forums facilitate this practice. So you'll find me raging in favour of conspiracies I don't believe in ('back and to the left, back and to the left...'), for example.

This exercise has held me in good stead. It is like playing chess with your dumb other self, and you can do it anonymously. You see all the fatal moves unfolding.

The SRP board is good in that its participants avoid every rule of persuasion there is, namely:

Avoid mockery, and
Avoid undisciplined arguing, and
Remember, the goal is to persuade, not to spill blood.

And while it is fun to clash with the knuckleheads on the left, no one has ever changed their mind because they were demolished and humiliated in an argument.

People loathe a void. When they encounter gaps in the information they’re receiving (from an ad, from the news, from gossip, from any incoming stimuli at all) they have a stunning tendency to fill in those gaps with their own ideas.

Based on ill-formed intuition, soggy critical thinking, and flawed belief systems that defy reality. In other words: They just make it up as they go. - John Carlton

If you need to persuade people for a living, and you don't want your kids to starve, then follow these 3 simple rules:

Rule #1. Don't butt heads.
Rule #2. Ignore irrationality.
Rule #3. Take your ego out of it.

“Facts are stupid things.” - Ronald Reagan, ’88 GOP convention
 
