Science

Pro-vaccine facts only strengthen doubters' misconceptions

Why do some people believe in certain things even when faced with an overwhelming volume of evidence to the contrary?

In a world labeled as "post-truth", where facts and scientific evidence seem to have become subjective, some researchers are investigating how false beliefs are not only formed, but seemingly strengthened, when people are presented with contradictory evidence.

The backfire effect

In 2005 and 2006, two researchers from the University of Michigan and Georgia State University conducted a series of experiments investigating the efficacy of correcting a false belief. Subjects were given mock newspaper articles containing stories that reinforced a widespread misperception. Some were given only the false articles, while others were given the articles and then immediately given corrective information.

One of the articles, for example, suggested the US discovered weapons of mass destruction in Iraq, with some subjects receiving an article immediately afterwards correcting the false information and explaining that the US in fact never found weapons of mass destruction in the country. The initial results were unsurprising. People tended to believe the information that supported their preexisting beliefs. So if you supported the war then you would disbelieve the corrective second article, while if you had stronger liberal leanings you would disbelieve the first article.

These results were expected, but when the researchers further investigated the beliefs of their subjects following the experiments, they came across some strange responses. In some cases, the secondary corrective information actually reinforced, and strengthened, the original mistaken belief.

"Conservatives who received a correction telling them that Iraq did not have [weapons of mass destruction] were more likely to believe that Iraq had [weapons of mass destruction] than those in the control condition," explain the researchers.

This phenomenon was dubbed "the backfire effect", and in recent years it has come to feel like an increasingly common scenario.

But is "the backfire effect" a real phenomenon?

The elusive backfire effect

A recent study tried to replicate the "backfire effect" and found it to be rare and elusive. The two researchers initially set out to confirm the effect, expecting it to be real, but their experiments led them to the opposite conclusion.

Across four experiments spanning 8,100 subjects, no evidence of the "backfire effect" could be found. The two researchers, from Ohio State University and George Washington University, published their results in an expansive paper entitled The Elusive Backfire Effect: Mass Attitudes' Steadfast Factual Adherence.

The study ultimately found that the power of facts to change people's minds is indeed still strong. The researchers concluded, "By and large, citizens heed factual information, even when such information challenges their partisan and ideological commitments."

Having presented you, the reader, with two conflicting studies, we now find ourselves in the middle of our own little confirmation bias experiment. Which piece of information do you choose to believe? Which study strikes you as the outlier?

Of course, you could examine each study more closely and try to gauge which is more thorough in its methods, but let's face it – most of us don't do that.

One thing we can more confidently agree on at this point is that there are certain beliefs that tend to be perpetuated, despite overwhelming evidence to the contrary. A quick Google search of "flat-earth theory" or "autism resulting from vaccines" will show that no matter the weight of scientific evidence, there are still some people out there who remain adamant in their contradictory beliefs.

The dangers of corrective information

A new study from a team at the University of Edinburgh investigated the efficacy of a series of different strategies that could be used to correct a person's misinformed beliefs. The study focused on the misinformation circulating around the dangers of vaccinations. The study not only found that factual information tended to reinforce opposing, ill-founded beliefs, but also that simply restating false information appeared to amplify the spread of those misconceptions.

Two hundred and fifty-four participants were split into four groups, each given a different corrective information strategy. The subjects' opinions on vaccine safety were surveyed before the information was delivered, immediately after, and again one week later. One of the four groups served as a control and received unrelated health fact sheets.

The other three groups received standard pro-vaccination information: a myth vs fact booklet, a series of tables contrasting the effects of measles with possible vaccine side effects, or a fear-based booklet with pictures of sick, unvaccinated children suffering from measles, mumps or rubella.

The study's unsettling finding was that all three information strategies proved counterproductive, strengthening belief in vaccine myths. Interestingly, the "myth vs fact" strategy backfired most severely, inducing stronger beliefs in a link between vaccines and autism and increasing doubts over the general safety of vaccines.

When the "myth vs fact" participants were resurveyed one week later, their misconceptions were found to be stronger than before the experiment began. This suggests the facts presented in the experiment were quickly forgotten, while the mere repetition of the myths caused the original false beliefs to be amplified.

"These findings offer a useful example of how factual information is misremembered over time," says one of the authors on the study, Professor Sergio Della Sala. "Even after a short delay, facts fade from the memory, leaving behind the popular misconceptions."

As with any limited study, the results are not definitive, but they do suggest that public health campaigns need to be considered very carefully. Simply restating false information, even when countering it with factual corrections, can seemingly give life to, and amplify, those incorrect views.

But where does this leave us? Is there a "backfire effect"? Can we ever correct those who believe outright mistruths?

There are currently no easy answers to these questions, but maybe we'll all think twice next time we try to correct our friend on Facebook with our irrefutable "facts". Sometimes the "facts" are not enough.

The new study on vaccination misinformation was published in the journal PLOS ONE.

Source: University of Edinburgh

11 comments
Ratio01
I think in this article you have missed some very important information. First of all, you call it the post-truth era, whereas I believe that, for a large portion of the population, it would be called the era of truth and discernment. However, the missing information is that there has been so much proven deception in media, as well as manipulative facts concerning what's good for us and what isn't. Don't forget that people's beliefs have already been challenged and proven through their own experience. Misinformation by the government as well as product sellers has raised incredible doubt in the minds of people. Incredibly, people are learning to depend on their own thought process of deduction and reason. Right or wrong.
SkepticDude1
The study and the many articles published on it are a sadly distorted waste. I can write a "myth vs fact booklet" on the diet of bigfoot, and a table contrasting their vegetarian diet to their carnivorous diet. Providing such junk and pretending that you are providing scientific information to change someone's mind proves nothing more than people aren't likely to believe something just because someone writes it on a paper. After all everyone knows that the majority of studies show that bigfoot is actually omnivorous. So neither thesis nor antithesis is true, we need synthesis. :-)
Babaghan
What Ratio01 is trying to say is that he will research the facts himself and come to his own conclusions. No need to present the facts, as they will have the opposite effect on him, just as the research studies suggest.
thurstjo63
LOL! First of all, I'm interested to know what are considered "facts" from the pro-vaccine side of the argument. You say the study focused on misconceptions around the dangers of vaccination but give no examples. Instead you give an example about weapons of mass destruction in Iraq!?! I suspect you might be well aware that what you would present as a "fact" might not hold up to scrutiny, which may explain why you chose not to present any in your article.
From everything I see in society, these so-called facts actually correspond to religious beliefs, since there is often little comprehensive data and research to back them up. Most people with half a brain, if given a recommendation by a financial adviser to invest money in some investment, would take the time to do due diligence on the facts and assumptions in order to understand the investment and associated risks before investing. Yet those same people are willing to allow someone in a white coat to pump all types of substances into the blood of their child without taking the time to do the due diligence of actually reading the label of the vaccine (which, since there is a legal requirement for it to present the honest facts about the substances involved and their effects, is the logical starting point for any research around vaccines before allowing said person to pump it into the blood of your child)!?! You couldn't have a more faith-based act from someone who is not in a church or mosque. And you call people who are worried about vaccines illogical?!?
JPuddybuc
Reminds me of the allegory of the cave in Plato's Republic; which explains how reality is defined by actual experience. And the reality proposed by governments, corporations, and particularly the media is so often at odds with people's actual experiences - that nearly everything is viewed as propaganda and dismissed. If the media tells me that vaccines cause autism, followed by the government & corporate manufacturers telling me it doesn't, followed by my kid being diagnosed by some quack as autistic, followed by some manufacturer selling me some drug, followed by a new media blitzkrieg telling me vaccines DON'T cause autism... Yeah, I'm skeptical. And my reality is based on my experience.
MichaelCohn
It is surprising that you wrote this article about the "backfire effect" without once referencing cognitive dissonance and the large body of research surrounding that concept. All of those studies predate the ones you referenced and provide a much more coherent and parsimonious explanation of the phenomena you discussed; the research is old and well-established, as is the concept of cognitive dissonance itself.
over_there
Every week a new "scientist" has new facts that contradict the facts from the week before. Often these facts are obviously rubbish and just the result of someone trying to make a story. This is why people completely disregard the facts/stories they are told.
JimFox
"The study ultimately found that the power of facts to change people's minds is indeed still strong. The researchers concluded, 'By and large, citizens heed factual information, even when such information challenges their partisan and ideological commitments.'"
Which rarely applies to religious belief, in my experience. In fact, the more absurd the conviction, the more resolutely people will cling to it.
drspeg
"So if you supported the war then you would disbelieve the corrective second article, while if you had stronger liberal leanings you would disbelieve the first article." Aren't these both examples of the same thing (preference for the first article)? Was "disbelieve the first article" supposed to read, "disbelieve the corrective second article"?
Also (MichaelCohn), it is my understanding that cognitive dissonance refers to a change in belief due to the conflict that arises when one's behavior contradicts one's initial beliefs (attitudes/cognitions). Since one cannot ease the discomfort of dissonance by changing a behavior that has already occurred, the person can only modify their belief. It may be that you were thinking instead of something closer to McGuire's "inoculation effect/theory" from the 60's that covers how sometimes (especially when presented with weak arguments against a held position) people's initial beliefs are strengthened when presented with conflicting information (or argument).
apprenticeearthwiz
The polarity of this article is indicative of a thoroughly manipulated mainstream narrative. Facts versus myths. Everything provax is fact, everything questioning this is myth, full of rogues and charlatans preying on the weak minded. In fact it's not even worth looking at, don't look.

There's this trick we use to determine questions. Examine all the available evidence before making any decision. It's called science. I thought this publication was a fan of science. In truth there are many well-credentialled scientists not necessarily concerned about vaccination per se but profoundly concerned with the constantly growing, massively lucrative vaccination schedule. There is certainly a strong case for universal vaccination for infectious diseases like polio which almost always results in deformity or death and for which there are no other remedies. Diseases like measles and mumps don't remotely fall into that category. Rarely are there complications and even more rarely in the absence of contributing conditions. Not only is there the potential for harm in these vaccinations there's evidence these diseases give our immune system a beneficial workout.

It's difficult for provaxxers to believe the mainstream narrative could be manipulated by the pharmaceutical industry, or that the field of medicine could be manipulated by the makers of the medicines. The largest industry in the world is the largest amount of money, therefore influence, in the world. Wealth and influence have always, always, always manipulated the mainstream narrative. Who do you think owns the mainstream media?