
Pro-vaccine facts only strengthen doubters' misconceptions

Why do some people cling to certain beliefs even when faced with an overwhelming volume of evidence indicating they are wrong? In a world labeled "post-truth", where facts and scientific evidence are increasingly treated as subjective, some researchers are investigating how false beliefs are not only formed, but seemingly strengthened, when people are presented with contradictory evidence.

The backfire effect

In 2005 and 2006, two researchers from the University of Michigan and Georgia State University conducted a series of experiments investigating the efficacy of correcting a false belief. Subjects were given mock newspaper articles containing stories that reinforced a widespread misperception. Some were given only the false articles, while others were given the articles and then immediately given corrective information.

One of the articles, for example, suggested the US discovered weapons of mass destruction in Iraq, with some subjects then receiving a follow-up article correcting the false information and explaining that the US in fact never found weapons of mass destruction in the country. The initial results were unsurprising: people tended to believe the information that supported their preexisting beliefs. So if you supported the war, you tended to disbelieve the corrective second article, while if you had stronger liberal leanings, you tended to disbelieve the first article.

These results were expected, but when the researchers probed their subjects' beliefs further after the experiments, they came across some strange responses. In some cases, the corrective information actually strengthened the original mistaken belief.

"Conservatives who received a correction telling them that Iraq did not have [weapons of mass destruction] were more likely to believe that Iraq had [weapons of mass destruction] than those in the control condition," explain the researchers.

This phenomenon was dubbed "the backfire effect", and in recent years it has come to feel like an increasingly common scenario.

But is "the backfire effect" a real phenomenon?

The elusive backfire effect

A recent study tried to replicate the "backfire effect" and found it to be rare and elusive. The two researchers initially set out to confirm the theory, expecting it to be real, but reached the opposite conclusion.

Across four experiments spanning 8,100 subjects, no evidence of the "backfire effect" could be found. The two researchers, from Ohio State University and George Washington University, published their results in an expansive paper entitled The Elusive Backfire Effect: Mass Attitudes' Steadfast Factual Adherence.

The study ultimately found that the power of facts to change people's minds is still strong. The researchers concluded, "By and large, citizens heed factual information, even when such information challenges their partisan and ideological commitments."

At this point, having presented you, the reader, with two conflicting studies, we find ourselves in the middle of our own little confirmation bias experiment. Which piece of information do you choose to believe? Which study do you dismiss as the anomaly?

Of course, you could examine each individual study more closely and try to gauge which was more thorough in its methods, but let's face it – most of us don't do that.

One thing we can more confidently agree on at this point is that certain beliefs tend to be perpetuated despite overwhelming evidence to the contrary. A quick Google search for "flat-earth theory" or "autism resulting from vaccines" shows that, no matter the weight of scientific evidence, some people remain adamant in their contradictory beliefs.

The dangers of corrective information

A new study from a team at the University of Edinburgh investigated the efficacy of several different strategies for correcting a person's misinformed beliefs. The study focused on the misinformation circulating around the dangers of vaccinations. It found not only that factual information tended to reinforce opposing ill-founded beliefs, but also that simply restating false information appeared to amplify the spread of those misconceptions.

Two hundred and fifty-four participants were split into four groups, each offered a different corrective information strategy. The subjects' opinions on vaccine safety were surveyed before the information was delivered, immediately after, and then one week later. One of the four groups served as a control and was provided with unrelated health fact sheets.

The other three groups were given standard pro-vaccination information: a myth vs fact booklet, a series of tables contrasting the effects of measles with possible vaccine side effects, or a fear-based booklet with pictures of sick, unvaccinated children suffering from measles, mumps or rubella.

Worryingly, the study found that all three information strategies were counterproductive, resulting in strengthened belief in vaccine myths. Interestingly, the "myth vs fact" strategy proved the most counterproductive, inducing stronger beliefs in the supposed link between vaccines and autism and increasing doubts over the general safety of vaccines.

When the "myth vs reality" participants were resurveyed one week later their misconceptions were found to be stronger than before undertaking the experiment in the first place. This indicated that the facts presented in the experiments were quickly forgotten, while the mere presence of the myths caused the original false belief to be amplified.

"These findings offer a useful example of how factual information is misremembered over time," says one of the authors on the study, Professor Sergio Della Sala. "Even after a short delay, facts fade from the memory, leaving behind the popular misconceptions."

As with any limited study, the results are not definitive, but they do suggest that public health campaigns need to be very carefully considered. Simply restating false information, even when countering it with factual corrections, can seemingly give life to, and amplify, those incorrect views.

But where does this leave us? Is there a "backfire effect"? Can we ever correct those who believe outright mistruths?

There are currently no easy answers to these questions, but maybe we'll all think twice the next time we try to correct a friend on Facebook with our irrefutable "facts". Sometimes the "facts" are not enough.

The new study on vaccination misinformation was published in the journal PLOS ONE.

Source: University of Edinburgh