How Misinformation Can Be Tough To Shift

Misinformation and fake news have become among the hottest topics of our age, as debates in areas from science to politics have been distorted by actors seeking to manipulate them.  In a fast-paced world, it therefore pays to be able to update our views adroitly when we come across better information.

This isn’t easy, however, even when the consequences are especially grave, such as when new information emerges about a drug we’re taking for an illness.  That’s the finding of a new study from Boise State University in Idaho, which highlights how we construct mental models of events based upon the information we consume, and how tough those models are to shift once embedded in our minds.

This is a phenomenon known as the continued influence effect, and the researchers tested various ways it could be overcome.  For instance, in one experiment, volunteers were asked to read about a man who was diagnosed with a disease and took medication for it each night, washed down with lemonade, but the medication wasn’t working, so he returned to the doctor for advice.  Half of the volunteers read a version of the story in which the doctor explained why the drug wasn’t working, whereas in the version read by the other half, no explanation was given.

Continuing influence

At the end of the story, the volunteers were informed that citrus-based foods and drinks, such as lemonade, hinder the absorption of the drug.  They were later told that this information was, in fact, false, but the retraction didn’t affect everyone in the same way.

For instance, those who had not previously been given an explanation by the doctor for the drug’s ineffectiveness were less inclined to cast aside the story about the lemonade affecting the drug’s performance.

“This group had used the citrus interaction to explain why the drug didn’t work in the story, while the other group already had an explanation in mind,” the researchers explain. “Once the first group inserted causal information into a mental model of the story, it was harder to remove it.”

Changing our minds

The researchers also came to the intuitive finding that it’s much easier to change our minds on a topic if something negative has happened to prompt a rethink.

“People are more motivated to do the mental work of updating the story if the change leads to a better outcome because the character’s well-being could be related to their own well-being,” they explain.

They hope their findings will prompt a rethink about how organizations and the media handle retractions of misinformation, so that such retractions become more effective.

“It may not work to simply send out a press release or make a public service announcement saying that information is incorrect,” they explain. “In order to effectively change beliefs, we need to give consumers an alternative cause and effect explanation.”
