
The illusion of truth (which is why you should never repeat fake news)

Monday, May 1, 2017
posted by Douglas Keene

It’s been all about “fake news” for a while now, and here’s a study telling us to just stop talking about it. Well, sort of. What it actually says is that even when we have knowledge to the contrary, if we hear something repeated enough, we come to believe it. Hence our recommendation that we all stop repeating fake news, even if our comment is only on how ridiculous it seems. It is as if false statements, repeated enough, morph from outrageous to familiar to having the ring of truth. Merely by repetition.

It’s a bit like the dictum we’ve written about before: “change the narrative” and don’t use the same terms the opposition uses to describe something like “death panels,” or figure out how to debunk faked visual imagery. You don’t want to accidentally reinforce the ideas and images of the opposition, but you do need to put forth your own narrative. Today’s research offers insight into just how a listener can know something to be false and yet, after hearing it repeated, accept that it may be true after all.

As the researchers remind us, “repeated statements are easier to process, and subsequently perceived to be more truthful, than new statements.” Nazi propagandist Joseph Goebbels is often credited with a law of propaganda that captures the same idea: “Repeat a lie often enough and it becomes the truth.” While we may not believe this would ever happen to us, it definitely does: the researchers ultimately conclude that we engage in “knowledge neglect” and tend to support the conclusion that is easiest for us to support. It’s just too tiring, apparently, to actually think. Against the tide of fake news, it requires endless vigilance.

In this case, the researchers wanted to see if the “illusion of truth” (i.e., hearing falsehoods repeated) would override “stored knowledge” (i.e., things we know to be true). For example, they offered two statements to participants:

The Atlantic Ocean is the largest ocean on Earth.

The Pacific Ocean is the largest ocean on Earth.

The researchers figured most people would know that the Pacific is the largest ocean on Earth, and thus that elementary school factoid would be in the “stored knowledge” of most participants. (We questioned this assumption, but we should note that the participants in the two experiments were all undergraduates at Duke and thus perhaps better able to remember elementary school factoids than those of us who are slightly older.) Despite being fairly young, most of the participants did not put forth enough effort to ponder their prior knowledge and call up the facts, so the illusion of truth worked. Sometimes. However, when participants did think about their stored knowledge, they chose the correct answer.

The authors explain this finding by saying that while we can all apply our stored knowledge to every bit of new information that comes to us, doing so takes tremendous effort and energy. It requires us to assess the new information against other things we know or think we know, which demands a commitment to stay focused and costs the individual both energy and effort. We often see this sort of fatigue and energy loss in our mock jurors, who operate at full capacity during pre-trial research to come to a decision on what is right. They are thinking and want to come to the right decisions.

From a litigation advocacy standpoint, however, we have to realize this energy and effort are a limited resource. Jurors’ energy and understanding are prey to the illusion of truth effect because our tendency is to use shortcuts in assessing how plausible something is as we hear it. Sometimes this works and sometimes it doesn’t. You can help jurors by using expert witnesses to teach them that something really is true, and that it is a provable fact rather than simply a statement of opinion. How?

Start by having the expert witness ease jurors into the area that is key to the testimony: first remind jurors of what they already know, establishing that the expert relies on knowledge they believe in. Give them the experience of “Hey—I know what s/he just said is true!” Then, after building that credibility connection, move the focus to increasingly unfamiliar territory.

Have the expert witness cite scientific studies showing that whatever statement they are testifying to is supported by recent research. That extends the connection from the juror to the research, using the expert as a conduit.

Go further by having that expert witness address what the opposing expert may say and why your expert knows it to be incorrect (citing other research and scientific consensus rather than mere opinion).

What is most important is that you respect jurors’ ability to think and give them reason to want to expend the effort to evaluate the facts in the case for you and for your client. Don’t make this merely about which party is more likable or attractive (although certainly do what you can to portray your client as both likable and similar in values to the jurors).

The researchers recommend that future research focus on how to get people to rely on their own stored knowledge, rather than repetition, to ascertain truth. Until that research is completed, your best strategy is to help jurors think through the facts and answer the questions that are likely to come up for them as they hear the evidence.

Fazio, L. K., Brashier, N. M., Payne, B. K., & Marsh, E. J. (2015). Knowledge does not protect against illusory truth. Journal of Experimental Psychology: General, 144(5), 993–1002. PMID: 26301795
