Does desire trump beliefs based on facts when evaluating scientific evidence?
You probably know the answer to this question is yes. But the real answer is more nuanced, which makes it far more interesting. As it happens, if you are conflicted about the facts, you are more likely to be swayed by your desires than by the facts themselves. When I was in graduate school, there was a saying about flimsy research: the authors seemed to have drawn their curve before they plotted the data. What they wanted to see skewed their view of what was truly there.
Researchers recruited participants who planned to have children and who believed home care would be superior to day care. Half of the participants planned to provide home care for any future children; the other half intended to use day care despite believing home care would be superior. This latter half was identified as “conflicted.” Thus half of the participants (those planning to use home care) were motivated to see home care as superior, and half (those planning to use day care) were motivated to see day care as superior, even though both groups entered the study believing home care was the better option.
Participants reviewed two separate (and fictional) studies labeled as the “Thompson and Cummings” studies. Half of the participants were told the Thompson study favored home care and the Cummings study favored day care and the other half were told the opposite.
Participants were asked to rate which study design they thought was more valid and to list the strengths and weaknesses of each study. They then indicated how convincing and valid each study seemed and which setting (home or day care) they believed, after reading and evaluating the studies, would have the better overall effect on their children.
And here’s what happened.
Participants who planned to use home care did not change their minds after reviewing the research studies. But the ‘conflicted’ group, who believed home care was superior but planned to use day care, changed their minds dramatically after reviewing the studies.
That is, the ‘conflicted’ participants revised both their view of the superiority of home care and their view of the positive aspects of day care. (Home care ratings went down dramatically and day care ratings went up dramatically, so that neither sort of care was seen as superior.)
We know that if we hear a theme repeated enough, we can come to believe it is true. What this research says is that if we want to believe something enough, we will take ambiguous information and interpret it to reflect our desires rather than the facts.
What this means for litigation advocacy is this:
If you have ambiguous information, interpret it clearly for the jury. Do not leave jurors’ own desires to interpret the information for them. Unequivocal, clear interpretation gives jurors who support your case words to use in deliberations. And it gives the fence-sitters a reason to favor your side, if they are otherwise tempted to.
If you have dueling expert witnesses pointing fingers in different directions, have your expert witness explain (clearly and matter of factly) how the evidence supports your case. And obviously, to the extent that the research of the opposition can be discredited as being ‘suspect’, it can resolve jurors’ ambivalence.
When this is done, jurors do not have to figure out which expert they trust more; they can point to specific data points to advocate for in the deliberation room.
Winning the argument starts with helping the jury see that they have a reason to ‘want’ to view the evidence as affirming both their values and your client. Providing them evidence (even if they don’t fully understand it) that is credibly and unequivocally interpreted as affirming that position then allows them to rest assured. People will choose evidence that they see as bolstering their own views whenever they can.
Bastardi, A., Uhlmann, E. L., & Ross, L. (2011). Wishful thinking: Belief, desire, and the motivated evaluation of scientific evidence. Psychological Science, 22(6), 731-732. PMID: 21515736