Archive for the ‘Simple Jury Persuasion’ Category

It’s been a while since we’ve had a new cognitive bias to share with you. We’ve previously blogged on many different biases, and here are a handful of those posts. Today’s research paper combines three biases: the better-than-average effect and confirmation bias (both of which we’ve blogged about before), plus the endowment effect. The endowment effect is the “(irrational) added value” we place on things just because they belong to us and not to someone else.

So, today’s research was described over at BPS Digest (a reliable source for accurate summaries), and it’s a bit odd. For the sake of brevity, here’s what BPS Digest says (they are based in England so they don’t spell everything like we do in the States) as they describe the study (we added emphasis to important points with bold font):

Across three studies, the researchers asked hundreds of participants to imagine a fictional planet in a distant solar system, inhabited by various creatures, some of which are predators and others prey. Focusing on two of the creatures on the planet – Niffites and Luppites – the participants were told to imagine that they (that is, the participant himself or herself) held one of two different beliefs: Some were told that they had a theory that the Niffites were the predators and the Luppites were their prey, while others were told to assume that somebody called “Alex” had this theory. This background scenario was chosen to be neutral and unconnected to existing beliefs, and the hypothetical “ownership” of the theory by some of the participants was intended to be as superficial and inconsequential as possible.

Next, the researchers presented the participants with a series of seven facts relevant to the theory. The first few were mildly supportive of the theory (e.g. Niffites are bigger), but the last few provided strong evidence against (e.g. Luppites have been observed eating the dead bodies of Niffites). After each piece of evidence, the participants were asked to rate how likely it was that the theory was true or not.

The way the participants interpreted the theory in the light of the evidence was influenced by the simple fact of whether they’d been asked to imagine the theory was theirs or someone else’s. When it was their own theory, they more stubbornly persisted in believing in its truth, even in the face of increasing counter-evidence.

This spontaneous bias toward one’s own theory was found across the studies: when different names were used for the creatures (Dassites and Fommites); whether testing happened online or in groups in a lecture room; regardless of age and gender; and also when an additional control condition stated that the theory was no one’s in particular, as opposed to being Alex’s. The last detail helps clarify that the bias is in favour of one’s own theory rather than against someone else’s.

The ownership of the theory made the difference in belief persistence. We are reluctant to discard ideas we think of as our own, even when the evidence contradicts them.

From a litigation advocacy perspective, we have talked a lot about how facts don’t matter (still true in 2017, as our most recent post on this topic explains) when it comes to the personal beliefs and emotional reactions of jurors. In their paper, the authors of the SPOT effect research say this bias cuts across gender and age and that it “reflects a pro-self as opposed to anti-other bias”. They also comment on how easy it was to create allegiance to a theory (“phenomena that require surprisingly little to bring about”): simply being told “you have a theory” was enough for participants to stand by that belief even in the face of evidence to the contrary.

We wonder how much stronger (and emotional) this bias would be if those core values and beliefs held by individual jurors were challenged in case narrative. While this is a new bias just named, it is why (for years now) we have recommended that our client-attorneys try to avoid hot-button issues and instead focus on incorporating universal values into their case narrative.

You are less likely to get knee-jerk reactivity from jurors who have polarized political positions when you use universal values to frame your case narrative (and stay away from unnecessary controversies).

Gregg, A. P., Mahadevan, N., & Sedikides, C. (2017). The SPOT effect: People spontaneously prefer their own theories. Quarterly Journal of Experimental Psychology, 70(6), 996-1010. PMID: 26836058

Comments Off on Simple Jury Persuasion: The SPOT (Spontaneous Preference for Own Theories) effect

It is hard to believe that more than two decades have passed since Time magazine published its controversial 1994 cover featuring OJ Simpson with his skin intentionally darkened. People were so upset that the magazine’s managing editor issued a public apology for the cover photo. Today, we are covering very recent research telling us the same thing is still going on in a wide variety of publications.

The researchers looked at stories on both celebrities and politicians and found that when the story content was positive, the skin tone in the photos was lighter, and when the story content was negative, the skin tone was darker. This was true regardless of race or gender: darker-toned photos accompanied negative stories. The researchers believe there is a pervasive assumption that darkness and badness go together, and, in truth, psychology researchers have known this for years. But, say the researchers, this is the first time the reverse has been demonstrated in research: when we hear about an evil act, we assume it was more likely committed by someone with darker skin.

The researchers describe the theoretical background for their work as the “black is bad” literature (the historical associations of darkness with negativity and lightness with positivity), and they cite multiple examples of this effect. For example, they point to research showing that people who believed Barack Obama’s likeness was best captured in “artificially darkened photos” both evaluated him more negatively and were less likely to have voted for him in the 2008 election. Another example (among many): when professional athletes wore black uniforms, spectators perceived them as behaving more aggressively and referees responded more harshly with penalties.

So, knowing the long history of “black is bad”, the researchers wanted to see if “bad is black”—in other words, “Do people represent actors who commit wrongful acts as having darker skin than actors who commit conspicuously moral acts?”. And, to cut to the chase, the answer is a resounding Yes.

Here are some of their findings:

Across six studies, they found that people described an actor’s skin tone as darker when the actor had committed a morally bad act than when the actor had committed a morally good act.

The “bad is black” effect was associated more strongly with those who reported negative biases against dark-skinned minority groups or those who held beliefs pairing darkness with badness.

Interestingly, both groups displayed bias: those who strongly associated darkness with negativity showed the “bad is black” effect, while those with a low tendency to associate darkness with negativity displayed a “good is black” effect.

The researchers say it is important for us to understand how the “bad is black” effect emerges so we can sort out how to disrupt it. They express concern about eyewitness identification (especially the problem-prone area of cross-racial identification). It is certainly possible that an eyewitness could “misremember” the perpetrator as having darker skin and therefore choose an innocent darker-skinned person from a line-up rather than the guilty lighter-skinned alternative.

From a litigation advocacy perspective, this is another point on which you may want to educate jurors about the accuracy of eyewitness identification. It is also a point to be aware of when crafting case narrative. The researchers found that, “regardless of race or gender”, people with darker skin were seen as more guilty whether they were guilty or not. If one of the parties (regardless of race or gender) has darker skin, jurors are more likely to see that party as “bad” in whatever form that may take in your trial.

Alter, A., Stern, C., Granot, Y., & Balcetis, E. (2016). The “Bad Is Black” effect. Personality and Social Psychology Bulletin, 42(12), 1653-1665. DOI: 10.1177/0146167216669123

Comments Off on Simple Jury Persuasion: The “bad is black” effect 

Disinformation is everywhere you turn these days, so we need good tools to debunk those “alternative facts”. Last year we wrote about a strategy to combat distrust of science by using the concept of the “gateway belief”. While that paper received criticism from a well-known law professor over at the Cultural Cognition blog, the same research team has come back with a new paper in which they obliquely mention the criticism and then dismiss it in favor of writing about their new research. They are (again) writing about disinformation on climate change, but rather than just using the gateway belief (that 97% of scientists agree human-caused climate change is real) to persuade, they recommend two strategies based on new research.

The authors (an international group from the UK, Yale, and George Mason) have described their work themselves, so instead of repeating it here, here’s a link where you can see their description of what they did and even download the full paper if you like. We will focus here on their two strategies, and tell you that they showed participants either the 97% consensus graphic or the Oregon Global Warming Petition Project (a well-known climate change denial project).

The researchers thought that the 97% consensus graphic would increase belief in climate change and that the Oregon Global Warming Petition Project would decrease it. They were right on both counts, but they were surprised by how powerful the disinformation was: it cancelled out the accurate information, so there was no net effect of providing accurate information. So, being scientists, they wondered about a “vaccine” of sorts to minimize the impact of disinformation like that contained in the Oregon Petition Project.

They found two: one easy, and one more complicated but also more potent. The important thing is to deliver the “inoculation” along with the legitimate facts; this disarms the potency of the disinformation.

Strategy 1: Say this: “Some politically-motivated groups use misleading tactics to try to convince the public that there is a lot of disagreement among scientists”.

This general inoculation resulted in participants moving 6.5% toward acceptance of the climate science consensus (despite a follow-up exposure to fake news). In other words, the fake news “rebuttal” did not swing opinions back; participants were more alert to being misled.

Strategy 2: In this study, the disinformation was the Oregon petition and so they used a detailed inoculation to discredit the petition (after making the Strategy 1 statement). For example, they highlighted signatures that were fraudulent (e.g., Charles Darwin and members of the Spice Girls band) and the fact that less than 1% of those signing the petition even had backgrounds in climate science.

This detailed inoculation (when added to the general inoculation) resulted in an increase of almost 13% in general acceptance of climate change (despite a follow-up exposure to fake news). Again, the fake news “rebuttal” did not work; it was even less effective than it had been against the general inoculation alone. In other words, much like a recent recommendation from the Poynter journalism group, it isn’t always enough to say something isn’t true; sometimes you need to show why and how it isn’t true. And that requires more time and sustained attention.

The researchers comment that tobacco and fossil fuel companies have used these sorts of psychological inoculation strategies in the past to sow seeds of doubt and undermine scientific consensus in the minds of the public. They think this research tells us that the impact of disinformation can be at least “partially reduced” with this approach.

From time to time, every litigator is confronted with a situation in which it is crucial to educate jurors on the disinformation that may be used (as well as to give them information on the typical strategies used to undermine accurate information). Then, when they hear those strategies presented by opposing counsel, jurors can spot them quickly and rest assured they have not been fooled. The researchers comment that this approach works well in our politically polarized society. We think that makes sense but, as always, test it in pretrial research before you roll it out at trial!

Full text is available here.

van der Linden, S., Leiserowitz, A., Rosenthal, S., & Maibach, E. (2017). Inoculating the public against misinformation about climate change. Global Challenges. DOI: 10.1002/gch2.201600008

Comments Off on Simple Jury Persuasion: A psychology vaccine for climate change disinformation

You may have seen our blog post where we talk about research that informs whether, in patent work, you should allow jurors to examine a disputed invention up close or simply have them view it from a distance. Which strategy we recommend depends entirely on the evidence and your specific case. Today, we have another of those sorts of research articles, one we think gives insight into a way to persuade unobtrusively by using the hands of your expert witness.

In brief, these researchers examined ways that “interacting with our world” changes how we think. They asked research participants to figure out how to place 17 zebras into 4 pens so that each pen held an odd number of zebras. Some participants used iPads (the modern-day equivalent of pen and paper), while others were given physical objects to represent the pens, into which they corralled 17 small plastic zebras.

While the iPad users were unable to solve the puzzle, those who were given a chance to manipulate the small objects with their hands were able to successfully solve the problem. (The solution involves overlapping pens—like those Venn diagrams you were exposed to in high school algebra.)
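For readers who want to see how the overlap works, here is a minimal sketch (our own illustration, not the researchers’ materials): three inner pens holding 3, 5, and 9 zebras, with a fourth pen built around all of them so that it contains all 17. Each of the four pens then holds an odd number of zebras.

```python
# A toy check (our illustration, not the study's materials) of the overlapping-pens
# solution: pens A, B, and C hold 3, 5, and 9 zebras; pen D is built around all three.
zebras = set(range(17))          # zebras numbered 0-16

pen_a = set(range(0, 3))         # 3 zebras
pen_b = set(range(3, 8))         # 5 zebras
pen_c = set(range(8, 17))        # 9 zebras
pen_d = pen_a | pen_b | pen_c    # the enclosing pen: all 17 zebras

pens = {"A": pen_a, "B": pen_b, "C": pen_c, "D": pen_d}

assert pen_d == zebras                                 # every zebra is penned
assert all(len(p) % 2 == 1 for p in pens.values())     # 3, 5, 9, 17: all odd

for name, pen in pens.items():
    print(f"Pen {name}: {len(pen)} zebras (odd: {len(pen) % 2 == 1})")
```

The arithmetic itself is simple; the study’s point is that participants tended to hit on this nested arrangement only when they could physically move the pieces around.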

The researchers explain the results by saying that the idea that problem-solving occurs solely in our heads is simply incorrect. For some types of problems, we need the benefit of manipulating objects with our hands to successfully identify solutions.

There was another thing mentioned in this article we found of particular interest. We have always found creative people can sometimes make good jurors in patent or high technology cases because they are able to think outside the box and they understand the importance of intellectual property. In this study, the researchers found that creative participants were able to solve the problem faster than others when given manipulatives with which to attempt solutions.

The researchers think this approach (i.e., engaging with the material world) is “an enabling condition for conceptual change”. What that means is that when you are given objects (whether small zebras, your fingers, or something else you can use to visualize), you are more able to make the creative leaps of inference necessary to solve problems that seem impossible to resolve.

So, here is a creative leap of our own.

Consider using a document camera (like an Elmo) or a magnetic board with movable pieces in the courtroom to have your witness or inventor demonstrate how s/he solved a heretofore impossible problem.

When you are encouraging jurors to be creative, you don’t know where that creativity will take them. The uncertainty makes the strategy feel very risky, and most trial lawyers will opt to go with an approach that involves asserting a position and providing evidence that supports it.

But research strongly informs us that if a person ‘discovers’ a truth through their own internal process, they ‘own’ it much more strongly. If you test an ‘aha!’ approach to presenting the science in a focus group or mock trial you can gauge what jurors do with it, and decide whether it is fruitful for your case and your client.

Jurors in patent and high technology cases are always hungry for the “creative spark of inspiration” that resulted in an invention. If they can “see it” through the inventor’s hands, it would be interesting to learn whether they would more strongly identify with the inventor’s position.

Vallée-Tourangeau, F., Steffensen, S. V., Vallée-Tourangeau, G., & Sirota, M. (2016). Insight with hands and things. Acta Psychologica, 170, 195-205. PMID: 27569687

Comments Off on Simple Jury Persuasion: Using your expert witnesses’ hands to help persuade jurors

While you don’t want jurors to think your visual evidence was made by poorly trained technicians, here’s a study that tells us something counter-intuitive that you may find useful (we have).

It may not make obvious sense, but you also don’t want jurors to be blown away (i.e., awed, in wonder, overwhelmed by the majesty of your creation) by the videos you show them as you present a case which has scientific or heavily technical information in it.

For this to make sense to you, try to divide your hypothetical jurors into two groups: those with religious beliefs and those without religious beliefs.

When the religious are awed, they are less likely to believe in science as a credible way to understand the world.

When the non-religious are awed they are more inclined to believe in less credible scientific theories that emphasize order over randomness. [Huh?]

Researchers asked 127 undergraduate students to rate the strength of their religious beliefs, using these questions in the following areas [all based on past research]:

Continuous measures of belief in God (anchored at confident atheist and confident believer), belief in an immortal soul, familial religiosity during childhood, and change in belief in God since childhood (i.e., the degree to which the participant had become a more/less confident atheist/believer since childhood). There was also a binary forced-choice question asking whether participants had ever had an experience that convinced them of God’s existence.

Then, the participants were assigned to watch one of three five-minute videos: a neutral nature video, an awe-inducing clip (i.e., a 5-min montage of nature clips from the BBC’s Planet Earth, composed primarily of grand, panoramic shots of plains, mountains, space, and canyons), or a third clip meant to elicit amusement (a montage of comedic nature clips from the BBC’s Walk on the Wild Side).

After the videos, the participants answered a 10-item “belief in science” scale, using 6-point Likert scales ranging from “strongly disagree” to “strongly agree” (taken from Farias et al., 2013).

You will note these are not questions we can ask in voir dire (at least in most courtrooms), so we are glad these researchers asked them not just once but across three separate studies with a total of 701 participants.

Across all three studies, the researchers concluded that while awe draws theists away from scientific explanations (and increases their receptiveness to supernatural explanations), their data only tentatively suggests that the opposite is also true: that awe drives the non-religiously inclined toward science. As the researchers put it:

Indeed, it seems that awe attracts non-theists to scientific explanations to the extent that science is framed as explicitly providing order and explanation and eschewing the importance of randomness in the process…

From a litigation advocacy perspective, what this study tells us is that you want to pay attention to the videos you show jurors in a case where science and/or scientific explanations are involved. Shoot for ‘easy to watch’ and ‘informative’, rather than ‘blockbuster’. If your video inspires awe, you run the risk of the religious juror attributing the progress or process to supernatural powers (aka God), which may interfere with issues of human error or liability generally.

If you have a complex, science-related case, consider pretrial testing of visual evidence with jurors to see whether it is usefully informative, or whether it crosses into “awesome blockbuster”.

Farias, M., Newheiser, A. K., Kahane, G., & de Toledo, Z. (2013). Scientific faith: Belief in science increases in the face of stress and existential anxiety. Journal of Experimental Social Psychology, 49(6), 1210-1213. PMID: 24187384

Comments Off on Simple Jury Persuasion: Why you don’t want your trial videos to elicit awe from jurors