Archive for the ‘Simple Jury Persuasion’ Category

Seven years ago, we blogged about a disruptive persuasion strategy meant to catch the listener off guard and thus, elicit cooperation. Four years ago, we blogged about a negotiation strategy to help you more successfully negotiate prices (from salaries to farmer’s market produce).

Now, in a new meta-analysis, the strategy has a name: the pique technique (which is very catchy). The pique technique is a persuasion strategy believed to work by piquing the listener’s curiosity, thus disrupting the automatic “No” and encouraging the listener to engage with the asker. Most people respond to an unusual request like “47 cents” by asking, “What is it for?”

We are surprised this technique has not caught on with more US panhandlers since it was first written about in 1994. The researchers tell us one possible reason why—we’re living in the wrong country:

“The technique worked significantly better [snip] when a smaller amount was requested, when the reason for the request was included, and when the technique was used in France.” Truly. (And just FYI, we are all invited to move to France by their President.)

The point is to use this technique for smaller requests only and to include the reason why you want this odd amount of money. And the most important factor in success? According to this meta-analysis, you should live (or panhandle) in France. We also want to note that only six studies have been done on the pique technique in the past twenty-five years, so we are not talking about a large evidence base. Maybe it would work well in Scotland too.

Alex Fradera (over at BPS Research Digest) pointed out there is no data showing that the curious folks (“What is it for?”) actually gave money more often, but noted that when the asker explained upfront what the money was for, the listener was more likely to give. Even then, the technique only works for amounts up to a dollar.

Oddly enough, we have seen a trend over the past five years or so toward “funny signs” used by panhandlers, as well as an apparent religious renaissance among panhandlers more recently. Both of these strategies use the same sort of disruption tool (at least until they become ubiquitous, as “God Bless” has on panhandler signs). It may be a strategy worth exploring for larger amounts.

From a litigation advocacy perspective, we think there are still tools here to use. As we said in 2013, it is worthwhile to consider being very, very specific in your damages requests. It just sounds more credible, and makes you appear thoughtful and prepared.

The researchers believe that when you lead with a precise number in any sort of negotiation, you send the message that you are prepared, informed, and knowledgeable about the value of the item you are offering money for (or about salaries in your field). A precise opening offer leads the recipient of your bid to counter with a higher number (when negotiating salaries or bartering over Craigslist items) than you might get with a round initial request.

It’s an interesting piece of work, with applications to salary negotiations, mediation offers, jury charge “asks”, auto purchases, and even bartering at the farmer’s market. The researchers recommend adjusting a $50 item to $49.75 (if you are the seller) or offering $49.25 if you are the buyer. They also comment that the research “highlight[s] how a lack of awareness about the power of precision may put the recipient of a precise offer at a disadvantage”. It’s intriguing research to experiment with in your day-to-day life, whether in mediation or over that nice bunch of carrots next to the bright red radishes.

We have seen many mock jurors become numb to damages award amounts in the many millions of dollars. But a request that asks for $47,976,428 looks unusual and may be seen as more credible in deliberations.

Lee, S., & Feeley, T. H. (2017). A meta-analysis of the pique technique of compliance. Social Influence, 12(1).


Comments Off on Simple Jury Persuasion: The Pique Technique (The Panhandler’s Persuasion Tool) 

Last year, we posted several times on CRISPR (the revolutionary gene-editing tool) as an example of how to explain something very complex to the novice juror. We’ve also talked a lot about how expert witnesses are not present in the courtroom to show jurors and parties how smart they are—but rather, to educate the jurors.

When we say we want expert witnesses to teach at the high school level, experts are often confused about how this can possibly be done. Typically, an expert technical witness’s expertise is in a narrow area, and it is hard for them to imagine speaking at various levels without being condescending. Sometimes they even say they “know how” to talk to people and that recommendations are unnecessary. Having seen a good many expert witnesses, we would beg their indulgence (and often do) for just a few minutes. Those few minutes begin with a brief video recording of them responding to a cross-examination. Reviewing the video often softens their insistence that they are ready to go.

WIRED has produced a terrific series of videos on how to explain and educate across different levels of education and background knowledge. Here’s one in which a biologist explains CRISPR to a child, a teen, a college student, a graduate student, and a gene-editing professional. This video is a terrific example of why we want you to communicate at the high school level. The expert communicates respectfully and in a way that encourages each person in the video to rise to their highest level of performance. They all seem to leave happy, feeling they understand more than when they came in, even though it is a highly technical and complex topic.

One of the things we really like about this video is that the scientist/witness is relaxed, casual, conversational, and respectful of his audience’s knowledge level. He demonstrates a terrific way to engage an audience: take what they already know and elevate their understanding just a bit higher.

When your subject matter is technical and esoteric, you can now typically find scientists explaining it online in simple and often highly visual terms. Take CRISPR, for example. The tool has exploded into public awareness in the past 18 months or so. Many authors are using a CRISPR theme in their work, and there are even television shows planned that will feature CRISPR in scary (most likely) and exciting (perhaps) ways. But if you are looking for accuracy and educational tools on CRISPR, they are everywhere online.

You can find infographics on CRISPR. You can find videos on CRISPR (at this writing, YouTube had almost 59,000 videos about CRISPR). And you can find basic instructional articles (more than 6M in our search) explaining what CRISPR is for the novice. In a particularly scary turn, you can also find DIY CRISPR kits so you can experiment with gene editing at home.

Look around. What helps you learn? What would help a high schooler learn and be engaged? That is the level you want to shoot for—jurors who know anything about the topic are typically not on the jury so you need to take them back to high school biology and hold their interest.


Comments Off on Simple Jury Persuasion: Helping Expert Witnesses Teach Effectively at Trial

It’s been a while since we’ve had a new cognitive bias to share with you. Previously we’ve blogged on many different biases, and here are a handful of those posts. Today’s research paper combines three biases: two we’ve blogged about before (the better-than-average effect and confirmation bias) and a third, the endowment effect. The endowment effect is the “(irrational) added value” we place on things just because they belong to us and not to someone else.

So, today’s research was described over at BPS Digest (a reliable source for accurate summaries), and it’s a bit odd. For the sake of brevity, here’s how BPS Digest describes the study (they are based in England, so they don’t spell everything the way we do in the States; we added bold font to emphasize important points):

Across three studies, the researchers asked hundreds of participants to imagine a fictional planet in a distant solar system, inhabited by various creatures, some of which are predators and others prey. Focusing on two of the creatures on the planet – Niffites and Luppites – the participants were told to imagine that they (that is, the participant himself or herself) held one of two different beliefs: Some were told that they had a theory that the Niffites were the predators and the Luppites were their prey, while others were told to assume that somebody called “Alex” had this theory. This background scenario was chosen to be neutral and unconnected to existing beliefs, and the hypothetical “ownership” of the theory by some of the participants was intended to be as superficial and inconsequential as possible.

Next, the researchers presented the participants with a series of seven facts relevant to the theory. The first few were mildly supportive of the theory (e.g. Niffites are bigger), but the last few provided strong evidence against (e.g. Luppites have been observed eating the dead bodies of Niffites). After each piece of evidence, the participants were asked to rate how likely it was that the theory was true or not.

The way the participants interpreted the theory in the light of the evidence was influenced by the simple fact of whether they’d been asked to imagine the theory was theirs or someone else’s. When it was their own theory, they more stubbornly persisted in believing in its truth, even in the face of increasing counter-evidence.

This spontaneous bias toward one’s own theory was found across the studies: when different names were used for the creatures (Dassites and Fommites); whether testing happened online or in groups in a lecture room; regardless of age and gender; and also when an additional control condition stated that the theory was no one’s in particular, as opposed to being Alex’s. The last detail helps clarify that the bias is in favour of one’s own theory rather than against someone else’s.

The ownership of the theory made the difference in belief persistence. We are reluctant to discard ideas we think of as our own, even when the evidence contradicts them.

From a litigation advocacy perspective, we have talked a lot about how facts don’t matter (still true in 2017, as our most recent post on this topic explains) when it comes to jurors’ personal beliefs and emotional reactions. In their paper, the authors of the SPOT effect research say this bias cuts across gender and age and that it “reflects a pro-self as opposed to anti-other bias”. They also comment on how easy it was to create allegiance to a theory (“phenomena that require surprisingly little to bring about”): simply telling participants “you have a theory” led them to stand by the belief even in the face of evidence to the contrary.

We wonder how much stronger (and more emotional) this bias would be if the core values and beliefs held by individual jurors were challenged in a case narrative. While this bias has only just been named, it is why (for years now) we have recommended that our client-attorneys avoid hot-button issues and instead incorporate universal values into their case narratives.

You are less likely to get knee-jerk reactivity from jurors who have polarized political positions when you use universal values to frame your case narrative (and stay away from unnecessary controversies).

Gregg, A. P., Mahadevan, N., & Sedikides, C. (2017). The SPOT effect: People spontaneously prefer their own theories. Quarterly Journal of Experimental Psychology, 70(6), 996-1010. PMID: 26836058


Comments Off on Simple Jury Persuasion: The SPOT (Spontaneous Preference for Own Theories) effect

It is hard to believe that more than two decades have passed since Time magazine’s controversial 1994 cover featuring OJ Simpson with his skin intentionally darkened. People were so upset that the magazine’s managing editor issued a public apology for publishing the cover photo. Today, we are covering very recent research telling us the exact same thing is still going on in a wide variety of publications.

The researchers looked at stories on both celebrities and politicians and found that when the story content was positive, the skin tone in the photos was lighter, and when the story content was negative, the skin tone was darker. This was true regardless of the subject’s race or gender: darker-toned photos accompanied negative stories. The researchers believe there is a pervasive assumption that darkness and badness go together, and, in truth, psychology researchers have known this for years. But, say the researchers, this is the first time the reverse has been demonstrated in research: when we hear about an evil act, we assume it was more likely committed by someone with darker skin.

The researchers describe the theoretical background for their work as the “black is bad” literature (the historical associations of darkness with negativity and lightness with positivity). The researchers cite multiple examples of this effect. For example, they point to research showing that those who believed Barack Obama’s likeness was best captured in “artificially darkened photos” both evaluated him more negatively and were less likely to have voted for him in the 2008 election. Another example (among many) was that when professional athletes wore black uniforms, spectators perceived them as behaving more aggressively and referees responded more harshly with penalties.

So, knowing the long history of “black is bad”, the researchers wanted to see if “bad is black”—in other words, “Do people represent actors who commit wrongful acts as having darker skin than actors who commit conspicuously moral acts?”. And, to cut to the chase, the answer is a resounding Yes.

Here are some of their findings:

Across 6 studies, they found people described a person’s skin tone as darker when a morally bad act was committed than when a morally good act was committed.

The “bad is black” effect was associated more strongly with those who reported negative biases against dark-skinned minority groups or those who held beliefs pairing darkness with badness.

Interestingly, both groups displayed bias: those who strongly associated darkness with negativity showed the “bad is black” effect, while those with a low tendency to associate darkness with negativity displayed a “good is black” effect.

The researchers say it is important for us to understand how the “bad is black” effect emerges so we can sort out how to disrupt it. They express a concern for eye-witness identification (especially the problem-prone area of cross-racial identification). It is certainly possible that an eye-witness could “misremember” the perpetrator as having darker skin and therefore choose innocent darker-skinned people from a line-up rather than the guilty lighter-skinned alternative.

From a litigation advocacy perspective, this is another point on which you may want to educate jurors with regard to the accuracy of eye-witness identification. It is also a point to be aware of when crafting case narrative. The researchers found that “regardless of race or gender”, people with darker skin were seen as more guilty whether they were guilty or not. If one of the parties (regardless of race or gender) has darker skin, jurors are more likely to see that party as “bad” in whatever form that may take in your trial.

Alter, A., Stern, C., Granot, Y., & Balcetis, E. (2016). The “Bad Is Black” Effect. Personality and Social Psychology Bulletin, 42(12), 1653-1665. DOI: 10.1177/0146167216669123


Comments Off on Simple Jury Persuasion: The “bad is black” effect 

Disinformation is everywhere you turn these days, so we need good tools to debunk those “alternative facts”. Last year we wrote about a strategy to combat distrust of science using the concept of the “gateway belief”. That paper received criticism from a well-known law professor over at the Cultural Cognition blog; in a new paper, the same research team obliquely mentions the criticism and then sets it aside in favor of their new research. They are (again) writing about disinformation on climate change, but rather than relying only on the gateway belief (97% of scientists agree that human-caused climate change is a thing) to persuade, they recommend two strategies based on new research.

The authors (an international group from the UK, Yale, and George Mason) have described their work themselves, so instead of repeating it here, here’s a link where you can see their description of what they did and even download the full paper if you like. We will focus here on their two strategies, and note that they showed participants either the 97% consensus graphic or the Oregon Global Warming Petition Project (a well-known climate change denial project).

The researchers expected that the 97% consensus graphic would increase belief in climate change and that the Oregon Global Warming Petition Project would decrease it. They were right on both counts, but they were surprised by how powerful the disinformation was: it cancelled out the accurate information, so there was no net effect of providing accurate information at all. So, being scientists, they wondered about a “vaccine” of sorts to minimize the impact of disinformation like that contained in the Oregon Petition Project.

They found two: one easy, and one more complicated but also more potent. The important thing is to deliver the “inoculation” along with the legitimate facts; this disarms the potency of the disinformation.

Strategy 1: Say this: “Some politically-motivated groups use misleading tactics to try to convince the public that there is a lot of disagreement among scientists”.

This general inoculation resulted in participants moving 6.5% toward acceptance of the climate science consensus (despite a follow-up exposure to fake news). In other words, the fake news “rebuttal” did not swing opinions back; participants were more alert to being misled.

Strategy 2: In this study, the disinformation was the Oregon petition, so the researchers used a detailed inoculation to discredit it (after making the Strategy 1 statement). For example, they highlighted fraudulent signatures (e.g., Charles Darwin and members of the Spice Girls) and the fact that less than 1% of the signatories had any background in climate science.

This detailed inoculation (added to the general inoculation) resulted in an almost 13% increase in general acceptance of climate change (despite a follow-up exposure to fake news). Again, the fake news “rebuttal” failed, and it was even less effective than it had been against the general inoculation alone. In other words, much like a recent recommendation from the Poynter journalism group, it isn’t always enough to say something isn’t true; sometimes you need to show why and how it isn’t true. And that requires more time and sustained attention.

The researchers comment that tobacco and fossil fuel companies have used these sorts of psychological inoculation strategies in the past to sow seeds of doubt and undermine scientific consensus in the minds of the public. They think this research tells us that the impact of disinformation can be at least “partially reduced” with this approach.

From time to time, every litigator is confronted with a situation in which it is crucial to educate jurors on the disinformation that may be used (as well as on the typical strategies used to undermine accurate information). Then, when jurors hear those strategies presented by opposing counsel, they can spot them quickly and rest assured they have not been fooled. The researchers comment that this strategy works well in our politically polarized society. We think that makes sense but, as always, test it in pretrial research before you roll it out at trial!


van der Linden, S., Leiserowitz, A., Rosenthal, S., & Maibach, E. (2017). Inoculating the Public against Misinformation about Climate Change. Global Challenges. DOI: 10.1002/gch2.201600008


Comments Off on Simple Jury Persuasion: A psychology vaccine for climate change disinformation