Archive for the ‘Simple Jury Persuasion’ Category

On July 10, 2017, we published the first part of this post on combatting mistrust in science. As we continued to read, we decided there was more for you to know about this topic so here’s a bit more information.

We wanted to share a couple of ways scientists shoot themselves in the foot when it comes to maintaining credibility. First, they think themselves more rational than the rest of us, and second, their over-the-top advocacy for science backfires by making them seem like “just another partisan group”. In research speak, this is an example of scientists being only human and, like the rest of us, falling prey to the better-than-average effect. At least one writer believes that scientists, in their rush to be “right”, seem to have forgotten that science itself is based on questioning facts.

In our first post, we mentioned Dan Kahan’s vigorous disagreement that science even has a credibility issue. As it turns out, the Pew Research Center (or rather, the US citizens they surveyed) agrees with Dan! A recent Pew report points out that public confidence in scientists has remained stable since the 1970s. The graphic here is taken directly from the Pew report and shows how public confidence in both medicine and science has remained roughly stable for decades.

While Pew says (in a widely cited finding) that public trust in institutions is lower today than it was in the mid-1970s, they also say that public confidence in both medicine and the scientific community is higher than it is in many other institutions these days.

Who has less of the public’s confidence than scientists and the medical community? Almost everyone: (in descending order) from K-12 administrators to religious leaders to the news media to business leaders and, finally, to elected officials.

So what does this mean? It likely means what it’s meant for years now.

When your case relies on science—you need an expert who is able to teach jurors at a high school level without being condescending or incomprehensible.

We’ve seen hundreds of mock jurors tune out very well-credentialed experts who were more interested in showing off their knowledge than in actually communicating.

You want someone who “looks credible” to the jurors but is also able to communicate very complex information at multiple levels, so that the audience the expert is addressing both understands the testimony and feels good about their ability to understand it.

We agree that there is a sort of anti-intellectual movement in the US today. However, that seems (at least in our experience) to be reserved for those intellectuals who speak at a level incomprehensible to the layperson.

When your witness is able to make the science applicable and relevant to jurors’ daily lives, they are not an intellectual elitist.

They are, instead, a credible witness who helps jurors understand important issues and render a just decision in a confusing situation.

Comments Off on Simple Jury Persuasion: Combatting mistrust in science [Part 2]

You’ve likely seen a lot about the high level of mistrust of science in the past few years. Not everyone believes there actually is a science mistrust issue (see this post from Dan Kahan at Cultural Cognition blog) but for a non-problem it certainly gets a lot of coverage! First, here’s a bit of review of a small sampling of the recent “nobody trusts science” literature.

Pew Research published a 2015 report on the areas in which science has increasing difficulty being seen as credible. While Pew is objectively reporting survey data, many of the “science mistrust” stories are written by the very people concerned about the issue—scientists and science supporters. The emotionally heated debate has been tweeted about, although with the wrong URL for the article, which is actually here (perhaps an error due to emotionality?)

This 2014 article from the Nature website tells us how the mistrust of science came about (and what sorts of behavior strengthen mistrust)—but the real reason we are including it here is its reference list of multiple articles on the mistrust of science. Even NPR gets into the fray by suggesting that science should own past errors and improve its processes and procedures to gain the public’s trust. And finally, here’s a recent workshop run by the National Academies of Sciences, Engineering, and Medicine that gives you current information (2017) on the controversy around science and how scientists can best respond.

So—with all these scientists writing about how no one trusts them—do any of them offer ideas on what to do? Actually, yes they do but you will have to judge for yourself how on-target they are with their suggestions.

Make your science “different” 

This post from Psychology Today discusses the general distrust of “Big Science” [the entire amorphous universe of science findings] but also uses an article we blogged about here to help you figure out how to explain things differently to your jurors. (Other bloggers have written about this article as well.) Here’s part of what Psychology Today has to say:

People differ in their beliefs about the precision and certainty of different sciences in general. These general beliefs affect at least some people’s judgments about whether that science is worthwhile and whether it should be funded by the public. However, group differences [snip] disappear when people start to focus on specific research rather than the science in general.

This pattern of results suggests that researchers may want to start by describing particular studies to people in order to help them understand the research that gets done in science. Then, they should relate the specific findings back to the area of science that it comes from to help people change their general beliefs about the quality of work done in those sciences. In this way, the scientific community can help the broader public to see the benefit and value of the research that gets done.

We see these statements as the province of your expert witness. Use someone who “looks like” a scientist and has some social skills. Make your “science” different from science “in general”. In addition, see our post on the same article for more litigation-specific recommendations (e.g., comfort, curiosity, and counter-intuitive surprises).

Use a “gateway belief” to combat mistrust in science

Scientific American published an article written by a researcher who thinks a gateway belief (in this case, education on scientific consensus) is a terrific way to combat mistrust in science. The article includes a statement describing just what scientific consensus means that carries incredible visual imagery:

“Imagine reading a road sign that informs you that 97% of engineers have concluded the bridge in front of you is unsafe to cross.”

As it happens, we also blogged about this article soon after it was published. Our takeaway from the original research was this: “back up your assertion with facts, people will be persuaded despite pre-existing beliefs and despite their political affiliation”. And, as mentioned earlier, Dan Kahan over at Cultural Cognition blog vigorously disagrees with the original research.

This same research group wrote again in 2017 on their ideas about gateway beliefs (and mentioned Kahan’s disagreement briefly but did not give it much attention). This time they described their work as a “psychological inoculation” against “climate change misinformation”. We also blogged about that article here:

The strategy recommended for use against ‘science deniers’ has, in well-documented cases, also been used successfully to support science denial. The researchers comment that tobacco and fossil fuel companies have used these sorts of psychological inoculation strategies in the past to sow seeds of doubt and undermine scientific consensus in the minds of the public. They think this research tells us that the impact of disinformation can be at least “partially reduced” with this approach.

From time to time, every litigator is confronted with a situation in which it is crucial to educate jurors on the disinformation that may be used (as well as giving them information on typical strategies used to undermine accurate information). Then, when they hear the common strategies presented by opposing counsel, they can spot them quickly and rest assured they have not been fooled.

“You do not have a special corner on truth”

And finally, Popular Mechanics (of all places) puts in a plug for this amazing commencement address from June 2016.

“On June 10th, the New Yorker Staff Writer, surgeon, and medical researcher Atul Gawande delivered a commencement address to the graduating class at the California Institute of Technology on the importance of scientific thinking. Gawande discussed the rise in anti-science sentiment and how to combat the resistance to facts and evidence we’ve seen around issues like vaccines and climate change.”

You can read the full text of this commencement address at the New Yorker or you can watch the 17.5 minute video on YouTube.

Earlier, we mentioned the workshop run by the National Academies of Sciences, Engineering, and Medicine. One of their recommendations was to make the science relevant to a practical problem. This article from the Atlantic (published June 24, 2017) on why the extreme heat in Phoenix, Arizona, was grounding planes is a perfect example of how science can help us understand the world around us. From a litigation advocacy perspective, you can see that mistrust of science is strongly connected to fear (which we blogged about last week).

Use a friendly, credible, and trustworthy expert witness (see if Bill Nye is available…) who teaches jurors what opposing counsel’s expert will say and why it doesn’t make sense, and who shows how “your science” is different from “Big [scary] Science”. It may help your jurors embrace the sound science you are employing, even while remaining unsure about the realm of “Big Science” that they have been taught to fear.

Comments Off on Simple Jury Persuasion: Combatting mistrust in science [Part 1]

Seven years ago, we blogged about a disruptive persuasion strategy meant to catch the listener off guard and thus, elicit cooperation. Four years ago, we blogged about a negotiation strategy to help you more successfully negotiate prices (from salaries to farmer’s market produce).

Now, in a new meta-analysis, the strategy is called the pique technique (which is very catchy). The pique technique is a persuasion strategy believed to work by raising the listener’s curiosity, disrupting the automatic “No”, and encouraging the listener to engage with the asker. Most people respond to an unusual request like “47 cents” by asking, “What is it for?”

We are surprised this technique has not caught on with more US panhandlers since it was first written about in 1994. The researchers tell us one possible reason why—we’re living in the wrong country:

“the technique worked significantly better [snip] when a smaller amount was requested, when the reason for the request was included, and when the technique was used in France.” Truly. (And just FYI, we are all invited to move to France by their President.)

The point is that you want to use this for smaller requests only and include the reason why you want this odd amount of money. And the most important point in terms of success? According to this meta-analysis, you should live (or panhandle) in France. We also want to clarify that there have only been six studies done on the pique technique in the past twenty-five years, so we are not talking about a high number of studies—maybe it would work well in Scotland too.

Alex Fradera (over at BPS Research Digest) pointed out there is no data to show that the curious folks (“What is it for?”) actually gave money more often, but that when the asker explained upfront what the money was for, the listener was more likely to give. However, it only worked for amounts up to a dollar, not for more.

Oddly enough, we have seen a trend over the past five years or so toward “funny signs” used by panhandlers, as well as an apparent religious renaissance among panhandlers more recently. Both of these strategies use the same sort of disruption tool (at least until they become ubiquitous, like “God Bless” on panhandler signs). It is a strategy worth exploring for larger amounts.

From a litigation advocacy perspective, we think there are still tools here to use. As we said in 2013, it is worthwhile to consider being very, very specific in your damages requests. It just sounds more credible, and makes you appear thoughtful and prepared.

The researchers believe that when you lead with a precise number, in any sort of negotiation, you send the message that you are prepared, informed, and knowledgeable about the value of the item for which you are offering money or of salaries in your field. This precise offer leads the recipient of your opening bid to counter with a number that is higher (when negotiating salaries or bartering over CraigsList items) than you might get if you open with a round number.

It’s an interesting piece of work, with applications to salary, mediation offers, jury charge “asks”, auto purchases, and even bartering at the farmer’s market. The researchers recommend bumping a $50 item to $49.75 (if you are the seller) or offering $49.25 if you are the buyer. They also comment that the research “highlight[s] how a lack of awareness about the power of precision may put the recipient of a precise offer at a disadvantage”. It’s intriguing research to experiment with in your day-to-day life–whether for mediation or that nice bunch of carrots next to the bright red radishes.
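The researchers’ rounding rule is simple enough to sketch in a few lines of Python. This is a toy illustration only, not anything from the paper itself; the $0.25 and $0.75 offsets simply mirror their $49.75/$49.25 example:

```python
def precise_offer(round_amount: float, role: str) -> float:
    """Turn a round opening number into a precise one.

    Illustrative sketch: sellers ask slightly below the round figure
    (e.g., $50 -> $49.75), while buyers open a bit further below it
    (e.g., $50 -> $49.25), mirroring the example from the research.
    """
    if role == "seller":
        return round(round_amount - 0.25, 2)
    if role == "buyer":
        return round(round_amount - 0.75, 2)
    raise ValueError("role must be 'seller' or 'buyer'")


print(precise_offer(50, "seller"))  # 49.75
print(precise_offer(50, "buyer"))   # 49.25
```

The exact offsets are beside the point; what matters, per the research, is that the number not be round.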

We have seen many mock jurors become numb to damages award amounts in the many millions of dollars. But a request that asks for $47,976,428 looks unusual and may be seen as more credible in deliberations.

Seyoung Lee and Thomas Hugh Feeley. (2017). A meta-analysis of the pique technique of compliance. Social Influence, 12 (1).

Comments Off on Simple Jury Persuasion: The Pique Technique (The Panhandler’s Persuasion Tool) 

Last year, we posted several times on CRISPR (the revolutionary gene-editing tool) as an example of how to explain something very complex to the novice juror. We’ve also talked a lot about how expert witnesses are not present in the courtroom to show jurors and parties how smart they are—but rather, to educate the jurors.

When we say we want expert witnesses to teach at the high school level, we often get confusion from experts as to how this can possibly be done. Typically, an expert technical witness’s expertise is in a specific area, and it is hard for them to imagine how to speak at various levels without being condescending. Sometimes they even say they “know how” to talk to people and that recommendations are unnecessary. Having seen a good many expert witnesses, we would beg their indulgence (and often do) for just a few minutes. Those few minutes begin with a brief video recording of them responding to a cross-examination. Reviewing the video often softens their insistence that they are ready to go.

WIRED has done a terrific series of videos on just how to explain and educate across different levels of education and information. Here’s one where a biologist explains CRISPR to a child, a teen, a college student, a graduate student, and a gene-editing professional. This video is a terrific example of why we want you to communicate at the high school level. The expert communicates respectfully and in a way that encourages the various people in the video to rise to their highest level of performance. They all seem to leave happy, feeling they understand more than when they came in—even though it is a highly technical and complex topic.

One of the things we really like about this video is that the scientist/witness is relaxed, casual, conversational, and respectful of the knowledge level of his (in this case) audience. He demonstrates a terrific way to engage the audience by taking what they already know and elevating their level of understanding just a bit higher.

When your subject matter is technical and esoteric, you will typically find scientists online trying to explain it in simple and often highly visual terms. Take CRISPR, for example. This tool has exploded into public awareness in the past 18 months or so. There are many examples of authors using a CRISPR theme in their work, and even television shows planned that will feature CRISPR in scary (most likely) and exciting (perhaps) ways. But if you are looking for accuracy and educational tools on CRISPR, they are everywhere online.

You can find infographics on CRISPR. You can find videos on CRISPR (at this writing, YouTube had almost 59,000 videos about CRISPR). And you can find basic instructional articles (more than 6M in our search) focused on what CRISPR is for the novice. In a particularly scary turn, you can also find home DIY CRISPR kits so you can experiment with gene editing at home.

Look around. What helps you learn? What would help a high schooler learn and stay engaged? That is the level to shoot for—jurors who know anything about the topic are typically not on the jury, so you need to take the rest back to high school biology and hold their interest.

Comments Off on Simple Jury Persuasion: Helping Expert Witnesses Teach Effectively at Trial

It’s been a while since we’ve had a new cognitive bias to share with you. Previously we’ve blogged on many different biases, and here are a handful of those posts. Today’s research paper combines three biases: the better-than-average effect and confirmation bias (both of which we’ve blogged about before), plus the endowment effect. The endowment effect is the “(irrational) added value” we place on things just because they belong to us and not to someone else.

So, today’s research was described over at BPS Digest (a reliable source for accurate summaries), and it’s a bit odd. For the sake of brevity, here’s how BPS Digest describes the study (they are based in England, so they don’t spell everything the way we do in the States; we added emphasis to important points with bold font):

Across three studies, the researchers asked hundreds of participants to imagine a fictional planet in a distant solar system, inhabited by various creatures, some of which are predators and others prey. Focusing on two of the creatures on the planet – Niffites and Luppites – the participants were told to imagine that they (that is, the participant himself or herself) held one of two different beliefs: Some were told that they had a theory that the Niffites were the predators and the Luppites were their prey, while others were told to assume that somebody called “Alex” had this theory. This background scenario was chosen to be neutral and unconnected to existing beliefs, and the hypothetical “ownership” of the theory by some of the participants was intended to be as superficial and inconsequential as possible.

Next, the researchers presented the participants with a series of seven facts relevant to the theory. The first few were mildly supportive of the theory (e.g. Niffites are bigger), but the last few provided strong evidence against (e.g. Luppites have been observed eating the dead bodies of Niffites). After each piece of evidence, the participants were asked to rate how likely it was that the theory was true or not.

The way the participants interpreted the theory in the light of the evidence was influenced by the simple fact of whether they’d been asked to imagine the theory was theirs or someone else’s. When it was their own theory, they more stubbornly persisted in believing in its truth, even in the face of increasing counter-evidence.

This spontaneous bias toward one’s own theory was found across the studies: when different names were used for the creatures (Dassites and Fommites); whether testing happened online or in groups in a lecture room; regardless of age and gender; and also when an additional control condition stated that the theory was no one’s in particular, as opposed to being Alex’s. The last detail helps clarify that the bias is in favour of one’s own theory rather than against someone else’s.

The ownership of the theory made the difference in belief persistence. We are reluctant to discard ideas we think of as our own, even when the evidence contradicts them.

From a litigation advocacy perspective, we have talked a lot about how facts don’t matter (still true in 2017, as our most recent post on this topic explains) when it comes to the personal beliefs and emotional reactions of jurors. In their paper, the authors of the SPOT effect research say this bias cuts across gender and age and that it “reflects a pro-self as opposed to anti-other bias”. They also comment on how easy it was to create allegiance to a theory (“phenomena that require surprisingly little to bring about”)—just by saying “I have a theory”, participants stood by the belief even in the face of evidence to the contrary.

We wonder how much stronger (and more emotional) this bias would be if the core values and beliefs held by individual jurors were challenged in a case narrative. While this bias has only just been named, it is why (for years now) we have recommended that our client-attorneys avoid hot-button issues and instead focus on incorporating universal values into their case narrative.

You are less likely to get knee-jerk reactivity from jurors who have polarized political positions when you use universal values to frame your case narrative (and stay away from unnecessary controversies).

Gregg, A. P., Mahadevan, N., & Sedikides, C. (2017). The SPOT effect: People spontaneously prefer their own theories. Quarterly Journal of Experimental Psychology, 70(6), 996–1010. PMID: 26836058

Comments Off on Simple Jury Persuasion: The SPOT (Spontaneous Preference for Own Theories) effect