Archive for November, 2011

Eyewitness identification is notoriously inaccurate, and yet jurors rely on it heavily. Those working within the system decry this reliance, but few remedies have been proposed. Until now.

Multiple studies have shown that jurors are unable to distinguish which eyewitnesses testifying at trial were correct in their identifications. Researchers thought juror inability to discriminate between accurate and inaccurate eyewitnesses might be due to their assessment of witness confidence on the stand. Since the actual identification is typically made months or years before trial, the eyewitness is often just testifying to what they said before, not to what they can say definitively from seeing the defendant in the courtroom. Opinions and memories (be they accurate or false) tend to crystallize over time, say these researchers, so most eyewitnesses are going to be confident on the stand. Even if the witness is asked how long it took them to identify the defendant or how long it has been since the incident, jurors must rely on witness recall rather than their own assessment.

So the researchers planned a fairly involved experiment: a mock trial in which jurors either viewed only the courtroom examination of the eyewitness, or viewed the courtroom examination plus a video of the initial identification of the defendant (live lineup or photo lineup). Participants then filled out questionnaires including these questions, among others:

How likely is it that the witness identified the actual perpetrator as opposed to an innocent person?

If this witness’ testimony were the only evidence against the defendant, how likely would you be to convict the defendant?

Please rate the witness’ confidence about the identification. 

How sure are you about this [verdict] decision?

Jurors who saw both the courtroom examination and the initial identification video were better able to distinguish accurate from inaccurate eyewitnesses. Obviously, they found the defendant guilty more often when the eyewitness was accurate than when the eyewitness was inaccurate, and they convicted less often overall than jurors who were not shown the video of the original identification. [In fact, when jurors were shown only the courtroom testimony of the eyewitness, there was no discrimination between accurate and inaccurate witnesses and no difference in the number of convictions based on eyewitness accuracy.]

The researchers recommend that all eyewitness descriptions of the perpetrator and the original identification task (whether lineup or photographs) be video recorded as trial evidence. They believe having videotaped identifications will help multiple members of the criminal justice system: it could help police recall details about identification tasks while preparing for trial, could aid prosecutors in deciding whether to take cases with weak identifications to trial, and could help jurors assess the credibility of eyewitness identifications.

This may well be a promising avenue for further exploration. It is discouraging that, according to the literature review, videotaping witness identifications of defendants was recommended as early as 1992; still, the increase in concern about eyewitness accuracy over the last decade could help with implementation of this sort of strategy.

Reardon, M., & Fisher, R. (2011). Effect of viewing the interview and identification process on juror perceptions of eyewitness accuracy. Applied Cognitive Psychology, 25(1), 68-77. DOI: 10.1002/acp.1643

Currently, you can see the full text of this article here.

 

We made the ABA Blawg 100 list for the second year! Please take a minute to vote for us HERE under the Trial Practice category. 


Don’t try to bend me to your will!

Monday, November 28, 2011
posted by Douglas Keene

Remember Uri Geller of spoon-bending fame? He could bend spoons as if they were Gumby. Although he now describes himself as a “mystifier”, he initially described himself as a psychic. And many believed. Their minds were changed by their observations of Mr. Geller.

It isn’t so easy in the realm of things that really matter. When confronted with new information on social issues, like the economy or the environment, the less we know, the more we resist becoming well-informed! The more pressing and important the issue, the stronger our resistance to learning. The researchers refer to this as “ignorance is bliss”. They studied the effect in two countries: Canada and the United States.

Researchers gave either a complex or simple description of the economy to Canadian adults (58 participants, average age 42). Those who were given the complex description saw themselves as more helpless to get through the economic downturn, had more dependence on/trust in the government to manage the economy and had less desire to learn more about the issue.

American subjects (163 participants, average age 32) gave their opinions on natural resource management and then read a statement declaring the US had less than forty years of oil supplies. Similar results were found–those who saw themselves as lacking knowledge about oil supplies both avoided negative information about the issue and became even more reluctant to learn more when the need was urgent.

The researchers say this is a result of 1) a lack of information about an issue, 2) increased trust in the government to handle it, and then 3) avoidance of information that would challenge that trust in the government. They concluded that people simply do not want to understand pressing social issues. We had to read this one twice. The research findings are certainly inconsistent with our experience. We believe the researchers may be oversimplifying.

First, we see fewer and fewer mock jurors who trust the government. Whether they are well versed in complex issues or not, they are increasingly skeptical.

Second, we’ve done multiple mock trials and concept focus groups on both the economy and the energy industry and have learned there are pretty straightforward ways to help jurors believe they can understand (as well as helping them actually understand) what they need to grasp to make decisions on complex cases.

The researchers do mention that “participants who felt an issue was ‘above their heads’ reported an increased desire to adopt an ‘ignorance is bliss’ mentality toward that issue.” And perhaps therein lies the difference.

Academic research is all about describing behavior within a certain set of variables; researchers make no attempt to shape or modify that behavior. In their eyes, it is what it is. Applied research, such as the focus groups we conduct for clients, explores where people start getting lost, what they do with their confusion, and how we can satisfy their need to know through a better story, a better witness, or better evidence.

Litigation advocacy involves empowering people to understand more than they believe they can understand, so they can feel confident in their decision-making. We make repeated attempts to help real people understand complex information and feel competent to make judgments on it.

We prefer our work to theirs!

Shepherd, S., & Kay, A.C. (2011). On the perpetuation of ignorance: System dependence, system justification, and the motivated avoidance of sociopolitical information. Journal of Personality and Social Psychology. PMID: 22059846


We try to keep up with the persuasion literature. You may recall our post on the effect of tilting your head. This one is about tilting your entire body and how that will influence your decision-making. If this strikes anyone as being petty and superficial, we are totally sympathetic. On the other hand, if making your story more persuasive is worthwhile, even through some pretty superficial techniques, we are willing to learn it and to share the news.

There is a theory called the ‘mental number line’ that says “people mentally place magnitudes (symbolized by numbers) on a line, with small numbers on the left and large numbers on the right”. While we had never heard of this theory before, it makes intuitive sense, unless you are Israeli and read from right to left. According to the researchers, there are also studies that indicate we associate our left hand or our left visual field with smaller numbers and the right hand and right visual field with larger numbers. We can buy that one too. But here is where it gets a little strange.

Researchers decided to test whether leaning slightly to the left would result in smaller numerical estimates for the same stimuli than leaning slightly to the right. So they rigged a Wii Balance Board to force people to lean slightly off center (to the right or left) in order to get feedback indicating they were standing up straight. Then they asked them to estimate answers to questions where the range of answers would be between 1 and 10. None of the participants knew any of the accurate answers to the questions (so researchers knew they were estimating).

And you likely will not be shocked to find out that participants who leaned slightly to their left made smaller estimates in their answers than those who leaned slightly to their right on the Wii Balance Board. None of the participants was aware of the slight lean, yet they still made lower estimates when leaning to the left.

So what does this mean for you? For one thing, it’s another odd but interesting bit of trivia. For another, you may be interested in using the knowledge in the courtroom. In addition to the ‘tilt your head’ post, we’ve also talked about placing your exhibits to one side or the other to gain an advantage in jurors’ memory.

If you represent the plaintiff, you may want to place your exhibits in jurors’ right visual field. You could also consider rotating the text on the exhibit just a bit so it leans slightly to the right. (These researchers had the participants off center by a mere 2% shift of their weight, a very small shift.) The idea is that when tilting to the right, jurors will ‘see’ the need to estimate larger numbers for your client.

If you represent the defense, place your exhibits on the left and, again, consider leaning the damages text slightly to the left. Obviously, you do not want the tilted text to be disturbing to the viewer, so be careful to tilt it only very slightly. Your hope is that jurors will ‘see’ the damages against your client as being smaller.

Who knows whether this will make a difference in juror deliberations. But if it does, aren’t you glad you knew?

Eerland, A., Guadalupe, T., & Zwaan, R. (2011). Leaning to the left makes the Eiffel Tower seem smaller: Posture-modulated thought. Psychological Science.


“Myside bias”: I was wrong and so are you

Wednesday, November 23, 2011
posted by Douglas Keene

We’ve commented on economists writing about psychological principles before on this blog. We tend to enjoy their different perspective. And we especially enjoyed seeing this retraction in the Atlantic by an economist who learned about a form of bias we see often in the courtroom.

Essentially, in the first article, Buturovic and Klein said liberals were less knowledgeable on economic questions than were conservatives. And a firestorm erupted. The authors were either accused of rigging the results or thanked for validating a long-held belief about the inferiority of liberals.

They listened, and they learned. A new study (based on feedback they received on the original work) found a different pattern. For the second study, they added nine questions that balanced the survey so that items either challenged conservative beliefs or challenged liberal beliefs. This time, they found no difference between the two groups. Their retraction goes as follows:

“One year ago, we reported the results of a 2008 Zogby survey that purported to gauge economic enlightenment. [snip] We also found that self-identified Progressives and Liberals did much worse than Conservatives and Libertarians, and this finding generated a lot of controversy. Those results were based on eight questions [snip] that specifically challenged leftist positions and/or reassured conservative and/or libertarian positions, while none had a clear slant against conservatives and/or libertarians.

In a new survey, conducted in December 2010, we supplemented those eight questions with another nine new questions, all specifically challenging conservative and/or libertarian positions (and often reassuring leftist positions). [snip] However, the new test consisting of all 17 questions yielded results that vitiated prior evidence of the left being worse. Now, all groups do poorly, with each group tending to do relatively poorly on the questions challenging its positions.”

So, in other words, there is no evidence that liberals or conservatives are smarter when it comes to economics. The differences found were all about ‘myside bias’, more commonly referred to as confirmation bias. We want to have our preexisting beliefs validated. We see it pretty consistently with mock jurors.

This week we did a mock trial and divided the jurors into deliberation groups based on age, gender, education and income. On these characteristics, they were as evenly balanced as possible. And then we sat and watched the deliberations to see some jurors responding in favor of the defense and others passionately supporting the plaintiff. As we listened, it seemed to come down to a decision-making style difference. Some emotionally focused on the morality of the issues inherent in the story, while others pounded the evidence and facts and need for personal responsibility.

The point is that, like the economist Daniel Klein, we all have biases. And to revisit that sage, Paul Simon, “a man hears what he wants to hear and disregards the rest.” When challenged, we see and hear things through the lens of our values and personal experience. Ultimately, we interpret the meaning of what we see and what we hear in accordance with our own biases. It’s a frustrating experience for all involved. Most of us are neither as open nor as gracious in acknowledging our emotional and cognitive errors as is Daniel Klein. For that, he is to be commended. Now if he can just figure out how we can all block that bias in the first place…

Buturovic, Z., & Klein, D.B. (2010). Economic Enlightenment in Relation to College-going, Ideology, and Other Variables: A Zogby Survey of Americans. Econ Journal Watch.

Klein, D.B., & Buturovic, Z. (2011). Economic Enlightenment Revisited: New Results Again Find Little Relationship Between Education and Economic Enlightenment but Vitiate Prior Evidence of the Left Being Worse. Econ Journal Watch.



Last year we wrote an article on bias against atheists and how to mitigate those biases in court. It was a really interesting paper to research and write, as the vitriol in the bias against atheists is stunningly powerful and (seems to be) permanent.

This week we saw an article at Miller-McCune on a new research article regarding atheists and had to go take a look. What the researchers say is that we use religiosity as a signal for trustworthiness. If you have no religion, then you are deemed untrustworthy. And, as the researchers say, “trustworthiness is the most valued trait in other people”. This clearly does not bode well for general attitudes about atheists.

The researchers examined the relationship (in the public imagination) between atheism and perceptions of amorality. They did six separate experiments including one in which students read a brief vignette about a man:

“Richard is 31 years old. On his way to work one day, he accidentally backed his car into a parked van. Because pedestrians were watching, he got out of his car. He pretended to write down his insurance information. He then tucked the blank note into the van’s window before getting into his car and driving away. 

Later the same day, Richard found a wallet on the sidewalk. Nobody was looking, so he took all of the money out of the wallet. He then threw the wallet in a trash can.”

Then participants were asked if this (amoral) man was more likely to be: a Christian, a Muslim, a rapist, or an atheist. Research subjects chose atheist and rapist as most likely, and they chose atheist in numbers similar to rapist. If you are wondering how in the world anyone would attribute any of these things to the behavior, we sympathize with you. But the research is all about whether a form of amoral or immoral behavior is seen as consistent with antisocial behavior or with religious beliefs.

Other experiments included a workplace choice between a religious candidate and an atheist with totally matching credentials. The positions they were considering were a high-trust position (child care) and a low-trust position (waiter). Participants chose the religious candidate for the high-trust (childcare) position and the atheist for the low-trust (waiter) position.

In another study, ‘Richard’ suffered from some pretty gross and visible physical ailments. You guessed it: he was seen as more likely to be an atheist. The results are disturbingly consistent. We just don’t trust atheists.

The authors indicate these are the first studies to look at what exactly underlies anti-atheist prejudice. They found (perhaps not surprisingly) in 5 of the 6 studies that belief in God was a potent predictor of distrust of atheists. One of the hypotheses the authors identify is this:

“The perceived norms of atheists might simply be more threatening to religious individuals than those of other groups. This is likely because, although religious people might infer that ethnic out-group members or homosexuals hold norms that differ from their own, atheists might be seen as holding norms that are directly antithetical to their own. Alternatively, atheists may be distrusted because people are unsure exactly what atheists believe. A Christian, for example, might be able to infer some of a Muslim’s norms, but an atheist might be viewed as a wildcard: religious people might distrust atheists not only for the norms they are perceived to follow but also for their perceived lack of norms.”

In other words, the atheist is seen by the public as unpredictable and likely without moral standards. We just don’t know what they might do! Atheism is such an affront to what religious people believe that atheists tend to be dehumanized.

After we wrote our research article on anti-atheist prejudices, we got a number of heartfelt emails saying “thanks” for writing an article that brought to light what was previously a dirty secret. The writers of those emails were touchingly human and clearly not used to being treated as such in writing.

As a trial lawyer, if your client is an atheist, there are steps you need to take to protect them and minimize prejudice against them. We outline those in our article and hope you will educate yourself on the intensity of the anti-atheist bias in this country. It’s pretty astounding.

Gervais, W.M., Shariff, A.F., & Norenzayan, A. (2011). Do you believe in atheists? Distrust is central to anti-atheist prejudice. Journal of Personality and Social Psychology. PMID: 22059841