Archive for the ‘Decision-making’ Category
When litigation cases rely on science or highly technical information, it is critical to help jurors understand the information underlying the case at a level that makes sense to them. If they do not understand your “science”, they will simply guess which party to vote for or “follow the crowd”. Here’s an example of what happened when scientists “followed the crowd” to see what fields of science were seen as most precise (and therefore reliable).
You can see from the graphic illustrating this post that too many people are watching CSI shows on TV. When forensic science is more “certain” than nanotechnology or aerospace engineering, and even mechanical physics—we have a problem! The authors actually agree with us in this press release:
“The map shows that perceptions held by the public may not reflect the reality of scientific study,” Broomell said. “For example, psychology is perceived as the least precise while forensics is perceived as the most precise. However, forensics is plagued by many of the same uncertainties as psychology that involve predicting human behavior with limited evidence.”
Be that as it may, when these researchers from Carnegie Mellon set out to see which branches of science the public feels most certain about—that is what they found. It is part and parcel of the frustration we often see from our attorney-clients when what they have presented is not what our mock jurors have retained.
We’ve also talked about the newer findings of political polarization coloring reactions to almost everything. The researchers also found political differences in the evaluation of whole fields of science here (again quoting the press release from Carnegie Mellon).
While political affiliations are not the only factor motivating how science is perceived, the researchers did find that sciences that potentially conflict with a person’s ideology are judged as being more uncertain. “Our political atmosphere is changing. Alternative facts and contradicting narratives affect and heighten uncertainty. Nevertheless, we must continue scientific research. This means we must find a way to engage uncertainty in a way that speaks to the public’s concerns,” Broomell said.
In other words, people believe what they choose to believe and you can’t predict how they will engage or not engage. And finally, they also make this comment which we are hearing more and more in the mass media—essentially, the responses these participants gave have no apparent relation to facts.
However, our results also suggest that evaluations of specific research results by the general public (such as those produced by climate change, or the link between autism and vaccination) may not be strongly influenced by accurate information about the scientific research field that produced the results.
This is an area of lament from many who deal in data and facts. We are living in a post-expert world (and some would say post-truth and post-facts). So what are you to do?
From a litigation advocacy perspective, this study tells us how important it is to make scientific research relevant and common-sense (or even counter-intuitive) to your jurors. They need to understand it and have it make sense to them or be allowed to revel in the counter-intuitive nature of the findings.
Feeling comfortable with the “science” (whatever it may be) is a much better way to ensure consistency from your jurors than relying on the chart illustrating this post to predict how listeners will react to the particular science upon which your case relies.
Broomell, S., & Kane, P. (2017). Public perception and communication of scientific uncertainty. Journal of Experimental Psychology: General, 146(2), 286-304. DOI: 10.1037/xge0000260
Image taken from the article itself
When my kids were younger, I used to talk to them about the difference between intent and impact as they struggled to understand the varying reactions of people to their behavior. Back in 2009, we posted on some new research showing that we reacted more indignantly when bad deeds were done “on purpose”. Here is some of what we wrote then and you may want to visit that post in full as well:
This is an intriguing study because it speaks to the heart of telling the emotional story at trial. You want jurors to have an emotional response—a connection to your story, to your client. You want them to ‘want to’ find for your client, and see him or her as a worthy recipient of their support. What this research tells us is that if the pain inflicted on your client was ‘intentional’, jurors may have a stronger emotional response to it. [snip]
Your goal is to light the fire of moral indignation in the minds of the jurors. You want to answer both aspects of the common juror refrain “it may be legal but it sure isn’t right”. Show them it isn’t right. Show them it isn’t legal. Give them facts to buttress their feelings in deliberations.
It is research we often consider when we hear that common refrain from our mock jurors: “it may be legal but it sure isn’t right”. But that was eight years ago, and technology has since advanced to the point that we now have research suggesting a brain scan can distinguish whether someone was acting “knowingly” as opposed to “recklessly”.
We are grateful the researchers point out that their technique “represents a proof of concept, and not yet a usable tool”. Nevertheless, expect to hear this one coming to a courtroom before too long (much like the other neurolaw defenses we’ve covered here before).
Here’s what they did. The researchers used “neuroimaging and machine-learning techniques” (aka fMRI) to identify differing patterns of brain activity depending on whether the defendant “knew he was carrying drugs” or was “merely aware of a risk that he was”. They clarify that this question of “criminal intent” is what criminal juries must determine: in other words, was the defendant’s behavior “knowing” or was it “reckless”?
While there have been studies using fMRIs before this one, the authors say there are “no fMRI studies [snip] that have attempted to determine whether and how the ‘culpable mental states’ map onto differential activations in the human brain”. In other words, if you know you are behaving illegally, do different parts of your brain “light up” as compared to when you are aware you might be acting illegally but proceed recklessly?
So, fMRIs are expensive, but the researchers did 40 of them (20 men and 20 women). Half the participants were told they were carrying a suitcase containing contraband (the “knowing” condition) and half were told their suitcase might contain contraband (the “reckless” condition). After that introduction, into the fMRI machines they went.
Those in the knowing condition (who knew they were carrying contraband) were more likely to “light up” in the anterior insula (said by the authors to be involved in the assessment of risk and uncertainty) and the dorsomedial prefrontal cortex (said by the authors to be involved in assessing probabilities).
Those in the reckless condition were more likely to “light up” the occipital cortex (said by the authors to reflect higher uncertainty).
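For readers curious about what “machine-learning techniques” means here, the general recipe is simple even if the study’s actual pipeline is not: train a classifier on labeled activation patterns, then see whether it can correctly label scans it has never seen. Below is a toy sketch using entirely synthetic data and a nearest-centroid classifier; the feature-to-region mapping and the numbers are our invention for illustration, not the study’s.

```python
import random

random.seed(1)

FEATURES = 3  # toy "regions": 0 = anterior insula, 1 = dorsomedial PFC, 2 = occipital

def make_scan(condition):
    # Synthetic activation pattern: baseline noise plus a condition-
    # dependent bump. Purely illustrative, not the study's data.
    x = [random.gauss(0.0, 1.0) for _ in range(FEATURES)]
    if condition == "knowing":
        x[0] += 2.0
        x[1] += 2.0
    else:  # "reckless"
        x[2] += 2.0
    return x

def centroid(scans):
    # Average activation across scans, feature by feature.
    return [sum(s[i] for s in scans) / len(scans) for i in range(FEATURES)]

def nearest(scan, centroids):
    # Classify a scan by whichever class centroid it sits closest to.
    return min(centroids,
               key=lambda c: sum((a - b) ** 2 for a, b in zip(scan, centroids[c])))

# "Train" on 20 scans per condition (mirroring the study's 40 participants).
train = {c: [make_scan(c) for _ in range(20)] for c in ("knowing", "reckless")}
centroids = {c: centroid(scans) for c, scans in train.items()}

# Classify fresh synthetic scans the model has never seen.
test = [(c, make_scan(c)) for c in ("knowing", "reckless") for _ in range(25)]
accuracy = sum(nearest(scan, centroids) == label for label, scan in test) / len(test)
print(f"held-out accuracy: {accuracy:.2f}")
```

The real study had to contend with high-dimensional, noisy voxel data and cross-validation across only 40 people, which is part of why the authors call their technique a proof of concept rather than a usable tool.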
The researchers comment on the small sample size and other issues with their study that preclude generalizability. For this show of reticence and respect for statistical realities we are grateful. The reality is that, no matter what areas of the brain light up, we can’t know if that shows the difference between “knowing” and “reckless” or if it is simply a response to risk level. Not to mention, these were imagined behaviors and not real ones.
Our mock jurors have been very suspicious of neurolaw findings and whether you can “prove” they mean what the researchers say they mean. Neurolaw developments remain a very interesting but “not ready for prime time” area of research, or perhaps better said, “not ready for a Daubert challenge”. If you are interested in knowing more about neurolaw, here’s a review of the book Law and Neuroscience.
Vilares, I., Wesley, M.J., Ahn, W.Y., Bonnie, R.J., Hoffman, M., Jones, O.D., Morse, S.J., Yaffe, G., Lohrenz, T., & Montague, P.R. (2017). Predicting the knowledge-recklessness distinction in the human brain. Proceedings of the National Academy of Sciences, 114(12), 3222-3227. PMID: 28289225
Here’s another this-and-that post documenting things you need to know but that we don’t want to do a whole post about–so you get a plethora of factoids that will entertain your family and entrance your co-workers. Or at least be sort of fun to read and (probably) as awe-inspiring as the stack of vegetables and fruit illustrating the post.
Just don’t do it: How bringing up politics ruins your workplace
You probably know this already since many people say their Facebook feeds are a toxic combination of politics and rage these days. So. Bringing up politics at work is now officially a bad thing. We used to think that being exposed to varying ideas in the workplace broadened all our world views. But that was before this round of extreme political polarization and the strong feelings on both sides of the aisle. Here’s a survey from Wakefield Research and workplace consultants Betterworks that gives factual information on workplace conflict surrounding politics. While reading it won’t make you feel that much better, it will certainly tell you that your own workplace is not the only one so negatively charged (and give you some tips on dealing with employees obsessively checking social media).
Can you trick narcissists into actually feeling empathy?
Recent research says yes you can—simply by reminding them to take the other person’s perspective. In short, the researchers found that those high in narcissistic traits (but not meeting diagnostic criteria) were able to demonstrate perspective-taking but they had to be directed to do so. We have talked about this when it comes to implicit racial biases so the idea is not entirely new, but it is an interesting idea that narcissists would not even consider basic empathy (i.e., imagining the other person’s perspective) unless prompted to do so.
More on beards—this time in healthcare
Just like tattoos, we have covered beards a lot here and addressed issues related to beards like women’s preferences in long-term relationships, bearded men and sexism, extra punitiveness towards bearded men, bearded experts in East Texas, genetics and your bushy beard, and even identifying the elusive lumbersexual on your jury. There is so much debate and research about beards that we’ll give you that link again so you can catch up on all things beard in this blog. Mostly the only question never adequately addressed is “what is it about beards that mobilizes any sort of attitude at all?”
This particular controversy on beards has apparently been going on since the 1800s so it is a bit surprising we don’t have something on it already. Doctors. Should they have beards? Is it a hygiene issue? Should they be able to look older, wiser, and more knowledgeable than they may be chronologically by growing a beard? The Scientific American blog has an entry telling us (among other things) that “beards retained microorganisms and toxin despite washing with soap and water” and that bearded surgeons should “avoid wiggling the face mask” to prevent bacterial contamination during surgery. There are multiple other studies cited that come down on both sides of this hygiene debate. You will want to know about this one. Even though your life won’t be improved by the debate.
We’ve also blogged about earworms a number of times (hey—it’s an important topic!). Buzzfeed recently published a list of pop songs likely to get stuck in your head, which is, by definition, what an earworm is. As a public service, here is one of our top choices for “most likely to give you an earworm” pop song.
And now that you have that list of songs to give you earworms—here’s recent research giving you a “cure” for the earworm. Chew some gum! The researchers say when you are chewing gum your brain is unable to form the associations essential for the creation and maintenance of an earworm. Okay then. We can’t say if it’s true (and apparently it doesn’t work for everyone) but go buy some gum (it’s for science).
Throwing out advances in knowledge (is that what we want to do?)
We have lived in The Age of Reason (aka the Enlightenment) since emerging from the darkness and magical thinking of the Middle Ages. A new opinion piece from Daniel J. Levitin, an educator (published at the Daily Beast) asks us to consider whether we really want to live in an era where we avoid rational thought. It’s a brief and well-written piece that will give you talking points on why a return to the Middle Ages or even the 1950s is not a goal for which we should strive.
Beaman, C.P., Powell, K., & Rapley, E. (2015). Want to block earworms from conscious awareness? Buy gum! The Quarterly Journal of Experimental Psychology, 68(6), 1049-1057.
Hepper, E.G., Hart, C.M., & Sedikides, C. (2014). Moving Narcissus: Can Narcissists Be Empathic? Personality & Social Psychology Bulletin, 40(9), 1079-1091. PMID: 24878930
After we published that “molecular genetics overlap” post showing curiosity is found in smart people—one of our readers asked exactly how you “see” smart during voir dire. The question was posed on Twitter but the answer is not exactly expressed in 140 characters—so we’re doing it here. Among other things, we made these comments in that post:
All we need to do is look to see who is smart and we will then know we can select curious jurors (while considering whether our client’s case benefits from higher levels of intelligence and curiosity).
And, as we often say to our clients (especially in rural areas like the far east and west ends of Texas), “smart does not necessarily mean highly educated”. It is typically, however, a lot easier to see or hear “smart” than it is to see or hear “curious” (or open to experience). So it can be a voir dire shortcut (which can qualify as a secret weapon).
So. Here are some of the signs that a member of the venire might be smart and curious, regardless of their level of formal education:
Do they have less formal education than expected for the job they hold?
Do they have a creative occupation that requires quick associations or problem-solving?
Are they a researcher (of any sort) or scientist?
Do they write professionally (books, magazines, blogs, et cetera, in both fiction and nonfiction)?
Are they a long-time manager in a fast-paced workplace setting?
Do they work in a fast-changing field?
Are they a technology worker familiar with NDAs and confidentiality?
Do they have hobbies that involve curiosity, problem-solving, or thinking?
Do they think of themselves as questioning authority or as going along with the group consensus?
Does the juror have a history of self-employment and an above-average income?
It is always important to consider whether any particular trait or attitude reflects more or less receptivity to your trial story. In our experience, most attitudes and traits don’t make much difference, but the few that do can have a profound effect on their view of evidence and argument, and their predisposition toward one verdict or another.
There are certainly other characteristics we look for to “see” or “hear” smart, but this is a brief list to get you started. Each case will result in variations on what we look for in a “smart” juror for that specific case (just as there are many kinds of intelligence).
We’ve written a number of times about bias against Muslims. But here’s a nice article with an easy to incorporate finding on how to reduce bias against your female client who wears a Muslim head-covering. (In case you have forgotten, we’ve already written about head-coverings for the Muslim man.)
The graphic illustrating this post shows the variety of head-coverings Muslim women might wear and the initial findings (as to which head covering style results in the most bias) will probably not surprise you. Researchers did four studies to see how people reacted to Muslim women wearing veils. They consistently found these reactions:
Responses were more negative when the Muslim woman wore a veil of any kind compared to no veil at all.
When the various veils were compared, the niqab or burqa (where only the eyes are exposed or even the eyes are covered) were seen most negatively.
Today’s research goes beyond bias caused by face veils and looks at whether observers are able to detect deception in witnesses wearing veils (as compared to those not wearing veils). The researchers cite three fairly recent (post-2000) cases resulting in judges in the USA, the UK and Canada ruling witnesses cannot wear the niqab when testifying, in part, say the researchers, because they believed it necessary to see a person’s face to detect deception.
The researchers decided to test that assumption by comparing the ability to detect deception when a testifying witness wore a face-covering veil versus when the witness did not. They ran a study in Canada with 232 participants and then a second study with participants from Canada, the UK, and the Netherlands (291 participants in total) and came to a perhaps surprising conclusion. While detection of deception in unveiled witnesses was no better than chance, the same was not true for witnesses who wore veils.
“Observers were more accurate in detecting deception in witnesses who wore niqabs or hijab than those who did not veil.”
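How does a researcher decide that deception detection is “better than chance”? The standard move is to compare observed hit counts against the 50% chance baseline, for example with a binomial test. Here is a quick sketch; the counts below are hypothetical and purely for illustration, since the paper reports its own figures.

```python
from math import comb

def binom_p_two_sided(hits, n, p=0.5):
    """Exact two-sided binomial p-value: sum the probabilities of all
    outcomes at least as unlikely as the observed hit count."""
    def pmf(k):
        return comb(n, k) * p**k * (1 - p)**(n - k)
    observed = pmf(hits)
    # Small relative tolerance guards against floating-point ties.
    return sum(pmf(k) for k in range(n + 1) if pmf(k) <= observed * (1 + 1e-9))

# Hypothetical counts (NOT the paper's data): 120 of 232 correct
# judgments of unveiled witnesses vs 140 of 232 for veiled witnesses.
p_unveiled = binom_p_two_sided(120, 232)  # near the 116/232 chance level
p_veiled = binom_p_two_sided(140, 232)    # well above chance

print(f"unveiled: p = {p_unveiled:.3f}")  # large p: consistent with chance
print(f"veiled:   p = {p_veiled:.4f}")    # small p: better than chance
```

The same comparison is usually reported alongside an effect size, since with enough observers even a trivially small edge over 50% will reach statistical significance.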
The researchers say that (contrary to the assumptions underlying court decisions in three countries) a witness’s veil did not hamper lie detection but actually improved it. Why? They offer several hypotheses:
Researchers think participants in the “veiled” condition may have interpreted “eye gaze information” more accurately.
Participants had less visual information to attend to and thus were more likely to base their decisions on verbal than non-verbal information.
In short, the researchers think their participants were forced by the situation to rely more on verbal behavior and to focus their attention on the eyes of the witness in the veiled condition. This is consistent with the research we’ve covered in our multiple posts on deception detection. That research mentions many things that aid deception detection: narrowing your focus from multiple cues to just a few (or even one), examining eyebrows, having certain personality characteristics of your own, how much the witness uses profanity, and even how long it has been since the witness used a bathroom. And then there are all the things jurors often believe point to deception that truly do not help them identify who is a truth-teller and who is a liar.
In this research, the participants could examine eyebrows in the veiled condition, and their focus was certainly narrowed so they were less likely to be distracted by irrelevancies; that alone likely improved their ability to detect deception. This is an interesting study: it tells us that the reliance on non-verbal indicators we commonly see among mock jurors, and even the court rulings since 2000, are outdated when it comes to detecting deception in a witness. As the researchers say in their article title, less is actually more when it comes to detecting deception.
We made some recommendations to reduce bias against your veil-wearing client back in 2014 and we would still make those recommendations today.
Here they are:
The researchers say that for the least bias, if a religious Muslim woman wants to wear a head-covering, the hijab is likely the best choice. That may, however, not be an option given her religious beliefs.
In either case, this research would say to give jurors information about your client’s choice to wear a Muslim head-covering (of any style) and it will reduce negative assumptions.
The very process of sharing the reasons for wearing a head-covering with jurors gives them the opportunity for emotional connection with your client. Her sharing reasons for the head-covering allows them to ‘see’ her individuality and religious conviction.
We’d call that both making your client more similar to the jurors (through the use of universal values) and giving jurors an opportunity to see “beneath the head-covering” to the woman herself.
Leach, A.M., Ammar, N., England, D.N., Remigio, L.M., Kleinberg, B., & Verschuere, B.J. (2016). Less is more? Detecting lies in veiled witnesses. Law and Human Behavior, 40(4), 401-410. PMID: 27348716