Archive for the ‘Beliefs & values’ Category


Which science is most “certain” according to the American public?

When litigation cases rely on science or highly technical information, it is critical to help jurors understand the information underlying the case at a level that makes sense to them. If they do not understand your “science”, they will simply guess which party to vote for or “follow the crowd”. Here’s an example of what happened when researchers went to the crowd to see which fields of science were seen as most precise (and therefore most reliable).

You can see from the graphic illustrating this post that too many people are watching CSI shows on TV. When forensic science is more “certain” than nanotechnology or aerospace engineering, and even mechanical physics—we have a problem! The authors actually agree with us in this press release:

“The map shows that perceptions held by the public may not reflect the reality of scientific study,” Broomell said. “For example, psychology is perceived as the least precise while forensics is perceived as the most precise. However, forensics is plagued by many of the same uncertainties as psychology that involve predicting human behavior with limited evidence.”

Be that as it may, when these researchers from Carnegie Mellon set out to see which branches of science the public feels most certain about, that is what they found. It mirrors the frustration we often see from our attorney-clients when what they have presented is not what our mock jurors have retained.

We’ve also talked about the newer findings on political polarization coloring reactions to almost everything. The researchers here also found political differences in how whole fields of science are evaluated (again quoting the press release from Carnegie Mellon).

While political affiliations are not the only factor motivating how science is perceived, the researchers did find that sciences that potentially conflict with a person’s ideology are judged as being more uncertain. “Our political atmosphere is changing. Alternative facts and contradicting narratives affect and heighten uncertainty. Nevertheless, we must continue scientific research. This means we must find a way to engage uncertainty in a way that speaks to the public’s concerns,” Broomell said.

In other words, people believe what they choose to believe, and you can’t predict how they will engage or not engage. And finally, the researchers make a comment we are hearing more and more in the mass media: essentially, that the responses these participants gave had no apparent relation to the facts.

However, our results also suggest that evaluations of specific research results by the general public (such as those produced by climate change, or the link between autism and vaccination) may not be strongly influenced by accurate information about the scientific research field that produced the results.

This is an area of lament from many who deal in data and facts. We are living in a post-expert world (and some would say post-truth and post-facts). So what are you to do?

From a litigation advocacy perspective, this study tells us how important it is to make scientific research relevant and common-sense (or even counter-intuitive) to your jurors. They need to understand it and have it make sense to them or be allowed to revel in the counter-intuitive nature of the findings.

Helping jurors feel comfortable with the “science” (whatever it may be) is a much better way to ensure consistency than relying on the chart illustrating this post to predict how listeners will react to the particular science upon which your case relies.

Broomell, S., & Kane, P. (2017). Public perception and communication of scientific uncertainty. Journal of Experimental Psychology: General, 146(2), 286-304. DOI: 10.1037/xge0000260

Image taken from the article itself


Don’t do this at work, beards, earworms, narcissists, & discarding advances in knowledge

Here’s another this-and-that post documenting things you need to know but that we don’t want to devote a whole post to, so you get a plethora of factoids that will entertain your family and entrance your co-workers. Or at least be sort of fun to read and (probably) as awe-inspiring as the stack of vegetables and fruit illustrating the post.

Just don’t do it: How bringing up politics ruins your workplace

You probably know this already, since many people say their Facebook feeds are a toxic combination of politics and rage these days. So. Bringing up politics at work is now officially a bad thing. We used to think that being exposed to varying ideas in the workplace broadened all our world views. But that was before this round of extreme political polarization and the strong feelings on both sides of the aisle. Here’s a survey from Wakefield Research and workplace consultants Betterworks that gives factual information on workplace conflict surrounding politics. While reading it won’t make you feel that much better, it will certainly tell you that your own workplace is not the only one so negatively charged (and give you some tips on dealing with employees obsessively checking social media).

Can you trick narcissists into actually feeling empathy?

Recent research says yes you can, simply by reminding them to take the other person’s perspective. In short, the researchers found that those high in narcissistic traits (but not meeting diagnostic criteria) were able to demonstrate perspective-taking, but only when directed to do so. We have talked about this when it comes to implicit racial biases, so the idea is not entirely new, but it is interesting that narcissists would not even consider basic empathy (i.e., imagining the other person’s perspective) unless prompted to do so.

More on beards—this time in healthcare

Just as with tattoos, we have covered beards a lot here and addressed issues related to beards like women’s preferences in long-term relationships, bearded men and sexism, extra punitiveness towards bearded men, bearded experts in East Texas, genetics and your bushy beard, and even identifying the elusive lumbersexual on your jury. There is so much debate and research about beards that we’ll give you that link again so you can catch up on all things beard in this blog. The only question never adequately addressed is “what is it about beards that mobilizes any sort of attitude at all?”

This particular controversy on beards has apparently been going on since the 1800s, so it is a bit surprising we don’t have something on it already. Doctors. Should they have beards? Is it a hygiene issue? Should they be able to look older, wiser, and more knowledgeable than they may be chronologically by growing a beard? The Scientific American blog has an entry telling us (among other things) that “beards retained microorganisms and toxin despite washing with soap and water” and that bearded surgeons should “avoid wiggling the face mask” to prevent bacterial contamination during surgery. There are multiple other studies cited that come down on both sides of this hygiene debate. You will want to know about this one, even though your life won’t be improved by the debate.

Earworms—they’re back!!!!

We’ve also blogged about earworms a number of times (hey, it’s an important topic!). Buzzfeed recently published a list of pop songs likely to get stuck in your head, which is, by definition, what an earworm is. As a public service, here is one of our top choices for “most likely to give you an earworm” pop song.

And now that you have that list of songs to give you earworms, here’s recent research offering a “cure” for the earworm. Chew some gum! The researchers say that when you are chewing gum, your brain is unable to form the associations essential for the creation and maintenance of an earworm. Okay then. We can’t say if it’s true (and apparently it doesn’t work for everyone), but go buy some gum (it’s for science).

Throwing out advances in knowledge (is that what we want to do?)

We have lived in The Age of Reason (aka the Enlightenment) since emerging from the darkness and magical thinking of the Middle Ages. A new opinion piece (published at the Daily Beast) from Daniel J. Levitin, an educator, asks us to consider whether we really want to live in an era where we avoid rational thought. It’s a brief and well-written piece that will give you talking points on why a return to the Middle Ages or even the 1950s is not a goal for which we should strive.

Beaman, C. P., Powell, K., & Rapley, E. (2015). Want to block earworms from conscious awareness? Buy gum! The Quarterly Journal of Experimental Psychology, 68(6), 1049-1057.

Hepper, E. G., Hart, C. M., & Sedikides, C. (2014). Moving Narcissus: Can narcissists be empathic? Personality and Social Psychology Bulletin, 40(9), 1079-1091. PMID: 24878930



Identifying deception when the witness wears a face-covering veil

In 2014, we wrote about research investigating how people felt when a witness wore a veil, such as some forms of the hijab or the niqab. Here are some of the findings we described in that research.

We’ve written a number of times about bias against Muslims. But here’s a nice article with an easy-to-incorporate finding on how to reduce bias against your female client who wears a Muslim head-covering. (In case you have forgotten, we’ve already written about head-coverings for the Muslim man.)

The graphic illustrating this post shows the variety of head-coverings Muslim women might wear, and the initial findings (as to which head-covering style results in the most bias) will probably not surprise you. Researchers did four studies to see how people reacted to Muslim women wearing veils. They consistently found these reactions:

Responses were more negative when the Muslim woman wore a veil of any kind compared to no veil at all.

When the various veils were compared, the niqab or burqa (where only the eyes are exposed, or even the eyes are covered) was seen most negatively.

Today’s research goes beyond bias caused by face veils and looks at whether observers are able to detect deception in witnesses wearing veils (as compared to those not wearing veils). The researchers cite three fairly recent (post-2000) cases in which judges in the USA, the UK, and Canada ruled that witnesses cannot wear the niqab when testifying, in part, say the researchers, because the judges believed it necessary to see a person’s face to detect deception.

The researchers decided to test that assumption by comparing the ability to detect deception when a testifying witness wore a face-covering veil versus when the witness did not. They ran a study in Canada with 232 participants, and then a second study with participants from Canada, the UK, and the Netherlands (291 participants in total), and came to a perhaps surprising conclusion. While the detection of deception in unveiled witnesses was no better than chance, the same was not true for witnesses who wore veils.

“Observers were more accurate in detecting deception in witnesses who wore niqabs or hijab than those who did not veil.”

The researchers say that (contrary to the assumptions underlying court decisions in three countries) a witness wearing a veil did not hamper lie detection but rather improved it. Why? They offer several hypotheses:

Researchers think participants in the “veiled” condition may have interpreted “eye gaze information” more accurately.

Participants had less visual information to attend to and thus were more likely to base their decisions on verbal than non-verbal information.

In short, the researchers think their participants were forced by the situation to rely more on verbal behavior and to focus their attention on the eyes of the witness in the veiled condition. This is actually consistent with the findings we’ve covered in our multiple posts on deception detection. That research mentions many things as aiding deception detection: narrowing your focus from multiple cues to just a few or even one cue, examining eyebrows, having certain personality characteristics of your own, how much the witness uses profanity, even how long it has been since the witness used a bathroom, and much more. And then there are all of the things jurors often believe point to deception that truly do not help them identify who is a truth-teller and who is a liar.

In this research, the participants could examine eyebrows in the veiled condition, and their focus was certainly narrowed so they were less likely to be distracted by irrelevancies; that alone likely improved their ability to detect deception. This is an interesting study: it tells us that both the common reliance we see among mock jurors on non-verbal indicators of deception and the court rulings since 2000 are outdated when it comes to jurors’ ability to detect deception in a witness. As the researchers say in their article title, less is actually more when it comes to detecting deception.
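If it helps to make “no better than chance” concrete, here is a minimal sketch in Python (with made-up counts for illustration only, not data from Leach et al.) of how an observer’s accuracy can be compared to the 50/50 baseline you would expect from pure guessing between “truth” and “lie”:

```python
from math import comb

def p_at_least(k: int, n: int, p: float = 0.5) -> float:
    """One-sided probability of getting k or more judgments correct out of n
    if the observer is only guessing (success probability p per judgment)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical counts for illustration only -- NOT data from Leach et al. (2016).
n_judgments = 200
unveiled_correct = 104   # ~52% accuracy: indistinguishable from guessing
veiled_correct = 122     # ~61% accuracy: very unlikely under pure guessing

print(f"Unveiled condition: p = {p_at_least(unveiled_correct, n_judgments):.3f}")  # roughly 0.31
print(f"Veiled condition:   p = {p_at_least(veiled_correct, n_judgments):.3f}")    # roughly 0.001
```

A tiny p-value like the one in the hypothetical veiled condition is the kind of result that lets researchers say accuracy exceeded chance, while an accuracy rate like the unveiled one cannot be distinguished from guessing.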

We made some recommendations to reduce bias against your veil-wearing client back in 2014 and we would still make those recommendations today.

Here they are:

The researchers say that for the least bias, if a religious Muslim woman wants to wear a head-covering, the hijab is likely the best choice. That may, however, not be an option given her religious beliefs.

In either case, this research suggests giving jurors information about your client’s choice to wear a Muslim head-covering (of any style); doing so will reduce negative assumptions.

The very process of sharing the reasons for wearing a head-covering with jurors gives them the opportunity for an emotional connection with your client. Her sharing her reasons for the head-covering allows them to ‘see’ her individuality and religious conviction.

We’d call that both making your client more similar to the jurors (through the use of universal values) and giving jurors an opportunity to see “beneath the head-covering” to the woman herself.

Leach, A. M., Ammar, N., England, D. N., Remigio, L. M., Kleinberg, B., & Verschuere, B. J. (2016). Less is more? Detecting lies in veiled witnesses. Law and Human Behavior, 40(4), 401-410. PMID: 27348716



Facts [still] don’t matter: the 2017 edition

When we began this blog in 2009, one of the first posts we wrote was about the reality that facts don’t matter. We wrote again about this reality back in 2011. And we’ve written about it several times since then, so…here we go again!

In this new era of fake news and fake news allegations, we’ve seen a surge in the number of “fact checkers” employed by the media to verify the accuracy of statements made by people in this country’s leadership. Some think the publicizing of fact checking can be effective against the spread of misinformation. New research (conducted during the 2016 Presidential election) tells us (yet again) that while fact checking is certainly of value, its impact depends on whether your intended audience is listening.

That is, while fact checking helped study participants understand what was true and not true, that knowledge made no difference in their voting behavior.

While that disturbing reality sinks in, here’s a brief summary of the research, which was published in Royal Society Open Science and concentrated on statements (both accurate and inaccurate) made by candidate Trump during the Republican primary campaign of 2016. The researchers conducted their research online with 2,023 participants. As part of the study, participants were presented with four inaccurate statements and four accurate statements made by candidate Trump (you can see the list of statements in the article itself, but they include misstatements on the unemployment rate and the relationship between vaccines and autism). Sometimes the statements were attributed to Trump and other times they were not attributed to any of the candidates. Then, inaccurate statements were corrected using non-partisan sources such as the Bureau of Labor Statistics. So far so good. When the researchers corrected the false statements, belief in those statements fell across the board.

That is, belief in the Trump falsehoods fell for Trump-supporting Republicans, Republicans favoring other candidates, and for Democrats.

However, the researchers went further and examined whom the supporters intended to vote for, and the correction of misinformation (and participants’ reported awareness that the statements were inaccurate) made no difference in whom the Republican participants planned to vote for. The only participants less likely to vote for Trump were the Democrats (who had not planned to vote for him anyway).

The researchers conclude that while fact-checking can change people’s beliefs, the strength of their partisanship affects how much that change carries over to voting intention. And, perhaps not surprisingly, the researchers wonder just what would have to happen to change voting intention in the face of strong partisan beliefs. They suggest that people “use political figures as a heuristic to guide evaluation of what is true or false, yet do not necessarily insist on veracity as a prerequisite for supporting political candidates”.

If you don’t think that makes sense, you are not alone (we don’t think it makes much sense either). For years, we have believed (and seen it borne out time after time) that political affiliation is not a difference that makes a difference when it comes to decision-making on litigation cases. Yet, we are seeing increasing amounts of research telling us the USA is so split along partisan lines that perhaps, at least right now, it is a difference that makes a difference. We still have not seen it in our work but you can bet we are watching it closely in ongoing pretrial research. Stay tuned.

Swire, B., Berinsky, A., Lewandowsky, S., & Ecker, U. (2017). Processing political misinformation: Comprehending the Trump phenomenon. Royal Society Open Science, 4(3). DOI: 10.1098/rsos.160802



A secret weapon for voir dire: Smart people are more curious

Back in October of 2016, we wrote about a paper by the Cultural Cognition Project on assessing “scientific curiosity”. Here is some of what we said then about what Kahan and his colleagues found:

“What they found was that participants who scored higher on the curiosity scale were more likely to choose the story that would disconfirm their preexisting beliefs (that is, it would surprise them) and the participants enjoyed that process of surprise.”

We concluded that 2016 post this way:

From a litigation advocacy perspective, the challenge is to identify  jurors who are curious and enjoy the surprise of learning new things—even when the new information may be in conflict with pre-existing beliefs. This is a subgroup for which we have an increased chance of persuading them to accept change (typically a very difficult task). What we have to do is figure out how to surprise them and we have several blog posts on what happens to our brains when we experience surprise.

So with that backdrop as a reminder, today we bring you a study that is pretty far afield of our usual focus on social science findings with relevance to litigation advocacy. This is a scientific study on genetics that found something unexpected: a personality trait related to overall intelligence that is actually embedded in the genes. The researchers refer to it as “a molecular genetic overlap” between intellectual ability and curiosity.

You likely know we’ve written a number of times about curiosity and when we like to see that trait in our jurors. The issue is always how to measure curiosity. While Kahan and his colleagues at the Cultural Cognition Project traversed a lengthy route to assessing scientific curiosity, we may not actually have to go to all that trouble.

First, let’s discuss some research vocabulary. What we refer to as curiosity is referred to by researchers as something different. They use the term “openness to experience” (renaming familiar things to make them sound exotic is a proven strategy for getting academic tenure). There are multiple ‘scales’ to measure openness to experience (here’s an example) but typically, they are not appropriate for use in court. Neither, unfortunately, is the Kahan version of a science curiosity scale. Here, however, is an intriguing finding from some molecular geneticists. Scintillating and yet mind-numbing, all in the interest of our blog readers!

Today’s research article:

This molecular genetics research is based on work from a project known as the Cognitive Genomics Consortium (COGENT) and this particular paper was written by a team of more than 60 international researchers who examined the “genes of 35,000 people – measuring the brain function of these participants through tests of learning, memory, and other cognitive function components”. But that is all backdrop so you know just how credible this finding is for us.

“Interestingly, and for the first time, the COGENT researchers also discovered a molecular genetic overlap between cognitive ability and personality. They found that genetic predispositions towards higher cognitive ability were linked to greater “openness to experience.” In order [sic] words, some of the genes that make people more likely to be curious about new ideas and experiences are the same as those that enhance cognitive ability.”
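As an aside for the curious, an overlap like this is typically summarized as a genetic correlation between the two traits. Here is a rough sketch of the standard definition (our gloss, not the paper’s own notation):

$$
r_g = \frac{\operatorname{cov}\!\left(g_{\text{cognitive ability}},\, g_{\text{openness}}\right)}{\sqrt{\operatorname{var}\!\left(g_{\text{cognitive ability}}\right)\,\operatorname{var}\!\left(g_{\text{openness}}\right)}}
$$

where g is the additive genetic component of each trait, so a positive r_g means some of the same genetic variants nudge both cognitive ability and openness to experience in the same direction.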

The researchers see this finding (based on a sample of 35,000 people) as instructive for research and treatment for disorders like schizophrenia, autism, and ADHD—and they have plans to expand their study to include more than 100,000 DNA samples. Fortunately, our intentions are less lofty.

We see this as a secret weapon for voir dire. 

How so? If we know that cognitive ability and curiosity go hand in hand, hard-wired into our genes (in this “molecular genetic overlap”), we can use that information to make the quick decisions often required in voir dire and jury selection. We do not need to assess “openness to experience” (or “science curiosity” or even curiosity). All we need to do is look to see who is smart, and we will then know we have curious jurors (although, in some cases, we will prefer jurors who are not so smart and therefore not so curious).

And, as we often say to our clients (especially in rural areas like the far east and west ends of Texas), “smart does not necessarily mean highly educated”. It is typically, however, a lot easier to see or hear “smart” than it is to see or hear “curious” (or open to experience). So it is a voir dire shortcut (which can qualify as a secret weapon).

We would also add a caveat. There is a difference between those who are merely curious but do not enjoy the analytical process, and those who are both curious and enjoy thinking and analyzing. We think of this distinction as the difference between jurors who are “high complexity” and “low complexity”.

If your case is very complex, you will want high complexity jurors (who will almost always be curious but also enjoy the process of thinking and analyzing).

If your case is not that complex, or the complexity of the fact pattern works against your case, you will want low complexity jurors who rely more on biases and heuristics (their pre-existing belief systems) to make decisions about unfamiliar subject matter.

We don’t really recommend you go and read this article since we don’t understand much of it and doubt you will either unless you happen to be a molecular geneticist. In this case, we encourage you to trust the interpretation linked to above (which we verified in a few different places like here and here and here). This is a new sort of finding and they have their excitement about it and we have ours. Of course, if you are curious, you can try to understand it!

Trampush, J., Yang, M., Yu, J., Knowles, E., Davies, G., Liewald, D., Starr, J., Djurovic, S., Melle, I., Sundet, K., Christoforou, A., Reinvang, I., DeRosse, P., Lundervold, A., Steen, V., Espeseth, T., Räikkönen, K., Widen, E., Palotie, A., Eriksson, J., Giegling, I., Konte, B., Roussos, P., Giakoumaki, S., Burdick, K., Payton, A., Ollier, W., Horan, M., Chiba-Falek, O., Attix, D., Need, A., Cirulli, E., Voineskos, A., Stefanis, N., Avramopoulos, D., Hatzimanolis, A., Arking, D., Smyrnis, N., Bilder, R., Freimer, N., Cannon, T., London, E., Poldrack, R., Sabb, F., Congdon, E., Conley, E., Scult, M., Dickinson, D., Straub, R., Donohoe, G., Morris, D., Corvin, A., Gill, M., Hariri, A., Weinberger, D., Pendleton, N., Bitsios, P., Rujescu, D., Lahti, J., Le Hellard, S., Keller, M., Andreassen, O., Deary, I., Glahn, D., Malhotra, A., & Lencz, T. (2017). GWAS meta-analysis reveals novel loci and genetic correlates for general cognitive function: a report from the COGENT consortium Molecular Psychiatry, 22 (3), 336-345 DOI: 10.1038/mp.2016.244

Full text available here: http://www.nature.com/mp/journal/vaop/ncurrent/full/mp2016244a.html.

