
Cognitive Biases: A pictorial primer 

Wednesday, October 21, 2015
posted by Douglas Keene

It's pretty amazing really. We're aware of the various cognitive biases that come up as we go through our days, and we see them spouted in all sorts of arguments and disagreements. But they can be hard to remember sometimes.

You may have seen the Wikipedia page devoted to cognitive biases, but here's something novel: a pictorial representation of 20 common cognitive biases that you can print on a single 8.5×11 sheet of paper. And it's published in an unexpected place: the Business Insider website. It is presented in full below, but if you have trouble reading it at this size, go to the page linked above, print it out for routine use, and avoid, as Business Insider puts it, "screwing up your decisions".

20 cognitive biases

I am morally superior to others and also less biased than everyone…

While you may think you have heard this line recently, this is really (based on new research) what most of us think about ourselves. It is called the “better than average effect” and it is very persistent. We might smirk at politicians who actually say things like this aloud, but that’s only because we tend to keep those thoughts to ourselves. We (persistently) view ourselves as just better than others, and of course, two new research studies underscore this point.

The first study (Tappin & McKay) recruited 270 adults and asked them to judge 30 traits representing agency (e.g., hard-working, knowledgeable, competent), sociability (e.g., cooperative, easy-going, warm) and moral character (e.g., honest, fair, principled). For each trait, participants indicated how desirable it was, how well it described the average person, and how well it described themselves.

While the agency and sociability traits were rated variably, almost all the participants rated themselves much higher on moral character than they rated the average person.
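To make that comparison concrete, here is a minimal sketch of how a "better than average" score can be computed from ratings like these. This is our illustration, not the authors' code: the trait lists come from the examples above, the numbers are invented, and the published scoring is more involved (it also makes use of the desirability ratings).

```python
# Minimal sketch of a "better-than-average" score: for each trait,
# subtract the rating given to the average person from the self-rating.
# A positive score means the participant sees themselves as above average.

TRAITS = {
    "agency":      ["hard-working", "knowledgeable", "competent"],
    "sociability": ["cooperative", "easy-going", "warm"],
    "morality":    ["honest", "fair", "principled"],
}

def self_enhancement(self_ratings, average_person_ratings):
    """Mean (self minus average person) difference within each trait domain."""
    scores = {}
    for domain, traits in TRAITS.items():
        diffs = [self_ratings[t] - average_person_ratings[t] for t in traits]
        scores[domain] = sum(diffs) / len(diffs)
    return scores

# One hypothetical participant on a 1-7 scale: modest about agency and
# sociability, strongly self-enhancing on moral character (the pattern
# the study reports for most people).
self_ratings = {"hard-working": 5, "knowledgeable": 4, "competent": 5,
                "cooperative": 5, "easy-going": 4, "warm": 5,
                "honest": 7, "fair": 7, "principled": 6}
average_ratings = {t: 4 for traits in TRAITS.values() for t in traits}

print(self_enhancement(self_ratings, average_ratings))
# {'agency': 0.67, 'sociability': 0.67, 'morality': 2.67} (rounded)
```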

In an intriguing secondary finding, while feelings of superiority on the agency traits tracked participants' overall self-esteem, the sense of moral superiority was only weakly related to self-esteem.

In the second study (Howell & Ratliff), researchers used data from the Project Implicit website where people take various psychological tests that measure unconscious or implicit biases. They focused on people who took tests involving weight biases (these are tests that ask how much you—and the average person—prefer thin people to fat people).

Once again, participants rated themselves as less biased against fat people than the average person, and when given feedback that they were indeed biased against fat people, they responded defensively. The more unbiased they had rated themselves, the more defensive they were about the feedback. When asked whether they thought the test was valid, they (unsurprisingly) said it was not, since it contradicted their self-assessments.

The problem with this belief that we are better than others, both in our sense of moral superiority and in our conviction that we are less biased than other people (a conviction apparently all of us share), is that it stops us from honestly assessing ourselves. We are therefore prevented from taking action to combat our own prejudices and biases (since we don't think, or won't admit, that we have them). Typically, when we hear information about people who are biased or less good than we are, we presume the speaker is talking about "those other people" and tune out.

From a litigation advocacy perspective, these studies have important implications for witness preparation, case narrative, and voir dire. We have discussed the importance of knowing when to raise juror awareness of their own biases and when to stay silent on this blog before. We’ve also posted before on when “playing the race card” works and when it doesn’t work.

This research underscores the importance of using that previously published guidance to direct your decisions about witness preparation, voir dire, and case narrative in your specific case. It will also be important to share "redeeming" information about your client's involvement in positive activities and about the ways your client's life reflects values jurors hold universally (family, community, education, volunteerism, et cetera).

Tappin, B., & McKay, R. (2016). The illusion of moral superiority. Social Psychological and Personality Science. DOI: 10.1177/1948550616673878

Howell, J. L., & Ratliff, K. A. (2016). Not your average bigot: The better-than-average effect and defensive responding to Implicit Association Test feedback. British Journal of Social Psychology. PMID: 27709628

Choosing your jurors: On bias, curiosity and wisdom

Earlier this week, we wrote on the question of whether those who score higher on the Need for Cognition Scale are just lazy (and the answer was no, not really). If you read this blog regularly, you know that bias is where we work and focus. We also like a curious juror (sometimes), and today we look at how curiosity can counter bias by helping jurors make wiser decisions informed by new data.

You may know the authors of this paper from their work at the Cultural Cognition Project (a collaboration among filmmakers, philosophers and psychologists) and the Cultural Cognition blog, both housed at Yale Law School. We also want to be sure you know the author of the plain-language interpretation of this paper: Tom Stafford, who runs the MindHacks blog on neuroscience and psychology. Stafford wrote an article based on this paper for BBC Future that is user-friendly and easy to understand for anyone deciding whether to dive into the full academic article. He introduces the Cultural Cognition group's paper with these disheartening sentences:

…people with the most education, highest mathematical abilities, and the strongest tendencies to be reflective about their beliefs are the most likely to resist information which should contradict their prejudices. This undermines the simplistic assumption that prejudices are the result of too much gut instinct and not enough deep thought. Rather, people who have the facility for deeper thought about an issue can use those cognitive powers to justify what they already believe and find reasons to dismiss apparently contrary evidence.

He sets up the Kahan et al. academic article as containing a possible answer to this maddening reality (and thus piques your curiosity to read the full paper). Or at least it piqued our curiosity. The researchers wanted to see whether the growing political-ideology divide would predict reactions to science information. So they devised a measure of how much scientific knowledge individual participants had, and then checked whether political ideology (conservative versus liberal) would be more important than pre-existing science knowledge when it came to hot-button issues like global warming and fracking.

And it was: the most scientifically informed liberals judged issues like global warming and fracking as dangerous to people, while the most scientifically informed conservatives thought there were fewer risks.

In other words, political ideology was more important than pre-existing science knowledge and education when it came to views toward polarizing topics such as global warming or fracking.
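Put in modeling terms, the claim is an interaction: the ideology gap in perceived risk widens as science knowledge increases, instead of shrinking. Here is a minimal sketch of how one might test that. It is our illustration, not the authors' code; the file name and variable names (survey.csv, risk_perception, science_knowledge, conservative) are invented.

```python
# Hypothetical sketch: does political ideology moderate the effect of
# science knowledge on perceived risk? The key term is the interaction.
import pandas as pd
import statsmodels.formula.api as smf

# Invented file with columns: risk_perception (0-7 scale),
# science_knowledge (z-scored), conservative (0 = liberal, 1 = conservative).
df = pd.read_csv("survey.csv")

model = smf.ols("risk_perception ~ science_knowledge * conservative",
                data=df).fit()
print(model.summary())

# Polarization that grows with knowledge shows up as a significant
# science_knowledge:conservative coefficient: among conservatives, more
# science knowledge predicts lower perceived risk, so the partisan gap
# widens with knowledge rather than narrowing.
```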

These researchers, though, had also devised a second measure, this one assessing science curiosity, and the way they structured it was very creative. They disguised the measure as a general social-marketing survey in which participants identified their interests across a wide variety of topics: sports, finance, politics, popular entertainment, and so on. Ultimately, they ended up with a 12-item Science Curiosity Scale. They also let participants choose whether they preferred to read a science story that would confirm their beliefs or one that would surprise them.

What they found was that participants who scored higher on the curiosity scale were more likely to choose the story that would disconfirm their preexisting beliefs (that is, the story that would surprise them), and they enjoyed that surprise.
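As a rough illustration of that analysis logic, the sketch below scores an invented 12-item scale and compares story choices for higher- and lower-curiosity participants. Everything here is hypothetical; Kahan et al.'s actual scale construction and modeling were considerably more sophisticated.

```python
# Invented illustration: sum a 12-item curiosity scale, then check whether
# higher-curiosity participants more often pick the belief-disconfirming
# ("surprising") science story.
from statistics import mean

def curiosity_score(items):
    """Total score across the 12 items, each rated 0-4."""
    assert len(items) == 12
    return sum(items)

# (12 item responses, chose the surprising story?)
participants = [
    ([4, 3, 4, 4, 3, 4, 4, 3, 4, 4, 3, 4], True),
    ([1, 0, 1, 2, 1, 0, 1, 1, 0, 1, 2, 1], False),
    ([3, 3, 2, 3, 4, 3, 2, 3, 3, 4, 3, 3], True),
    ([0, 1, 1, 0, 2, 1, 1, 0, 1, 1, 0, 1], False),
]

scores = [curiosity_score(items) for items, _ in participants]
cutoff = sorted(scores)[len(scores) // 2]  # median split

high = [chose for (_, chose), s in zip(participants, scores) if s >= cutoff]
low = [chose for (_, chose), s in zip(participants, scores) if s < cutoff]

print("chose surprising story (high curiosity):", mean(high))  # 1.0
print("chose surprising story (low curiosity):", mean(low))    # 0.0
```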

The researchers conclude their paper as follows:

Together these two forms of evidence paint a picture—a flattering one indeed—of individuals of high science curiosity. On this view, individuals who have an appetite to be surprised by scientific information—who find it pleasurable to discover that the world does not work as they expected—do not turn this feature of their personality off when they engage political information but rather indulge it in that setting as well, exposing themselves more readily to information that defies their expectations about facts on contested issues. The result is that these citizens, unlike their less curious counterparts, react more open mindedly, and respond more uniformly across the political spectrum to the best available evidence.

From a litigation advocacy perspective, if we can identify the potential jurors who are curious and enjoy the surprise of learning new things that disconfirm pre-existing beliefs, we have a better chance of getting them to listen to case facts and come to a different conclusion than they would have reached before hearing the new information. What we have to do is figure out how to surprise them, and we have several blog posts on what happens to our brains when we experience surprise.

You can read more about the development of the initial Science Curiosity Scale at the SSRN website.

Kahan, Landrum, Carpenter, Helft, & Jamieson (2016). Science curiosity and political information processing. Advances in Political Psychology.

Full-text of this article is here: http://www.culturalcognition.net/browse-papers/science-curiosity-and-political-information-processing.html

12 Sources of Bias in Forensic Neuropsychology and How to Mitigate Them

Whether you are involved in criminal or civil litigation, before long you are likely to run into a forensic neuropsychologist and a neuropsychological exam. A new article (directed mostly at civil litigation involving adults) discusses 12 forms of bias and how to mitigate them. You may want to review it carefully (or have an expert witness review it carefully) prior to trial. The article is written by three practicing forensic neuropsychologists and is intended to assist the expert witness as well as both sponsoring and examining attorneys. For the purposes of this blog post, which is only meant to raise your awareness of this resource, we will list the 12 forms of bias identified, along with the authors' recommendations on how to mitigate them. This is an information-rich resource, so for additional background and detail, please review the article itself.

Logistical and administrative biases (or how the neuropsychologist has arranged the evaluation and the sources of information upon which they rely).

Conflating clinical and forensic roles. There is a clear distinction between these roles and they should not be mixed. The authors give specific examples and describe the differences between a treating expert and a forensic neuropsychologist charged with assessing and writing a report but not with treatment or advocacy.

Financial/payment bias. The authors describe payment arrangements on a continuum from “straightforward to murky to highly biased”. They recommend a “fee for service” arrangement and offer examples of how alternate arrangements can be questioned in open court.

Referral source bias. The authors describe "Rule 26 disclosure" and how forensic neuropsychologists repeatedly retained by a specific attorney can be seen as "hired guns" by jurors. They also describe multiple ways you can "see" a referral source bias in a testifying expert.

Self-report bias. The authors describe how some evaluators forget the importance of verifying the examinee's self-report against workplace, school, and family reports and prior testing to ensure the reports are accurate. They discuss secondary gain, misremembering of pre- and post-injury events, and situation-specific amnesia.

Statistical biases (under-utilization of base rates and ignoring normal variance in test scores).

Under-utilization of base rates. Base rates are often confusing for jurors, and it is important that a neuropsychologist use them accurately; the authors stress evidence that neuropsychologists are often unaware of base rates and underuse them in their evaluations.

Ignoring normal variance in test scores. Another statistical bias is failing to understand normal variance in test scores and thus drawing inappropriate conclusions. (A numeric sketch of both statistical points follows below.)
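Both statistical points reduce to simple arithmetic, so here is a minimal sketch with invented numbers. First, Bayes' rule turns a base rate plus a test's hit and false-alarm rates into the probability that a positive finding is real; second, across a battery of many subtests, a few "abnormal" scores are expected from normal examinees by chance alone. None of the figures below come from the article.

```python
# Two invented illustrations of the statistical biases described above.

# 1) Base rates: how likely is impairment given a positive test sign?
def positive_predictive_value(base_rate, sensitivity, false_positive_rate):
    """Bayes' rule: P(condition | positive test)."""
    true_pos = base_rate * sensitivity
    false_pos = (1 - base_rate) * false_positive_rate
    return true_pos / (true_pos + false_pos)

# A sign that detects 90% of impaired examinees with a 10% false-alarm
# rate sounds impressive, but if impairment has only a 10% base rate in
# the referral population, a positive result is right just half the time.
print(positive_predictive_value(0.10, 0.90, 0.10))  # 0.5

# 2) Normal variance: with 20 subtests, how often does a perfectly normal
# examinee score below the 5th percentile on at least one of them?
p_low, n_tests = 0.05, 20
print(1 - (1 - p_low) ** n_tests)  # ~0.64
# (Real subtests are correlated, which lowers this somewhat, but scattered
# low scores remain the norm, not the exception.)
```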

Cognitive, personal and attributional biases.

Confirmation bias. This is a bias we often discuss on our blog, and it is also a trap for the unwitting evaluator. Essentially, confirmation bias occurs when you look only for evidence that supports your pre-existing hypotheses rather than testing those hypotheses against the data.

Personal and political bias. While this may seem an obvious bias for the evaluator to guard against, it is commonly seen, according to the authors. They also discuss a term from the psychotherapy arena, countertransference, and warn that examinee characteristics "such as age, attractiveness, gender, ethnicity and socioeconomic status" can bias the examiner either toward or against the examinee.

Group attribution error. This occurs when the examiner makes an assumption about an individual based on the belief that the "individual's traits are representative of an entire group". This extends far beyond race and ethnicity; the authors offer examples of examiners who think everyone with Alzheimer's should present in a certain fashion, that everyone with a head injury should have common symptoms, or that everyone with fibromyalgia has a somatoform disorder.

Diagnosis momentum. This is the tendency for a diagnosis to be seen as unquestionably accurate as increasing numbers of people adopt that specific diagnosis, rather than performing a complete evaluation to ensure the validity of the diagnosis of record. This could obviously have a major impact on case outcome.

Good old days bias. This is a bias held by the examinee rather than the examiner, and it may result in self-reports that overstate the level of past function. It makes the examination of prior records imperative; its presence is often seen as a hallmark of a "psychological process that occurs post-injury".

Overconfidence. This bias happens when an individual neuropsychologist grows sloppy in their work because they feel experienced enough to “know the truth”.

Naming biases seems to be epidemic, kind of like coming up with clever Twitter hashtags. Ultimately, the point is that people try to make sense of confusing or disruptive thoughts and feelings as quickly and effortlessly as they can, even if it requires torturing the truth. Overall, the authors acknowledge there are countless other biases that exist and this is a starting point for assessment of a forensic neuropsychological evaluation. They offer multiple strategies for the forensic evaluator to defend against biases (and thus for the attorney who wishes to examine potential sources of bias in the report). This is a useful resource to keep on hand and use to assess biases that may be present in court-ordered forensic neuropsychological reports.

Effective trial strategies for reducing biases often come from teaching jurors what the possible biases are and how making smart, correct judgments requires ignoring or avoiding them. Warn jurors how tempting it can be to race to conclusions, point out some of the pitfalls, and tip them off that being seduced by these false impressions will be not only a source of error but, for everyone who wants to be correct, a source of regret.

Richards, P., Geiger, J., & Tussey, C. (2015). The dirty dozen: 12 sources of bias in forensic neuropsychology with ways to mitigate. Psychological Injury and Law, 8(4), 265-280. DOI: 10.1007/s12207-015-9235-1

Things You Want to Know: Stereotypes, biases, defensiveness, and when work strikes awfully close to home

This is a conglomeration of articles we thought were interesting and useful but chose not to devote an entire post to. Think of this as a series of pieces that might pique your interest and make you want to learn more. We'll provide links so it's easy to do just that.

Christians and Science: A new stereotype threat?

You’ve probably heard about how women reminded of how men perform better in math do more poorly on math tests than those not reminded. Or about how African-Americans perform more poorly on standardized tests when reminded they tend to do so. Studies like these have been around for the past couple of decades. But here’s a new one—at least to us. A new study says that Christians are stereotyped as being less competent in science and so they do less well on scientific tests and tasks! We wonder whether this is really a “stereotype threat” since those are typically descriptive of minority groups and Christians remain a majority group in this country. Regardless, it’s an interesting factoid.

White People Have Hardships Too

Matt Damon recently apologized for whitesplaining, and Miley Cyrus ran into trouble with Nicki Minaj over the same issues. White people seem to have trouble accepting how different their lives are thanks to privilege. We've seen this before in the professional literature, but the studies keep coming, with very similar findings. In this new study, Whites respond to evidence that they are privileged by their race by focusing on all the hardships they endure. The study finds that having (White) people "self-affirm" before they are shown evidence of privilege results in fewer claims of hardship (due to decreases in defensiveness).

When work strikes too close to home: Suppose you have the brain of a psychopath?

This is one of those things you just can't make up. Well, you could make it up, but no one would believe you. Suppose you have researched the brain scans of psychopaths for more than 20 years, and suddenly you look at your own brain scan and it looks like a psychopathic murderer's brain. That is apparently what happened to Jim Fallon, who then went on to discover (at least based on a story told by his mother) a familial connection to Lizzie Borden (the famous accused ax murderer). Because he does what he does, Fallon went out and checked the genes and brains of his relatives. He was the only one with the brain of a psychopath. The definition of 'normal' is now (at least) a fragile one for Jim Fallon.

You too can reduce prejudice and “turn people into atheists” [if you buy a big machine]

All it apparently takes is a quick zap, aka "transcranial stimulation of the posterior medial frontal cortex". Oh, is that all? A new publication from collaborating researchers in the US and UK shows that if you "stimulate" the brain area linked to "responses to threats", you can reduce the prejudicial views held by the person being "stimulated" (by about a third in this study). Oddly, not only does it reduce prejudice (in this study, against immigrants), it also reduces religious belief. The researchers say that when you are challenged, you defend yourself; by stimulating this brain area, they lower the need for defensiveness. They make no real comment on what the reduction in religious conviction means, although a writeup of the study makes it clear they have been accused of "turning people into atheists". And if you are left feeling vulnerable, you will likely find this quote comforting:

“It’s worth mentioning that this technique requires a very loud, very expensive, fairly large machine operated by a technician who’s an expert, and there’s no way that I can conceive of that this kind of magnetic energy could be directed into anyone’s brain without their knowledge.”

Personally, we cannot imagine what kind of people sign up to have their brains zapped by a big, loud machine that is described as changing your brain activity, in the name of science rather than as a life-saving intervention.

Phillips, L. T., & Lowery, B. S. (2015). The hard-knock life? Whites claim hardship in response to racial inequity. Journal of Experimental Social Psychology, 61, 12-18.

Rios, K., Cheng, Z., Totton, R., & Shariff, A. (2015). Negative stereotypes cause Christians to underperform in and disidentify with science. Social Psychological and Personality Science, 6(8), 959-967. DOI: 10.1177/1948550615598378
