Archive for the ‘Bias’ Category
I first heard the term “over-valued belief” back in the mid-1990s when I worked in forensic rehabilitation with a man adjudicated not guilty by reason of insanity. He had been very ill (psychotic) and very violent when unmedicated (and had killed more than once due to delusional beliefs) but had been in treatment and well-medicated for years when I met him.
One day he confided that he had been late for our treatment group because he couldn’t stop flushing the toilets on his ward. Later I asked him what he meant and he explained that when the State Legislature was in session and voting on bills, he felt he could also “vote” and perhaps sway their opinions. If he flushed a toilet at the right end of the group bathroom, it was a vote for the Republican position, and if he flushed a toilet at the left end, it was a vote for the Democratic position.
I asked him if the strategy worked and he grinned at me—“If I thought it worked, it would be a delusion and I am not delusional anymore. It’s just an over-valued belief at this point.” When I persisted by tilting my head and looking curious, he grinned more widely—“At this point, I can’t stop myself from doing it sometimes ‘just in case,’ but it only happens with bills that are really important.”
That lesson stuck with me, so when I saw this article on the importance of defining the difference between a delusional belief and an over-valued idea—I knew it would end up as a blog post. It’s a good distinction to be aware of and perhaps especially important for those working in the criminal justice system.
In the aftermath of violent acts such as mass shootings, many people assume mental illness is the cause. After studying the 2011 case of Norwegian mass murderer Anders Breivik, University of Missouri School of Medicine researchers are suggesting a new forensic term to classify non-psychotic behavior that leads to criminal acts of violence.
“When these types of tragedies occur, we question the reason behind them,” said Tahir Rahman, M.D., an assistant professor of psychiatry at the MU School of Medicine and lead author of the study. “Sometimes people think that violent actions must be the byproduct of psychotic mental illness, but this is not always the case. Our study of the Breivik case was meant to explain how extreme beliefs can be mistaken for psychosis, and to suggest a new legal term that clearly defines this behavior.”
Breivik, a Norwegian terrorist, killed 77 people on July 22, 2011, in a car bombing in Oslo and a mass shooting at a youth camp on the island of Utøya in Norway. Claiming to be a “Knights Templar” and a “savior of Christianity,” Breivik stated that the purpose of the attacks was to save Europe from multiculturalism.
In other words, when people commit violent acts (like mass murders), many others assume mental illness was involved. For the most part, we are unable to imagine a rationale for such acts, and so we explain them to ourselves by presuming the killer must be insane. So, if someone commits mass murder, the armchair observer often “diagnoses” the killer with mental illness and/or psychosis. While it may make intuitive sense (e.g., “No one in their right mind would do that….”), it is often, nonetheless, inaccurate.
That is where the forensic examiner enters the scene to see whether the level of thought disturbance meets the legal bar for murder driven by delusions. The field of forensic evaluation is very complicated, and there are specific rules about the height of the bar one must clear (in very technical terms) in order to be declared incompetent to stand trial, or to be found competent to stand trial but ultimately found not guilty by reason of insanity or guilty but mentally ill.
When a forensic evaluator adjudges a defendant not legally responsible for having performed an unthinkable act (such as killing one’s family, child, or a group of random strangers), there are generally delusional beliefs (e.g., “I thought my mother was the devil”) driving the behavior. And there are strict definitions for what constitutes a delusional belief (see the DSM-5 diagnostic manual’s criteria here). So today’s researchers use the example of that well-covered mass murder in Norway to explain the killings were not driven by delusional beliefs (the legal bar) but rather, by non-psychotic “extremely over-valued beliefs”.
They define that new term by quoting the work of another author (McHugh in 1998) and say that extreme over-valued beliefs are typically accompanied by fanaticism:
An extreme over-valued belief is one that is shared by others in a person’s cultural, religious, or subcultural group. The belief is often relished, amplified, and defended by the possessor of the belief and should be differentiated from a delusion or obsession. The idea fulminates in the mind of the individual, growing more dominant over time, more refined, and more resistant to challenge. The individual has an intense emotional commitment to the belief and may carry out violent behavior in its service. It is usually associated with an abnormal personality.
So one with an extreme over-valued belief may still commit very violent acts “in service” of that belief, but they would not meet criteria for psychosis and would clearly understand that what they were doing was wrong. From a legal perspective, they would be potentially guilty and subject to punishment. The authors say that “the court ultimately had to draw a line” in the Norway case and concluded that the shooter’s beliefs were “neither bizarre nor delusional” and noted “the evaluators who opined that he was not criminally responsible should have consulted experts on right-wing ideologies before concluding that his grandeur was culturally implausible”.
In short, having extremely weird or bizarre beliefs is not the same as being mentally incompetent. This is a distinction worth keeping in mind during election years…
The authors (three prominent psychiatrists) say that “extremely over-valued beliefs” are going to be rigidly held (like delusions) but will be non-delusional. They close with two uncommonly clear sentences summarizing why they see this contribution as important.
The fact that a defendant committed a crime because of a delusional belief is a common basis for an insanity defense. It is therefore critically important that forensic psychiatrists properly identify a defendant’s belief as either a delusion or as an extreme over-valued belief.
From a litigation advocacy perspective, the takeaway for the prosecutor is that if a person’s behavior is driven by delusions, they may be successfully treated with medication. There is no medication that will help with the intractable extreme over-valued beliefs. The defendant is thus a potential danger to society since these beliefs are just as intractable as psychotic delusions.
While the distinction is a good one to be aware of—the reality is that juries may well see the organized, plotting, and planning (probably psychopathic) predator with the “extremely over-valued beliefs” as potentially more dangerous than the mentally ill individual whose delusions will stop driving behavior when properly medicated. It makes sense for forensic examiners to be capable of differentiating between delusions and over-valued beliefs, but for the layperson juror—these are just “two very scary” defendants and it’s likely they will want them both locked up.
Rahman, T., Resnick, P.J., & Harry, B. (2016). Anders Breivik: Extreme beliefs mistaken for psychosis. The Journal of the American Academy of Psychiatry and the Law, 44(1), 28-35. PMID: 26944741
In a word, maybe. Apparently, it all depends on whether your focus is on differences or similarities between you and others when it comes to genetic makeup. The researchers had Jewish and Arab participants read a news article which (naturally) cited a scientific article reporting either high genetic similarities or high genetic differences between Jews and Arabs.
The findings were consistent: if you read the article saying Jews and Arabs were genetically very different—you were going to describe the other group as more violent, unfriendly, and just plain mean than you would if you had just read the article saying Jews and Arabs were genetically quite similar.
In a follow-up experiment, Jewish participants were told (you know the researchers lied about this) they were playing a computerized game against an Arab opponent in the next room. The winner of the game could give their opponent a loud blast of noise (and, importantly, they could turn the noise up to the volume of a fire alarm).
Those Jewish participants who had read about genetic differences blasted the room allegedly containing their Arab opponent with more intense noise blasts than did the Jewish participants who’d read about genetic similarities.
Finally, a third study asked Jewish participants (again, either reading the news article stressing similarity or the article stressing differences in genetic makeup between the two groups) to rate their support for making peace with Palestinians.
This time they found that those who had read about genetic similarities were more supportive of conflict resolution.
So, next the researchers took their study on the road to Israel (obviously, a site of conflict between Jews and Arabs). With Israeli Jews, genetic differences were important.
Jewish Israelis (surveyed on Israeli commuter trains) were more supportive of violent, war-like policies toward Palestinians after reading about their own genetic differences from Arabs.
The authors think this is powerful (and very scary) information for us to incorporate. They say most DNA ancestry test results don’t disclose overlaps with other ancestry populations, and this appears to have the potential to result in more negative views of those who are different from us. Two of the four authors wrote a summary of their results for Scientific American, which concludes this way:
We suggest that DNA ancestry services remind us that our ancestry results are actually based on much less than 0.1 percent of our genes. We also suggest that organizations like International Crisis Group and Genocide Watch pay particular attention when propaganda highlights warring groups’ genetic dissimilarities. The popular media should also be highly cautious when reporting on groups’ genetics. News and magazine articles are frequently reporting on the degree of DNA overlap between groups with a history of conflict—Hutu and Tutsi, Jews and Arabs, White Europeans and Roma, Russians and Ukrainians, English and Irish—yet rarely make clear that there is in fact no genetic basis for race. [Note: For more on this line of thought, see this thought-provoking piece at the Huffington Post.]
Until then, when encountering information about how our DNA is different from other populations, we must remind ourselves that these variations are in fact minuscule. If we fail to, it can have drastic consequences.
From a litigation advocacy perspective, the tendency for people everywhere to draw greater distinctions than truly exist between “us” and “them” is why we recommend the use of “universal values” to help very different jurors in your specific venue see the defendant as more similar to them than different from them. We’ve blogged about the importance of building universal values into case themes and witness testimony often, and encourage you to read through those recommendations. This research says it may be even more important than you imagine.
Kimel, S., Huesmann, R., Kunst, J., & Halperin, E. (2016). Living in a genetic world: How learning about interethnic genetic similarities and differences affects peace and conflict. Personality and Social Psychology Bulletin, 42(5), 688-700. DOI: 10.1177/0146167216642196
Earlier this week we wrote a post about how to invoke morality as a persuasive strategy with your jurors. Now Gallup has helped us by identifying the moral values most Americans agree on and the five about which they most disagree.
Gallup measures views on moral issues each year (since 2001) as part of their tracking of attitude shifts on social issues. They assign respondents to one of five religious groups (e.g., No religion, Jewish, Catholic, Protestant, Mormon) and then measure their attitudes on various social issues to determine what they see as moral and not moral. True, it is not a complete religious typology, but it is an interesting start.
They vary a bit from their typical single (annual) survey presentations by combining all their data from 2001 through 2016: “Results for this Gallup poll are based on combined telephone interviews in Gallup’s 2001 through 2016 annual Values and Beliefs poll, conducted each May with random samples of U.S. adults, aged 18 and older, living in all 50 U.S. states and the District of Columbia”. This gives them a total sample size of 16,754 Americans opining on moral issues.
Here are the moral issues which most religious groups in the US generally agree are either “morally acceptable” or “not morally acceptable”:
Divorce, death penalty, wearing clothing made of animal fur, medical testing on animals—are all viewed as morally acceptable with more than 50% of respondents agreeing.
On the other hand, suicide, cloning humans, polygamy, and extramarital affairs are seen as not morally acceptable (again, as measured by less than 50% of Americans surveyed agreeing they were morally acceptable behaviors).
And here are the moral issues on which religious groups in the US generally disagree (that is, some groups see them as acceptable while others do not):
Abortion, doctor-assisted suicide, cloning animals, gay-lesbian relations, having a baby outside marriage.
We’d consider these five to be “hot button issues” which may make jurors close their minds to the facts of your case rather than considering the circumstances involved. Intriguingly, one of the religious groups measured (the Mormons) was distinctly different when it came to their views on premarital sex, stem-cell research, and gambling.
Mormons are more likely than other religious groups to view stem cell research negatively, by a slight margin (54%). They see premarital sex as clearly morally unacceptable (71%), and gambling is viewed askance as well (with 63% saying gambling is morally unacceptable).
While it is important to stay abreast of research pointing toward new litigation advocacy strategies like our post on “making it moral”, it is also important to keep up with changing attitudes toward social issues and how religious beliefs and affiliations may result in differing attitudes from the norm. Know your venue, know your jurors, and keep up to date as societal attitudes shift and sway.
Sometimes we find articles we want to blog about almost immediately and other times we go through a lot of reading to identify something appropriate for a post. But along the way we almost always have tidbits we thought intriguing, resonant of a past post or series of posts, esoteric, or just plain weird. When we pull together enough of them for a post of assorted “conversation starters”, you know we’ve been reading a lot more than we’re posting!
Calm down, you are not addicted to your smartphone!
You simply have an anxious attachment style. The BPS Research Digest returns to a topic we’ve covered here before called nomophobia—which describes the anxiety experienced when we have no cell phone in our possession. They describe research completed in Hungary which says that everyone would experience anxiety over not having a cell phone—it is just expected in today’s society. The researchers say that we should think of our relationship with our phones in terms of attachment theory. They suggest that anyone who has a fear of abandonment (an attachment issue) in their human relationships is likely going to be more anxious about being separated from their phone as well—it’s just an anxious attachment style. You were a worrier before, so you also worry about not having your phone. You feel better now, right?
When DNA implicates the innocent & Eyewitness identification errors
In the event you missed them, Scientific American has had really good articles on the legal system recently. Don’t miss this article highlighting times when DNA is very, very wrong or this one on how level of certainty in eyewitnesses can improve the efficacy of police lineups. Both are worthy of your time to read.
How to sound charismatic
We’ve written about deep voices and how appealing they can be and now here is an article from the Atlantic dissecting how politicians vary their voice pitch and tone during speaking engagements in order to appeal to the widest audience possible. It’s disturbing.
We’ve written a lot about other kinds of self-appointed experts on your jury (and how to dethrone them) but today’s work is a reflection of another aspect of perceived expert status.
When you think you already know a lot about something, you can become closed-minded. You finish the testimony before the witness does. A closed mind is a problem everywhere, but in a jury room it is dangerous.
We’ve seen this a lot in pretrial research (like this post about a retired teacher named ‘Victoria’) but today’s research tells us that when you see yourself as a relative expert on an issue—you are less likely to be open to other information and/or opinions.
It’s an assumption that is somewhat counter-intuitive, since “real experts” need to be open to new information in order to remain “experts” as new knowledge is identified. Yet these “self-appointed” experts became quite dogmatic across all six experiments the researchers conducted. The researchers label this tendency the “earned dogmatism effect”—likely a close relative of the Dunning–Kruger effect.
A relatively easy example is when someone (for example, a doctor or a nurse in a personal injury case) is required to set aside their professional knowledge and rely solely on the testimony offered at trial. Their training and experience are not evidence, so if they believe something to be true that is inconsistent with the evidence, they are to set aside their experience, not the evidence.
Of course, humans rarely can do that. Typically, such actual ‘experts’ are stricken from the jury. The greater problem is the informal ‘experts’ who think that because they can fix cars they know why a jet engine failed, or because they are married to a bookkeeper they understand the nuances of complex tax fraud. These informal experts are often much more difficult to identify, especially in courts where attorney voir dire is limited or prohibited.
From a litigation advocacy perspective, you want jurors to be listening to new information you are presenting and we’d encourage you to review our earlier posts on how to maximize the chances of that happening and how to teach jurors to disrupt this self-appointed expert during deliberations. Self-appointed experts can range from retired schoolteachers like Victoria to shade tree mechanics and everything in between—you often don’t know they are there until they make themselves known verbally.
Ottati, V., Price, E., Wilson, C., & Sumaktoyo, N. (2015). When self-perceptions of expertise increase closed-minded cognition: The earned dogmatism effect. Journal of Experimental Social Psychology, 61, 131-138 DOI: 10.1016/j.jesp.2015.08.003