Archive for the ‘Forensic evidence’ Category
A recent symposium for IT executives included a presentation that pitched the idea of genetic screening of job applicants for traits like “honesty, leadership, being a team player, and having a high level of emotional intelligence”. While we think you may want to hang onto your checkbook if offered this sort of service, it is a disturbing outgrowth of the burgeoning research into genetic testing for almost everything. Here is a quote from the Seeker website which brought this possibility to our attention:
Although federal and state laws prohibit employers from requesting or using an employee’s genetic information, genetic testing is mainstream. Millions of people voluntarily pay to have their genomes analyzed thanks to inexpensive DNA kits available from companies like AncestryDNA, Genome, 23andMe, and Family Tree, to name a few. And research is moving forward in fields such as psychiatric genetics, trying to find correlations between genes and behavior.
“We fully appreciated the lack of legality and some of the issues with the science,” Furlonger told Seeker by email. “Nonetheless, it seems clear that work is being undertaken and therefore the current state should not be ignored.”
We are glad they appreciate the “lack of legality”. (Some researchers do not acknowledge the legal concerns—like this group on how to hire the “good psychopath” by testing them pre-hire.) The actual best answer to this question is that there is no gene for leadership (or honesty, or being a team player, or having high emotional intelligence) and there is no way testing of this sort would be useful to a company trying to figure out who to hire.
Neurolaw researchers (like Hank Greeley) are speaking up against this strategy:
“Why would an employer rely on imperfect, and generally weak, associations between genes and test scores instead of relying directly on the test scores?” said Henry Greely, director of the Center for Law and the Biosciences at Stanford University and chair of the steering committee of the Center for Biomedical Ethics. It’s like running, he said. Rather than look for genetic variations that indicate whether someone is a good sprinter or not, just watch the person sprint. That ought to tell you all you need to know.
We agree and are glad to have voices of reason speaking out against the desire to “push the hiring envelope” into areas that make no sense and violate medical privacy (as well as statistical integrity). Because while genetic testing can’t tell you anything about the purported target traits, it can tell you things about the person that should not be a factor in hiring (including gender, possibly ethnicity, and medical issues). Will genetic testing results be a tool to worsen the problems of women and non-Asian minorities in breaking into STEM fields? Here’s what we wrote in August 2016 when we came across the “good psychopath” workplace fit test. We think it works for this idea too.
From a law office management perspective, we really would urge rejecting this sort of strategy. What they seem to intimate is that you want to find the 10% of the psychopathic population who have moderate psychopathic tendencies, then divide them into primary and secondary psychopaths, and then figure out which of the primary psychopaths have really good social skills so their behaviors will not wreak havoc in your workplace.
Putting on our duly licensed Psychologist hats for a moment, the distinction seems to be a very slippery slope. Secondary psychopaths are trouble from the beginning. Primary psychopaths have better social skills so they can manage the day-to-day more successfully, but under stress they are going to create havoc, too. And we have never seen a trial team that isn’t under terrific stress. It is the nature of litigation, and stress tolerances need to be higher than average, not a potential area of weakness.
The authors put a troubling amount of faith in a psychological trait scale, when you can assess the same things by looking at work history, length of relationships, and having your own warning signs on high alert during the interview process. Use your intuition about whether someone will be a good fit. It is also risky to assume you can “get around” the Americans with Disabilities Act by using the PPI-R scale with job applicants when what you are measuring is psychopathy and resulting goodness of fit in your workplace.
And a high-functioning psychopathic attorney is just the kind of person to drag you through a lawsuit by claiming that you rejected him or her based on an ADA protected factor.
We’ve written about CRISPR (aka gene editing) before and even about concerns of Americans about use of emerging technologies, and while this post is sort of about CRISPR—it is also about visual evidence done right.
We often work on cases where jurors will need to understand very complex information. It may be a patent case or a complex business litigation case or something else that is technically daunting—but jurors often need to understand something very complicated. And often that something is very technologically advanced (and thus intimidating to the jurors).
It is almost always a very difficult process for the attorneys in a complex case (in which they have often been buried for years) to see through the many details of a complicated technology and tell a simple (yet accurate) story for jurors. We often test visual evidence in our pretrial research to see what resonates with jurors, what they remember, and what helps them to make sense of abstract and esoteric technology, processes, or patented ideas.
When we see terrific examples of visual evidence (culled from many different areas) we like to share them here to help you understand there really is a way to take very, very complex facts and details and make them accessible to those who have no experience whatsoever in the area and may be very intimidated by even attempting to understand the information.
Here is just such a video tutorial. This video uses cartoon images and plain language to explain the gene editing technique referred to as CRISPR. While the last parts of the video place it clearly in the pro-CRISPR camp, the first parts explain the technology clearly and succinctly. Because it is in a cartoon format (with which we are all familiar from childhood) it is non-threatening. Since it is visually presented, we are able to understand a tremendous amount of technical information without jargon or numbers that make less technical viewers’ eyes glaze over.
If CRISPR can be explained in a few minutes of cartoons, you can explain anything in ways the most naïve juror can understand. All you need is a fabulous visual evidence consultant. We happen to know a few of them!
This will shock you, or maybe relieve you: Psychopaths are different from the rest of us. Here’s another article saying there are measurable differences in how the brains of criminal psychopaths work (and look) when compared to those of non-criminal psychopaths (those who have psychopathic traits but have not been convicted of criminal offenses) and non-psychopaths.
While many criminal offenders have psychopathic traits, there are some psychopaths who never commit offenses (at least, for which they are convicted). Today’s researchers wanted to see if there were “brain differences” visible on an MRI. They tested 14 convicted psychopaths and 20 non-criminals—half of whom had a high psychopathy scale score but had not been convicted of any offenses. This is a very small sample but, as the researchers comment, it is the first time convicted offenders have actually been examined.
They found a few differences and the following is a summary of their findings:
Psychopaths (both criminal and non-criminal) have stronger reward centers in their brains
To clarify, the brain’s reward center—called the nucleus accumbens—“is responsible for recognizing and processing the rewards and punishments that follow from our actions”. The researchers had participants perform various tests while in an MRI scanner to measure brain activity. Those who had no significant psychopathic traits had a weaker response in the brain’s reward center than did both the criminal and non-criminal psychopaths.
Low self-control and less response to reward in criminal compared to non-criminal psychopaths
Good communication between the reward center of the brain and an area in the mid-brain is seen as reflecting good self-control. The authors found that criminal psychopaths had weaker communication between those brain areas than non-criminal psychopaths did. While this is the first time criminal psychopaths were actually examined in this way (and there were only 14 of them), the researchers think it possible that the tendency to commit a criminal offense stems from a combination of a lack of responsiveness to reward and a lack of self-control.
Among the other lessons learned was a sense that when your reward center is extremely sensitive, you may be more likely to behave impulsively. The researchers think a sensitive reward center may be more predictive than a lack of empathy but obviously follow-up studies are needed. They also think that if future studies continue to show the brain plays an important role in criminal behavior—we may yet see brain scans being used in forensic examinations for diminished responsibility down the road.
While neurolaw advances are not being published as quickly as they were for a while, there are still multiple researchers working on the question of responsibility for criminal acts when your brain is demonstrably different from a non-psychopath. This is an interesting line of research in terms of comparing criminal psychopaths to non-criminal psychopaths and non-psychopaths. The small sample size is a concern and we need to wait for larger samples but the ideas are ones we think likely to continue to spark new research until we have to deal with these questions of responsibility in the courtroom. We’ve written about this area frequently so if you’d like to see what our mock jurors say in pretrial research, take a look at the neurolaw category in our blog.
Geurts DE, von Borries K, Volman I, Bulten BH, Cools R, & Verkes RJ (2016). Neural connectivity during reward expectation dissociates psychopathic criminals from non-criminal individuals with high impulsive/antisocial psychopathic traits. Social Cognitive and Affective Neuroscience, 11 (8), 1326-1334 PMID: 27217111
It’s time for another installment of strange tidbits we’ve gathered as we have read potential articles for blog posts. This week we have information on why you would stick something icky and repulsive into your mouth, online anonymity, bias against homosexuals, and what horrible things can happen should you choose to ‘unfriend’ that person on Facebook who really annoys you.
Disgusting and repulsive is what that is—tell me more!
The popularity of television shows like Fear Factor tells us that we humans are drawn to disgusting and repulsive things. Some researchers (Hsee and Ruan cited below) think our curiosity drives us to risk negative outcomes (much like Pandora). There is a thorough write-up on this article over at Scientific American that is worth your time to review—although it is likely a good idea to not eat while doing so.
You are likely not as anonymous online as you think
Now this is sort of scary. Many of us want to be anonymous online as we go about our daily business. But a new research study says researchers can identify who you are just by the way you browse the internet. Apparently, each of us creates a “unique digital behavioral signature”, and “they” can know way too much about you based on how you wield that electronic mouse or touchpad. Within a half hour of monitoring you, the researchers say they can measure personality characteristics like “openness to new experiences, conscientiousness, extraversion, agreeableness and neuroticism”. That’s pretty scary. The researchers appear very excited about this and seem eager to sell their strategies to online marketers. [I think these researchers should be denied tenure just on principle.]
How do we feel now about lesbian women and gay men?
There has been a cultural shift underway in the US in attitudes toward homosexuals. Some have wondered if there really is a change underway or if people just feel pressured to express more support for gay men and lesbian women. Now there is research published in a new open access journal called Collabra that says this societal change really has occurred. A team of researchers found that implicit or unconscious bias against lesbians and gays was down 13% in 2013 when compared to 2006. Nearly all demographic groups showed decreases in bias against homosexuals over that 7 year period which suggests the change is not just politically correct but actually real.
You may want to consider alternatives to “unfriending” on Facebook once you read this
Imagine you live in a “sleepy mountain town” with your young spouse and infant child. Then imagine you have been murdered (although your child survived) and no one can figure out who did it because, “everyone” liked you. You don’t really have to imagine since you can read the story of what happened to a young couple after they ‘unfriended’ a woman on Facebook. It’s a sadly bizarre tale of catfishing and loneliness and perhaps some psychopathy. Here’s a quote from the assistant district attorney’s opening statement to the jury:
“This is going to be the stupidest thing you’ve ever heard. This is going to be the craziest thing you’ve ever heard. There is nothing in your lives or background that has prepared you to understand the Potter family.”
And to that we say, “Amen”. We would also like to mention that you can ‘unfollow’ rather than ‘unfriend’ to get someone out of your timeline without inciting homicidal rage.
Hsee CK, & Ruan B (2016). The Pandora Effect: The Power and Peril of Curiosity. Psychological Science, 27 (5), 659-66 PMID: 27000178
I first heard the term “over-valued belief” back in the mid-1990s when I worked in forensic rehabilitation with a man adjudicated not guilty by reason of insanity. He had been very ill (psychotic) and very violent when unmedicated (and had killed more than once due to delusional beliefs) but had been in treatment and well-medicated for years when I met him.
One day he confided that he had been late for our treatment group because he couldn’t stop flushing the toilets on his ward. Later I asked him what he meant and he explained that when the State Legislature was in session and voting on bills, he felt he could also “vote” and perhaps sway their opinions. If he flushed the toilet at the right end of the group bathroom it was a vote for the Republican opinion and if he flushed a toilet at the left end of the group bathroom it was a vote for the Democrat perspective.
I asked him if the strategy worked and he grinned at me—“If I thought it worked, it would be a delusion and I am not delusional anymore. It’s just an over-valued belief at this point”. When I persisted by tilting my head and looking curious, he grinned more widely—“At this point, I can’t stop myself from doing it sometimes “just in case” but it only happens with bills that are really important”.
That lesson stuck with me, so when I saw this article on the importance of defining the difference between a delusional belief and an over-valued idea—I knew it would end up as a blog post. It’s a good distinction to be aware of, and perhaps especially important for those working in the criminal justice system.
In the aftermath of violent acts such as mass shootings, many people assume mental illness is the cause. After studying the 2011 case of Norwegian mass murderer Anders Breivik, University of Missouri School of Medicine researchers are suggesting a new forensic term to classify non-psychotic behavior that leads to criminal acts of violence.
“When these types of tragedies occur, we question the reason behind them,” said Tahir Rahman, M.D., an assistant professor of psychiatry at the MU School of Medicine and lead author of the study. “Sometimes people think that violent actions must be the byproduct of psychotic mental illness, but this is not always the case. Our study of the Breivik case was meant to explain how extreme beliefs can be mistaken for psychosis, and to suggest a new legal term that clearly defines this behavior.”
Breivik, a Norwegian terrorist, killed 77 people on July 22, 2011, in a car bombing in Oslo and a mass shooting at a youth camp on the island of Utøya in Norway. Claiming to be a “Knights Templar” and a “savior of Christianity,” Breivik stated that the purpose of the attacks was to save Europe from multiculturalism.
In other words, when people commit violent acts (like mass murders), many others often assume mental illness was involved. For the most part, we are unable to imagine the rationale for such acts and so we explain it to ourselves by presuming the killer must be insane. So, if someone commits mass murders, the armchair observer often “diagnoses” the killer with mental illness and/or psychosis. While it may make intuitive sense (e.g., “No one in their right mind would do that….”), it is often, nonetheless, inaccurate.
That is where the forensic examiner enters the scene to see if the level of thought disturbance meets the legal bar for murder driven by delusions. The field of forensic evaluation is very complicated, with specific rules about the height of the bar over which one must leap (in very technical terms) to be declared incompetent to stand trial, or to be found competent to stand trial but ultimately found not guilty by reason of insanity or guilty but mentally ill.
When a forensic evaluator adjudges a defendant not legally responsible for having performed an unthinkable act (such as killing one’s family, child, or a group of random strangers), there are generally delusional beliefs (e.g., “I thought my mother was the devil”) driving the behavior. And there are strict definitions for what constitutes a delusional belief (see the DSM-5 diagnostic manual’s criteria here). So today’s researchers use the example of that well-covered mass murder in Norway to explain the killings were not driven by delusional beliefs (the legal bar) but rather, by non-psychotic “extremely over-valued beliefs”.
They define that new term by quoting the work of another author (McHugh in 1998) and say that extreme over-valued beliefs are typically accompanied by fanaticism:
An extreme over-valued belief is one that is shared by others in a person’s cultural, religious, or subcultural group. The belief is often relished, amplified, and defended by the possessor of the belief and should be differentiated from a delusion or obsession. The idea fulminates in the mind of the individual, growing more dominant over time, more refined, and more resistant to challenge. The individual has an intense emotional commitment to the belief and may carry out violent behavior in its service. It is usually associated with an abnormal personality.
So one with an extreme over-valued belief may still commit very violent acts “in service” of that belief but they would not meet criteria for psychosis and would clearly understand what they were doing was wrong. From a legal perspective, they would be potentially guilty and subject to punishment. The authors say that “the court ultimately had to draw a line” in the Norway case and concluded that the shooter’s beliefs were “neither bizarre nor delusional” and noted “the evaluators who opined that he was not criminally responsible should have consulted experts on right-wing ideologies before concluding that his grandeur was culturally implausible”.
In short, having extremely weird or bizarre beliefs is not the same as being mentally incompetent. This is a distinction worth keeping in mind during election years…
The authors (three prominent psychiatrists) say that “extremely over-valued beliefs” are going to be rigidly held (like delusions) but will be non-delusional. They close with two uncommonly clear sentences summarizing why they see this contribution as important.
The fact that a defendant committed a crime because of a delusional belief is a common basis for an insanity defense. It is therefore critically important that forensic psychiatrists properly identify a defendant’s belief as either a delusion or as an extreme over-valued belief.
From a litigation advocacy perspective, the takeaway for the prosecutor is that a person whose behavior is driven by delusions may be successfully treated with medication, but there is no medication that will help with intractable extreme over-valued beliefs. A defendant holding such beliefs is thus a potential ongoing danger to society, since the beliefs cannot be treated away the way psychotic delusions often can.
While the distinction is a good one of which to be aware—the reality is that juries may well see the organized, plotting, and planning (probably psychopathic) predator with the “extremely over-valued beliefs” as potentially more dangerous than the mentally ill individual whose delusions will stop driving behavior when properly medicated. It makes sense for forensic examiners to be capable of differentiating between delusions and over-valued beliefs but for the layperson juror—these are just “two very scary” defendants and it’s likely they will want them both locked up.
Rahman T, Resnick PJ, & Harry B (2016). Anders Breivik: Extreme Beliefs Mistaken for Psychosis. The Journal of the American Academy of Psychiatry and the Law, 44 (1), 28-35 PMID: 26944741