Archive for the ‘Generation or Age of Juror’ Category
FALSE! Alas, even though Microsoft has popularized this notion of a shrinking attention span—it is simply not true. Or at least, there is no proof it is true. And the study the falsehood was based on was not even looking at attention span—it was looking at multi-tasking while browsing the web. To add insult to injury for the authors (who actually are academics), they do not even use the word goldfish in their article. Academics who’ve been misquoted or misinterpreted by the media are shaking their heads around the globe. This distorting of research by the popular press for the sake of sensational stories isn’t new, but for those who do the work, it is pretty disturbing. Reporters often do little fact-checking with the geeks who make the world go ‘round, because it’s hard, and it often takes the edge off a catchy story. Once the first misinterpretation is published, the skewed reports drift farther and farther from the research they purportedly rely on. Alas…
Okay. So what happened here? Microsoft apparently commissioned a 2015 non-peer-reviewed study to examine how internet browsing had changed over time—that is, how long do surfers look at a page prior to moving on? Then it was misinterpreted (really misinterpreted) with spurious comparison information added about how adult attention spans were shrinking—an assertion unsupported and unaddressed even by the Microsoft study. This misinformation was picked up by the New York Times and Time Magazine as well as numerous other mainstream media sites. Each site represented the data as a scientific truth stemming from a paper commissioned by Microsoft. The only problem was, it wasn’t true.
The table following is another example of how the work was misinterpreted—it misrepresents the human (and goldfish) attention span as the real focus of the paper, which could hardly be further from the truth. The bottom half of the table below (Internet Browsing Statistics) is actually taken from the article Microsoft commissioned to look at how browsing patterns on the internet have changed over time. The top half, however (Attention Span Statistics), is not—it is totally unrelated to the study Microsoft commissioned. And none of it has been validated or otherwise shown to mean anything at all.
(If you have trouble reading this table, here is the original source.)
You can find the text of the complete article commissioned by Microsoft here. Open it as a pdf file and search it for “goldfish”. You won’t find it. Nada. The study was not designed to look at the human attention span, nor was it designed to compare human attention spans to that of a goldfish. It was designed to look at how advances in web technology had changed how we surf the web—because Microsoft wants to figure out how to make the most of web surfing.
We are fortunate to have fact-checkers on the web—particularly when it comes to topics like data visualization. PolicyViz does a thorough job of debunking this myth, as does a writer posting on LinkedIn. They both want everyone to STOP comparing people to goldfish! We would concur. We would also love to see people using their common sense and questioning sensational claims—“the average attention span of a goldfish”? Really? Or, what is the significance of any of those memory lapse statistics? Has that always been the case? Has it changed? Why should we care?
From a litigation advocacy perspective, there are two key lessons here: First, pay no attention to comparisons of your jurors to goldfish. Instead, use strategies like chunking your information into 10-minute segments—that factoid is actually supported by research on learning and not just drummed up by a marketing representative. If jurors do not pay attention, it likely isn’t their declining attention spans, but rather that your presentation did not speak to their values, attitudes and beliefs. Test your presentations pretrial and make sure real people pay attention and understand.
And second, be very aware of how easily people are seduced by juicy factoids based on unproven or false data, just because they are amusing or seem to support some preexisting but uninformed suspicion. Cleverness often sells.
Pew Research often comes up with data-based pictures that tell us things we may have already known, but in a strikingly visual way. Most of us have heard that minorities in the US will outnumber whites before long, but this is a very clear depiction of how that is happening.
When you look at the ages of everyone in the US in 2015, there are more 24-year-olds than any other age. But—if you only look at white Americans, 55 was the most common age, according to Pew’s review of US Census Bureau data. The graphic (one of several in their report) shows a comparison of white people and minority group members by age (in the US in 2015).
You can see just by looking at this graphic that the Millennial generation is aging and being replaced by the Post-Millennial generation (which has yet to receive a moniker, although Neil Howe is trying to popularize Homeland Generation as a label for this upcoming group). In fact, Pew says that those under age 5 are already a “majority minority”, although only by a small margin.
There are multiple facts worth noting in this brief report. Here are a few of them:
In 2015, more than half (56%) of minorities were Millennials or younger.
Americans identifying with two or more races were the youngest group (with a median age of 19 years) in the Census Bureau data. And, almost half (46%) of multiracial Americans were between the ages of 0 and 17 years (meaning they were not yet part of any named American generation).
In 2015, the relative youth of Hispanics was driven by the US-born Latino population—nearly three-quarters of whom were Millennials or younger.
Asians were the fastest-growing of all ethnic or racial groups in the US in 2015. The majority of Asians were Millennials (27%) or Gen Xers (25%)—older, then, than other minorities, but younger than whites.
About half of blacks were Millennials (26%) or younger (25%) in 2015.
Take the time to read this report from Pew—your potential jurors are diversifying.
Pew Research Center, July 7, 2016. Biggest share of whites in U.S. are Boomers, but for minority groups it’s Millennials or younger. http://www.pewresearch.org/fact-tank/2016/07/07/biggest-share-of-whites-in-u-s-are-boomers-but-for-minority-groups-its-millennials-or-younger/
Sometimes these tidbit posts come around more often than usual—typically when we’ve read a lot that just isn’t suited for an entire blog post but that made us laugh out loud or piqued our curiosity. Here for you are the last few things that made us look again or laugh uncomfortably.
Millennials are doing job search duties for their parents too
We hear so much bad press about the Millennials, but here’s a really sweet article that shows how Millennials are helping out their parents too. The Atlantic has an article on what they describe as “employed and financially independent Millennials who are instead helping their parents find a job”. They are not only teaching their parents the basics of finding a job in 2016 but also using their social media and networking skills to find out who might be hiring in their parents’ professional areas of interest. It’s an uplifting and positive take on a generation currently maligned as freeloaders—plus there are some good resources embedded in the article as Millennials talk about what they have done to help Mom or Dad.
Political extremists are less susceptible to the anchoring bias
So here’s a point for the political extremist. If you are a regular reader here you know we tend to de-select the political extremist as just too unpredictable to serve as a juror on most cases. Often, the extremist is characterized as unthinking and knee-jerk in decision-making with stereotypes and biases guiding their thinking. However, new research tells us that political extremists sometimes think carefully about their decisions and are quite confident in their judgments. Here’s the abstract for the article, a blog post by the first author, and a Huffington Post writeup. The complete reference is at the end of this post. It’s an interesting article but we still won’t be choosing them to sit on the vast majority of juries.
Those Joe Jamail deposition tapes are so 1990
It’s been years since we first saw the “Texas style” depositions by Joe Jamail on YouTube. If you have somehow missed watching this epic video, you owe it to yourself to give it a look. You’ll realize just how long it’s been when you see these courtroom transcripts posted by Keith Lee over on the Associate’s Mind blog. It’s enough to make one wonder how court reporters maintain their decorum, and it certainly says something about how times change. Both of the authors of this fine blog have testified many times as expert witnesses, and one memory stands out prominently: two lawyers nearly began brawling in the middle of a deposition. It was a good time to have a psychologist and dispute resolution specialist in the room!
What’s the best way to deliver bad news?
When companies downsize (or “right-size”), there are always myriad recommendations on the best way to deliver bad news to those who lose their jobs due to layoffs. Now, new research tells us it isn’t so much what is said when you notify an employee about layoffs as how it is said. Researchers publishing in the Journal of Applied Psychology tell us that when employees are given the information in a way that seems fair to them, their reactions are much less negative. According to the researchers, “fairness” includes process transparency and treating employees with respect. You can read a summary of this article over at Science Daily, and we found a full-text source here.
Brandt MJ, Evans AM, & Crawford JT (2015). The unthinking or confident extremist? Political extremists are more likely than moderates to reject experimenter-generated anchors. Psychological Science, 26 (2), 189-202 PMID: 25512050
If you are seeking empathy and understanding from jurors hearing your case—go for middle-aged adults—and, in particular, middle-aged women. If you are thinking the sample size of this study cannot possibly be large enough to draw that sort of conclusion—think again! This is a study based on 75,263 adults in the US.
In the study, late middle-aged adults said they were more likely to react emotionally to the experiences of others and that they were also more likely to try to understand how things looked from the perspective of others. Both men and women “of a certain age” were more likely to report higher empathy but women were especially likely to do so. (And in case, like us, you are finding it more difficult to ascertain just when “late middle age” might be—the researchers define this as someplace between 50 and 60 years of age.)
Basically, the researchers examined responses from the General Social Survey, which measured empathy in both 2002 and 2004—these were by far the two smallest samples (1,353 adults in 2002 and 1,330 in 2004). Additionally, the authors conducted an online survey of 72,580 US adults between 18 and 90 years of age wherein they measured both empathy and perspective taking. (Note: While the GSS surveys are random and nationally representative, the researchers’ large online sample is not.)
Here is what they found on empathy:
Women reported higher empathy than men in all three samples.
In 2002, the GSS sample showed no significant differences in empathy based on ethnicity. In 2004, African-Americans had lower empathic concern than European-Americans. And in the online survey—African-Americans, Asian Americans and “especially Hispanic Americans” reported higher empathic concern than European Americans. (The authors make a point of stressing that the effects were fairly small.)
The effects of age were consistent across all three samples. Empathic concern was higher in older than in younger adults. The most common interpretation of this is that younger jurors haven’t experienced enough pain and suffering to appreciate its debilitating effects.
And here is what they found on perspective taking (which is akin to empathy and basically assesses how likely you are to attempt to put yourself in the “shoes” of another). Note: perspective taking was only assessed in the online sample and not in the GSS samples.
Women had higher self-reported levels of perspective taking than did men.
European Americans had lower perspective-taking than those of other ethnic origins (this effect was small).
And older adults had higher perspective taking than younger adults.
The researchers explain their results in clear and easy-to-understand language. “Specifically, empathy was expected to show an inverse-U-shaped function across the adult life span, with middle-aged adults scoring higher than young adults and older adults. Indeed, we found empirical evidence for this pattern in the case of both empathic concern and perspective taking in all three samples.”
For the non-statisticians among you, what that means is that both younger and older adults are less empathic and less likely to take the perspective of others than are middle-aged adults.
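For readers who prefer to see the shape rather than the words, the inverse-U pattern can be sketched with a toy quadratic. The coefficients below are invented purely for illustration—they are not the study’s estimates—but any curve of this form with a negative squared term peaks in the middle, here deliberately placed in the researchers’ 50-60 “late middle age” range:

```python
# A toy inverse-U (quadratic) curve: score = a + b*age + c*age**2.
# With c < 0 the parabola opens downward and peaks at age = -b / (2*c).
# These coefficients are hypothetical, chosen so the peak lands at age 55.
def empathy_score(age, a=1.0, b=0.044, c=-0.0004):
    """Return an illustrative (made-up) empathy score for a given age."""
    return a + b * age + c * age ** 2

peak_age = -0.044 / (2 * -0.0004)  # vertex of the parabola: age 55

# Middle-aged adults score higher than both younger and older adults:
# empathy_score(25) < empathy_score(55) and empathy_score(75) < empathy_score(55)
```

The single sign of the squared term is what makes the difference: a negative coefficient produces the rise-then-fall pattern the researchers report, whereas a simple linear age effect would predict that empathy keeps climbing into old age—which is not what they found.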
The researchers don’t know whether this is a true age effect or the result of generational experiences, since this age range reflects younger Baby Boomers who grew up during sweeping societal changes that emphasized the feelings and perspectives of others.
From a litigation advocacy perspective, this is an intriguing study. We know that women report higher levels of empathy than do men, that the same pattern holds true for self-reports of perspective taking, and that empathy seems to peak between ages 50 and 60—so, all else being equal, you would likely be better off choosing the woman between 50 and 60 for your jury.
As an aside, we always caution against blanket assumptions that “women are better for Plaintiffs and men are better for Defendants”. It simply is untrue. But this finding, when coupled with other information from careful pretrial research, can be instructive in voir dire and jury selection.
O’Brien E, Konrath SH, Grühn D, & Hagen AL (2013). Empathic concern and perspective taking: linear and quadratic effects of age across the adult life span. The journals of gerontology. Series B, Psychological sciences and social sciences, 68 (2), 168-75 PMID: 22865821
You know what ‘creepy’ is—in the movie The Silence of the Lambs, Anthony Hopkins personified creepiness. While it may be hard to believe, no one has ever “pinned down” what makes a person creepy. Since there must be a need for such information, enter academic Francis McAndrew of Knox College (in Galesburg, Illinois) with an impressive effort.
First he educates us on what creepiness is—as though we needed him to do that. We all know what constitutes “creepiness” and what results in us being “creeped out” but he does a pretty good job of defining it.
“Creepiness is anxiety aroused by the ambiguity of whether there is something to fear or not and/or by the ambiguity of the precise nature of the threat (e.g., sexual, physical violence, contamination, et cetera) that might be present. Such uncertainty results in a paralysis as to how one should respond.”
So, in order to begin what will likely be a long academic exploration of creepiness (he already has tenure!), he constructed a measure of just what “normal people” think is creepy. Via an internet survey, he asked 1,341 people (1,029 females and 312 males, ranging in age from 18 to 77 with an average age of 28.97) to answer some questions about a hypothetical “creepy person” that a friend had encountered. He asked them to rate the person’s physical appearance, behavior and intentions on a scale from 1 (normal) to 5 (creepy). He later asked them to rate occupations and hobbies on a “creepiness scale”.
And here is some of what he found:
Participants were asked if “creepy individuals” were more often male or female. Both male and female participants thought men were more likely to be creepy.
Females were more likely to perceive a sexual threat or sexual interest from a creepy person than were males.
The creepiest occupations were: clown, taxidermist, sex shop owner, and funeral director. (Public service announcement: The full list of occupations deemed “creepy” was in the article and we carefully reviewed it. Neither attorneys nor psychologists were on the creepiness scale, although college professors were. Be careful out there.)
The creepiest hobbies were: collecting things (like dolls, insects, reptiles, or body parts such as teeth, bones or fingernails); variations on ‘watching’ others, including bird watching (who knows what they are really doing?); taxidermy; and a fascination with pornography or exotic sexual activities.
Older participants were less alarmed by creepy people, less likely to perceive a physical or sexual threat from a creeper, and less anxious about interacting with a creepy person.
Finally, survey participants were convinced that creepy people do not know they are creepy.
Essentially, what this research says is that it is the uncertainty or ambiguity surrounding the creepy person that leads us to think they are a potential threat. It’s good for us to recognize potential threats in our environment—although that birdwatcher wariness is a little odd, unless the concern is that they are really Peeping Toms and the birding interest is a transparent ruse. And it appears that is precisely what our alarm over encountering someone creepy serves to do—detect potential threats.
From a litigation advocacy perspective, this falls into the category of “be aware of the impression that witnesses create in jurors”. If you are prepping a witness and it occurs to you that “this person takes a while to warm up to”, consider what impression they created in you before the warmth took over.
If you conclude that you felt wary of them until they described X or Y, or told you a story about their family or background that you found reassuring—you might have a problem witness. Testing witnesses for credibility and likability is very worthwhile, and it can give you some ideas about how to reduce their potential for “creepiness”.
As an extra piece of information for you, here’s a video that is awkward but not really creepy (at least by the researcher’s definition).
McAndrew, F., & Koehnke, S. (2016). On the nature of creepiness. New Ideas in Psychology, 43, 10-15. DOI: 10.1016/j.newideapsych.2016.03.003