Archive for the ‘Case Presentation’ Category
We like Pew Research here and wanted to bring you two new articles they’ve recently posted that may have relevance for knowing your jurors. It’s been a while since we’ve heard the term “boomerang generation” in regard to Millennials and maybe it’s because they are not planning to go anywhere anytime soon. Yet, if you look at the definition of “boomerang generation” now, it isn’t about moving out and moving back and moving out and moving back again, it’s about staying in place. And Pew has a new article addressing the issue.
Multigenerational households: 2016
According to Pew Research, we now have a "record 60.6 million Americans living in multigenerational households." That translates to 1 out of every 5 Americans living in a multigenerational household (defined as a home with two or more adult generations, or one that includes grandparents and grandchildren). Further, the trend is growing among nearly all racial and ethnic groups (whites are less likely to live multigenerationally), including Hispanics in the US, among all age groups, and across genders.
While older adults used to be the ones most commonly living in multigenerational households, now it is young people for whom this living arrangement is most common. It is becoming more common not just for two adult generations to live together but even for households to span three generations. Pew attributes this to the growing number of immigrant families in the country and the stronger tradition of shared households in many immigrant cultures. It is interesting to examine the graph (taken from the Pew site). The number has increased, but not sharply. It is a gentle upward trend reflecting the changing demographics of America. As the nation changes, so do our housing norms.
Religious affiliations of “none”: 2016
Between 2007 (16% of those surveyed) and 2014 (23% of those surveyed), Pew Research says the religiously unaffiliated (aka the "nones") grew rapidly from 35.6 million to 55.8 million Americans saying they had no religious affiliation. Recently, Pew interviewed religious "nones" to see why they had left the church. Their reasons vary widely and, as Pew says, the "nones" are far from monolithic. Here is the largest reason those who were raised in the church say they ended up leaving as adults:
About half of current religious “nones” who were raised in a religion (49%) indicate that a lack of belief led them to move away from religion. This includes many respondents who mention “science” as the reason they do not believe in religious teachings, including one who said “I’m a scientist now, and I don’t believe in miracles.” Others reference “common sense,” “logic” or a “lack of evidence” – or simply say they do not believe in God.
The others may have objections to organized religion, be religiously unsure, or simply be inactive due to other obligations. Pew describes the "nones" as composed of three groups:
They can be broken down into three broad subgroups: self-identified atheists, those who call themselves agnostic and people who describe their religion as “nothing in particular.”
From a litigation advocacy perspective, these findings are important. We need to realize both living arrangements and religious affiliations are changing. Some of this reflects the changing racial and ethnic makeup of the country and some of it reflects changing values and beliefs in our society. Sometimes these changes catch us off guard and other times we just think what we knew “back then” still applies today. Pay attention. Don’t be surprised when your assumptions (based on outdated information) are just wrong.
We’ve written about CRISPR (aka gene editing) before and even about concerns of Americans about use of emerging technologies, and while this post is sort of about CRISPR—it is also about visual evidence done right.
We often work on cases where jurors will need to understand very complex information. It may be a patent case, a complex business litigation case, or something else that is technically daunting—and often that something is very technologically advanced (and thus intimidating to jurors).
It is almost always a very difficult process for the attorneys in a complex case (in which they have often been buried for years) to see through the many details of a complicated technology and tell a simple (yet accurate) story for jurors. We often test visual evidence in our pretrial research to see what resonates with jurors, what they remember, and what helps them to make sense of abstract and esoteric technology, processes, or patented ideas.
When we see terrific examples of visual evidence (culled from many different areas), we like to share them here—to show that there really is a way to take very, very complex facts and details and make them accessible to those who have no experience whatsoever in the area and may be intimidated by even attempting to understand the information.
Here is just such a video tutorial. This video uses cartoon images and plain language to explain the gene editing technique referred to as CRISPR. While the last parts of the video place it clearly in the pro-CRISPR camp, the first parts explain the technology clearly and succinctly. Because it is in a cartoon format (with which we are all familiar from childhood) it is non-threatening. Since it is visually presented, we are able to understand a tremendous amount of technical information without jargon or numbers that make less technical viewers’ eyes glaze over.
If CRISPR can be explained in a few minutes of cartoons, you can explain anything in ways the most naïve juror can understand. All you need is a fabulous visual evidence consultant. We happen to know a few of them!
New research tells us you may not want to have slow motion videos played at trial if you are the defense attorney. However, if you are the prosecutor—push hard for that video! It’s really a simple lesson: when jurors see slowed down footage of an event, they are more likely to think the person on the screen has acted deliberately (and that will likely mean a more severe sentence and/or verdict).
The researchers say that while the slowed down footage does give the observer a better ability to see what happened clearly, it also creates a "false impression that the actor had more time to premeditate" than when the events are viewed in real-time. Their experiments show (among other things) that jurors seeing a crime as calculated rather than impulsive can mean the difference between a "lethal injection and a lesser sentence." That's a profound impact from a video.
In a series of experiments, the researchers showed participants a video of an attempted armed robbery where the shop employee was shot and killed. When participants watched the video slowed down, they were 3x more likely to convict the defendant than those who watched the video in real-time.
The researchers reference the trial of John Lewis (still on death row for murdering a police officer in 2007) and the slowed down video used at his trial. Defense attorneys argued on appeal that the slowed down video evidence at his trial made jurors think his actions were premeditated, but prosecutors said since both regular speed and slowed down video were played—jurors were not confused. The judges sided with the prosecutors.
In this set of experiments, researchers tested the notion by showing participants both a real-time speed video and the slowed down version. What they found was that “slow motion replay, compared with regular speed replay, produces systematic differences in judgments of intent”. And, said the researchers, showing both regular speed and slow motion video was “somewhat, albeit not completely, effective in reducing the impact of slow motion on first-degree murder convictions”. In other words, showing both versions does not entirely mitigate the bias introduced by the slow-motion version.
We have all had the experience of watching an action movie in which a car crash or a fight is filmed in slow motion. We watch and think to ourselves "Oh no! Slow down!" before the crash. The slow motion allows us to imagine escape alternatives even while we see the events unfold before us. Showing a shooting in slow motion offers the same sense of "Don't do it!" and of alternatives to the bad outcome—alternatives that don't occur to us in real-time.
The researchers see this as a question of whether and under what conditions slow motion video should even be allowed in court, and think it imperative that the benefits be weighed against potential costs. Their results, they say, support the idea that slow-motion videos influence observers in a more punitive direction than real-time speed. Showing both speeds mitigates the effect to a degree, but not entirely (so the slow motion still has a chilling effect on juror emotions and decisions).
From a litigation advocacy perspective, this is a reference you may want to tuck away to support a motion to exclude slow motion video at trial when you are defense counsel. Slow motion video is most likely to be favored by a prosecutor or plaintiff. While prosecutors will certainly want to employ it (to increase the sense of premeditation and intentionality in jurors observing the video), this research was designed to test the reasoning behind the John Lewis appeal ruling and could be found persuasive by judges.
Caruso, E. M., Burns, Z. C., & Converse, B. A. (2016). Slow motion increases perceived intent. Proceedings of the National Academy of Sciences. DOI: 10.1073/pnas.1603865113
FALSE! Alas, even though Microsoft has popularized this notion of a shrinking attention span—it is simply not true. Or at least, there is no proof it is true. And the study the falsehood was based on was not even looking at attention span—it was looking at multi-tasking while browsing the web. To add insult to injury for the authors (who actually are academics), they do not even use the word goldfish in their article. Academics who've been misquoted or misinterpreted by the media are shaking their heads around the globe. This distorting of research by the popular press for the sake of sensational stories isn't new, but for those who do the work, it is pretty disturbing. Reporters often do little fact-checking with the geeks who make the world go 'round, because it's hard, and it often takes the edge off a catchy story. Once the first misinterpretation is published, the skewed reports drift farther and farther from the research they purportedly rely on. Alas…
Okay. So what happened here? Microsoft apparently commissioned a 2015 non-peer-reviewed study to examine how internet browsing had changed over time—that is, how long do surfers look at a page prior to moving on? Then it was misinterpreted (really misinterpreted) with spurious comparison information added about how adult attention spans were shrinking—an assertion unsupported and unaddressed even by the Microsoft study. This misinformation was picked up by the New York Times and Time Magazine as well as numerous other mainstream media sites. Each site represented the data as a scientific truth stemming from a paper commissioned by Microsoft. The only problem was, it wasn’t true.
The table following is another example of how the work was misinterpreted—it presents the human (and goldfish) attention span as the real focus of the paper, which could hardly be further from the truth. The bottom half of the table (Internet Browsing Statistics) is actually taken from the article Microsoft commissioned to look at how browsing patterns on the internet have changed over time. The top half, however (Attention Span Statistics), is not—it is totally unrelated to the study Microsoft commissioned. And none of it has been validated or otherwise proved to mean anything at all.
(If you have trouble reading this table, here is the original source.)
You can find the text of the complete article commissioned by Microsoft here. Open it as a pdf file and search it for "goldfish". You won't find it. Nada. The study was not designed to look at the human attention span, nor was it designed to compare human attention spans to that of a goldfish. It was designed to look at how advances in web technology have changed how we surf the web—because Microsoft wants to figure out how to make the most of web surfing.
We are fortunate to have fact-checkers on the web—particularly when it comes to topics like data visualization. PolicyViz does a thorough job of debunking this myth, as does a writer posting on LinkedIn. They both want everyone to STOP comparing people to goldfish! We would concur. We would also love to see people using their common sense and questioning sensational claims: "the average attention span of a goldfish"? Really? Or, what is the significance of any of those memory lapse statistics? Has that always been the case? Is it different now? Why should we care?
From a litigation advocacy perspective, there are two key lessons here. First, pay no attention to comparisons of your jurors to goldfish. Instead, use strategies like chunking your information into 10-minute segments—that factoid actually is supported by research on learning, not just drummed up by a marketing representative. If jurors do not pay attention, it likely isn't a declining attention span, but rather that your presentation did not speak to their values, attitudes and beliefs. Test your presentations pretrial and make sure real people pay attention and understand.
And second, be very aware of how easily people are seduced by juicy factoids based on data that is unproven or false, just because they are amusing or seem to support some preexisting but uninformed suspicion. Cleverness often sells.
We are big fans of how visual evidence can take very complicated ideas and make them easy to grasp by allowing those who are puzzled to “see” the complex big picture. Recently, we saw two really good examples of how to take complex issues and make them simple enough for the layperson to grasp. Both examples are from the political realm and both are based on easily fact-checked data. But the questions they answer come from a lot of data that would be very difficult to make sense of without these images.
1. Who chose these presidential candidates?
Here's one from the New York Times that was posted on an infographic site (FlowingData.com). As it happens, only 9% of Americans chose Hillary Clinton and Donald Trump as our two main political party candidates. To "see" which 9% gave us these two candidates, visit the New York Times site for an interactive version of the graph. This is a very cool and accessible way to help viewers grasp a complicated concept! As you read it, consider how it might apply to the ways your jurors need to grasp complex ideas.
2. Which politicians lie more?
PolitiFact is an independent fact-checking website. They pay attention to what politicians say and then rate each statement on a scale running from true, through varying degrees of half-truth and lying, all the way down to "pants on fire" (demonstrably false). They illustrate political statements with their trademarked Truth-O-Meter graphic.
Enter Robert Mann, whose work was featured over at DataViz (described as a site to get you excited about data—one graphic image at a time). Mann took a compilation of more than 50 statements made since 2007 by well-known politicians and (using the PolitiFact ratings) put all of the statements into a comprehensive chart to show us who lies more.
When some commenters questioned the accuracy of his graph and wondered just which PolitiFact analyses were used, a colleague of his (Jim Taylor) stepped up to say why Mann was accurate and to explain how to fact-check the fact-checkers—which, thanks to the internet, is a pretty straightforward thing to do these days. Additionally, Mann himself has published an explanation of the data behind the graph, specifically addressing how he chose whom to include and whether he "cherry-picked" statements (that would be a no).
From a litigation advocacy perspective, if you can get graphics that tell as much of a story as these visuals do, you go a long way toward juror persuasion. Our next post will be an example of bad data visualization, because while terrific examples like the ones in this post are rare, there are plenty of losers out there.