Archive for the ‘Internet & jurors’ Category
When facing a panel of prospective jurors for voir dire and jury selection, it is important to update your perceptions of who these people are in 2017. It is hard to keep up with change and to replace our outdated ideas of “how North America is”, but here is some data to help you do just that. These facts are wonderful perspective changers, and we hope some of them will surprise you (since that will help you remember and update your perceptions of those potential jurors).
“Normal America is not a small town of white people”
The people over at Nate Silver’s fivethirtyeight.com site did us an incredible service with this article, first published in spring 2016. So, before you go look: when you think of “normal America”, what picture comes to mind? For those of you who picture a scene more consistent with 1950s America, this is a must-read. Times, our citizens, and what is now “normative” have changed a lot since the 1950s. The article compares the communities most like 1950s America with the communities most like the America of the present, and the two sets of communities are incredibly different. It is a reminder of why it is so very important to know the demographics of your venire, and an imperative to update that mental picture you have of “normal America”. We are so not in Kansas anymore, Toto.
Digital news and followup by race of online news consumer
So…when you think of who reads news online and who follows up on that news, would you guess those who follow up are more likely to be Black or White? You don’t have to answer out loud; just think to yourself and read on. Pew Research just published an article based on questioning more than 2,000 online news consumers twice a day for a week.
As part of that questioning, Pew asked the news consumers if they took any of six pre-identified follow-up actions: speaking with someone either in person or over the phone; searching for additional information; posting, sharing or commenting on a social networking site; sending an article to someone by email or text message; bookmarking or saving the news for later; and commenting on a news organization’s webpage.
As a reminder, you are predicting whether Black or White online news consumers are more likely to do any of these six follow-up actions. Got your prediction? Here’s what Pew found:
Black online news consumers performed at least one of these actions 66% of the time on average. For Whites, it was 49%.
There are other fascinating differences by race in this recent report from Pew. You can read the entire (brief and succinct) summary here.
Who counts as Black anymore?
This is an opinion piece that mentions the Dark Girls and Light Girls documentaries and the difficulties both groups (Blacks with dark skins and Blacks with light skins) face in being Black in the current day. The author encourages us to stretch (and update) our perceptions of what constitutes race and Blackness. A worthwhile read from the website The Conversation.
How many US homes have televisions?
Here’s another shifting reality. In the not-too-distant past, most US homes had televisions, often multiple televisions. That is changing. Again, from Business Insider: the number of homes that do not include a TV set has “at least doubled since 2009”. While the percentage is still low (2.6% of American homes now do not include a TV), it is growing quickly and reflects people turning to computers and mobile devices to access media. The percentage of homes without televisions is expected to continue to increase as young people grow older and continue to use alternate screens for viewing programming.
Who reads newspapers anymore? Older or younger Americans?
Young Americans have been less likely to read newspapers than older Americans for some time. But, recently, Pew Research looked closely at newspapers with a more national focus (e.g., The New York Times, the Washington Post, the Wall Street Journal, and USA Today). While readership of election news was roughly equal across age groups for USA Today, the other three (NYT, WaPo, WSJ) attracted more readers under 50 than over 50 when it came to election news coverage. This is different from the pattern for local newspapers, which are read more by older readers. Pew concludes that digital outreach efforts are working for these national papers in attracting younger readership.
Just how common is crime by immigrants? (Not at all common.)
Despite ongoing political rhetoric about the victims of crimes committed by immigrants, such crime is simply not a significant problem. Business Insider summarizes the statistics this way:
According to a September 2016 study by Alex Nowrasteh at the Cato Institute, a libertarian think tank, some 3,024 Americans died from 1975 through 2015 due to foreign-born terrorism. That number includes the 9/11 terrorist attacks (2,983 people) and averages nearly 74 Americans per year.
Since 9/11, however, foreign-born terrorists have killed roughly one American per year. Just six Americans have died per year at the hands, guns, and bombs of Islamic terrorists (foreign and domestic).
According to Nowrasteh’s analysis, over the past 41 years (January 1975–December 2015), and including the 9/11 attacks:
The chance an American would be killed by a foreign-born refugee terrorist is 1 in 3.64 billion per year, based on the last 41 years of data.
The chance of an American being murdered by an undocumented immigrant terrorist is 1 in 10.9 billion per year.
The chance an American could be killed by a terrorist on a typical tourist visa is 1 in 3.9 million.
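The per-year average quoted above is easy to sanity-check yourself. This is a minimal sketch using only the two numbers given in the excerpt (total deaths and the 41-year window), not Cato’s full methodology:

```python
# Quick sanity check of the Cato figures quoted above: 3,024 deaths from
# foreign-born terrorism over the 41 years from January 1975 through
# December 2015.
total_deaths = 3024
years = 41

deaths_per_year = total_deaths / years
print(f"Average deaths per year: {deaths_per_year:.1f}")  # prints: Average deaths per year: 73.8
```

The result, roughly 73.8, matches the article’s “nearly 74 Americans per year”.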
This article contains tables of numbers that are easy to read and that point out the reality behind the rhetoric. The political rhetoric is about fear, not reality. Read beyond the rhetoric to get to the facts.
How America changed during Barack Obama’s Presidency
If you have looked at any of these changes with some level of surprise, it would also prove useful to look at another Pew Research report examining changes in America during the eight years of the Obama presidency. This report covers attitudes important in voir dire and jury selection as they reflect values and beliefs relevant to case decision-making. So many changes have taken place in the past eight years that it is staggering to see them all summarized in this report. There are sure to be some changes (and corresponding shifts in attitude) that will be related to your own upcoming cases.
Here’s another combination post offering multiple tidbits for you to stay up-to-date on new research and publications that have emerged on things you need to know. We tend to publish these when we’ve read a whole lot more than we can blog about and want to make sure you don’t miss the information.
Juror questions during trial and the prevalence of electronic and social media research
The National Center for State Courts just published a study, authored by a judge and appearing in the Pennsylvania Lawyer, on whether allowing jurors to ask questions during trial will help resolve issues of electronic and social media research during trial. The judge-author suggests that judicial directives not to conduct any form of research (the instructions usually itemize various forms of social media as examples of “what not to do”) do not stop the research from happening; they simply make the research surreptitious rather than public. Since the publication is in the Pennsylvania Lawyer, the focus is on Pennsylvania jury instructions, but the article also discusses how other venues have used (and controlled) juror questions during trial. The article offers suggestions developed in the subcommittee on civil jury instructions. It is well worth a read if you have questions about the practice of allowing juror questions.
We should question alibis and the weight we place on them during jury deliberations
Given all the concerns about the accuracy of eyewitness testimony, it only makes sense that we should also closely examine alibis and whether we simply accept them as true. A new article in Pacific Standard magazine says we need to pay attention to alibis, as new research tells us that the accuracy of alibis resembles the vagaries of faulty eyewitness testimony. According to the new research, we tend not to remember mundane events (like where we were on August 17, 2009). The authors of the study say that the wrong people can end up in jail due to alibi inconsistency and eyewitness misidentification.
The curious impact of donning a police uniform
New research published in Frontiers in Psychology tells us that putting on a police uniform automatically affects how we see others and creates a bias against those we consider to be of lower social status. Essentially, say the researchers, the uniform itself causes shifts in judgment (likely due to the authority it communicates) against those considered to be lower status (in this study, those wearing hoodies were identified as having lower social status). The researchers think it possible that police officers who put on their uniforms may perceive threat where none exists.
Identifying lies with fMRI machines
We’ve written frequently about identifying deception using fMRIs on this blog, and here’s a four-page “knowledge brief” from the MacArthur Foundation Research Network on Law and Neuroscience. You can also download the summary at SSRN. It is a terrific (and brief) summary of what fMRI machines can and cannot tell us about deception. You could think of it as a primer on fMRIs and how they work (and don’t work), as well as a guide for taking the deposition testimony of an expert witness touting the deception-identifying abilities of the machine. This resource is well worth your time.
Civile, C., & Obhi, S. S. (2017). Students wearing police uniforms exhibit biased attention toward individuals wearing hoodies. Frontiers in Psychology (February 6, 2017).
We read so much for this blog (and just out of general curiosity) that we often find these small bits of information which don’t justify an entire blog post but that we want to share with you because they are just too good to ignore. Here’s another one of those combination posts that you simply must read!
Generational labels are so passé
We are so used to hearing generational labels (like Boomer, Gen Xer, Millennial) tossed about in marketing presentations and in casual conversation, but Harvard Business Review thinks these labels are obsolete. The labels add no real information and are increasingly used as a mere substitute for age ranges, says HBR. Further (they opine), the cut-off dates for generations are entirely arbitrary, and, frankly, there is a fair bit of variability in which birth-year ranges are thought to apply to each label. They suggest that, rather than generational name labels, we use age or even age ranges to describe groups of people.
They call this “old way” of using generational monikers “generational segmentation” and say it is an artifact of a time when marketers could not easily do “individual level targeting”. It’s an interesting perspective that rings pretty true to our minds, especially considering this recent post (and we’ve done a lot of generationally themed writing). The most distorting aspect of generational labels is that they are frozen in time: the members of each age cohort are viewed as alike in key ways, as if those characteristics don’t evolve as a person grows older. A 30-year-old in 2017 is a Millennial, but that same Millennial will be 50 in 20 years, clearly at a very different stage of life, and still a Millennial. Which tells you more, the label or the life stage? It is much more complicated than a label can capture. Just use actual age ranges, like the cool kids at HBR.
Emojis and the pursuit of academic tenure
If you had considered this (although, in truth, who would?), you would have realized that the ever-more-popular emoji would eventually be studied by academics in pursuit of tenure. And of course, that which was coming has now arrived. The researchers say that emojis (the modern version) and emoticons (the originals, drawn with punctuation symbols) developed to communicate the facial expression that goes with a string of text. The first reported use occurred in discussion forums in the 1980s (say the researchers), when this emoticon symbol 🙂 was included to communicate that the message was meant in fun. Now, up to 92% of the online population uses emojis (the more modern version uses cartoonish images like this one 🤗). The researchers use easy-to-understand language (not) as they explain the meaning behind emojis:
“They disambiguate the communicative intent behind messages, serve important verbal and nonverbal functions in communication, and can even provide insight into the user’s personality.”
“Drawing on the method of corpus linguistics, the bountiful occurrence of emojis in real-world online text provides a new means to examine the function of contemporary interactional communication and emotion portrayal.”
We don’t think we’ll be covering much of this work as it evolves but wanted you to be aware it is out there. Frankly, we think it is—how should we say? 💩
Persuasion landmines: When facts fail and your most salient points are the least informative
After more than 25 years, we still love doing pretrial research but it is still very common to see attorneys chewing peanut M&Ms in frustration while their important facts are dismissed (or ignored) by mock jurors. Here are two articles (both happened to be published in Scientific American just this month) to help you increase the likelihood your story will be heard and remembered accurately. The first article focuses on the reality that pre-existing beliefs will trump your facts when jurors listen to your narrative. The author summarizes the (frightening) research and then offers suggestions (six in all—most of which we’ve blogged about here before!) to try to convince your listeners to consider your information. It is well worth the 5-7 minutes it will take you to read.
The second article uses the example of noting that a person has purple hair in order to remember Amanda’s name (which means that if Amanda changes her hair color, you will be stumped). The point of the article is to help us learn to categorize important information accurately and not be sidetracked by red herrings like purple hair. The author talks to a researcher who says that if you want to overpower attention-getting facts (like purple hair), your counter-evidence needs to be eye-catching and quickly understandable. Let’s hear it again for the power of visual evidence!
Kaye, L. K., Malone, S. A., & Wall, H. J. (2016). Emojis: Insights, affordances, and possibilities for psychological science. Trends in Cognitive Sciences. PMID: 28108281.
You have probably been fooled a few times as well. Facebook friends post their scores on various silly quizzes and sometimes you go take that test as well. It’s just silly fun and means nothing, right?
Wrong. Apparently, Cambridge Analytica has been using Facebook quizzes to create “a tool to build psychological profiles that represent some 230 million Americans”. They sell this data for a price, but only to Republican candidates, and our new President-elect benefited from their insights in winning the 2016 electoral college vote.
So how are they doing it? We don’t know for sure, but it appears they are combining publicly available information about you with your Facebook ‘likes’ and your responses to Facebook’s innocent little quizzes to predict how you will respond to various political messages. They have used a variation on the Big Five trait model (summarized by the acronym OCEAN), long used in psychology research, to figure out who you are and what seems to motivate you. They claim to have 3,000 to 5,000 data points on each individual they profile.
Here is how Cambridge Analytica describes what they do:
“We use the established scientific OCEAN scale of personality traits to understand what people care about, why they behave the way they do, and what really drives their decision-making.”
Their website offers to let you take the OCEAN test (which measures your Openness, Conscientiousness, Extraversion, Agreeableness, and Neuroticism) to see how you score. We would really not recommend that you do this…unless you want to hand over what tattered scraps remain of your privacy.
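For readers curious what trait scoring looks like mechanically, here is a purely hypothetical toy sketch: it simply averages invented Likert-style quiz answers (1–5) into the five OCEAN traits. Every item, grouping, and number here is made up for illustration and bears no relation to Cambridge Analytica’s actual (and far more elaborate) model.

```python
# Toy OCEAN scoring: average made-up quiz answers (1-5 Likert scale) per trait.
# All items and values here are invented for illustration only.
from statistics import mean

quiz_answers = {
    "openness":          [4, 5, 3],  # e.g., "I enjoy trying new things"
    "conscientiousness": [2, 3, 2],
    "extraversion":      [5, 4, 5],
    "agreeableness":     [3, 3, 4],
    "neuroticism":       [1, 2, 2],
}

# One averaged score per trait, rounded for display
profile = {trait: round(mean(scores), 2) for trait, scores in quiz_answers.items()}
for trait, score in profile.items():
    print(f"{trait}: {score}")
```

A real profiler would, of course, weight thousands of behavioral signals rather than a handful of self-reported answers; the point is only that a “profile” is, at bottom, a small set of numbers inferred from your responses.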
And here is how the New York Times describes Cambridge Analytica:
“A spinoff of a British consulting company and sometime-defense contractor known for its counterterrorism “psy ops” work in Afghanistan, the firm does so by seeding the social network with personality quizzes. Respondents — by now hundreds of thousands of us, mostly female and mostly young but enough male and older for the firm to make inferences about others with similar behaviors and demographics — get a free look at their Ocean scores. Cambridge Analytica also gets a look at their scores and, thanks to Facebook, gains access to their profiles and real names.” [snip…]
“In the age of Facebook, it has become far easier for campaigners or marketers to combine our online personas with our offline selves, a process that was once controversial but is now so commonplace that there’s a term for it, “onboarding.” Cambridge Analytica says it has as many as 3,000 to 5,000 data points on each of us, be it voting histories or full-spectrum demographics — age, income, debt, hobbies, criminal histories, purchase histories, religious leanings, health concerns, gun ownership, car ownership, homeownership — from consumer-data giants.”
You may be interested to know that Cambridge Analytica worked for the “Leave” side in the UK’s Brexit campaign. The NYT article is frightening in the detail it offers on how individual Facebook users were targeted with different messages based on what would be most persuasive given their psychological profiles. The newspaper story refers to this process as “weaponizing Facebook”, and that does not seem an exaggeration. Even more disturbing is the intimation that they don’t even need all those Facebook quizzes to know enough about you to build a psychological profile. Freedom of information laws in the US give them lots and lots of private information about you.
So, the next time your brother’s spouse’s sister-in-law posts the results of her quiz on introversion versus extraversion, think about how that information is going to be used to manipulate your decisions, even your private voting decisions, and maybe take a little swing at the data brokers: just say no.
FALSE! Alas, even though Microsoft has popularized this notion of a shrinking attention span, it is simply not true. Or at least, there is no proof it is true. The study the falsehood was based on was not even looking at attention span; it was looking at multi-tasking while browsing the web. To add insult to injury for the authors (who actually are academics), they do not even use the word goldfish in their article. Academics who’ve been misquoted or misinterpreted by the media are shaking their heads around the globe. This distorting of research by the popular press for the sake of sensational stories isn’t new, but for those who do the work, it is pretty disturbing. Reporters often do little fact-checking with the geeks who make the world go ’round, because it’s hard, and it often takes the edge off a catchy story. Once the first misinterpretation is published, the skewed reports drift farther and farther from the research they purportedly rely on. Alas…
Okay. So what happened here? Microsoft apparently commissioned a 2015 non-peer-reviewed study to examine how internet browsing had changed over time; that is, how long do surfers look at a page before moving on? The study was then misinterpreted (really misinterpreted), with spurious comparison information added about how adult attention spans were shrinking, an assertion unsupported and unaddressed even by the Microsoft study. This misinformation was picked up by the New York Times and Time Magazine, as well as numerous other mainstream media sites. Each site presented the data as scientific truth stemming from a paper commissioned by Microsoft. The only problem was, it wasn’t true.
The table that follows is another example of how the work was misinterpreted: it misrepresents the human (and goldfish) attention span as the real focus of the paper, which could hardly be further from the truth. The bottom half of the table (Internet Browsing Statistics) is actually taken from the article Microsoft commissioned to look at how browsing patterns on the internet have changed over time. The top half (Attention Span Statistics), however, is not, and is totally unrelated to the study Microsoft commissioned. And none of it has been validated or otherwise shown to mean anything at all.
(If you have trouble reading this table, here is the original source.)
You can find the text of the complete article commissioned by Microsoft here. Open it as a PDF file and search for “goldfish”. You won’t find it. Nada. The study was not designed to look at the human attention span, nor was it designed to compare human attention spans to that of a goldfish. It was designed to look at how advances in web technology have changed how we surf the web, because Microsoft wants to figure out how to make the most of web surfing.
We are fortunate to have fact-checkers on the web, particularly when it comes to topics like data visualization. PolicyViz does a thorough job of debunking this myth, as does a writer posting on LinkedIn. They both want everyone to STOP comparing people to goldfish! We concur. We would also love to see people using their common sense and questioning sensational claims. “The average attention span of a goldfish”? Really? What is the significance of any of those memory-lapse statistics? Has that always been the case? Is it different now? Why should we care?
From a litigation advocacy perspective, there are two key lessons here. First, pay no attention to comparisons of your jurors to goldfish. Instead, do things like chunking your information into 10-minute segments; that practice is actually supported by research on learning and not just drummed up by a marketing representative. If jurors do not pay attention, it likely isn’t their declining attention spans, but rather that your presentation did not speak to their values, attitudes, and beliefs. Test your presentations pretrial and make sure real people pay attention and understand.
And second, be very aware of how easily people are seduced by juicy factoids based on data that is unproven or false, just because they are amusing or seem to support some preexisting but uninformed suspicion. Cleverness often sells.