Archive for the ‘Communication’ Category
We’ve written about American attitudes toward interracial marriage a fair amount here and (at least once) questioned poll results suggesting dramatic improvement in those attitudes (an 87% approval rating?!). While interracial relationships may be more acceptable to many more Americans, there is also the recent report of an attack on an interracial couple in Washington State. According to additional reports, the self-proclaimed white supremacist who stabbed the interracial couple without provocation said that if he were released by the police he would attend the Trump rally and “stomp out more of the Black Lives Matter group”.
Recently, we found an article that reflects some of what we think about the state of race relations and attitudes toward interracial marriages. And, as if in response to the event linked to above (which had not yet happened at the time the article was published), here is how the authors close their paper (after reporting that interracial couples were dehumanized relative to same race couples):
“These findings are meaningful given the negative consequences associated with dehumanization, most notably, antisocial behaviors such as aggression and perpetration of violence”.
The researchers are skeptical of the increased approval poll numbers when it comes to comfort with interracial marriage. They also believe that if the polls used subtler measures of racial attitudes (rather than asking respondents explicitly how much they approved of interracial marriage), the results would reflect significantly lower levels of approval for interracial marriage.
They refer to, as an example of attitudes toward interracial marriage, a 2013 Washington Post column by Richard Cohen saying that the interracial family of New York mayor Bill de Blasio must result in a “gag reflex” among conservatives.
The researchers conducted three separate studies (all with undergraduate student participants). We mention the participant pool for two reasons—one, because undergraduate students are perhaps a bit different from jury-eligible citizens, and two, because the Millennial generation is seen as most accepting of interracial marriages (according to Pew Research, Fusion’s Massive Millennial Poll, and CNN) although PBS, Politico and the Washington Post question whether that really means Millennials are overall more racially tolerant. It would seem to us that, if Millennials show evidence of implicit bias against interracial marriage, older generations would likely show even more.
And sure enough, Millennials (the undergraduate participants) did show bias against interracial couples. The implicit measures showed reactions of disgust as well as a tendency to dehumanize the interracial couples compared to same race couples.
The researchers hypothesize that there is still a tremendous amount of emotional, under-the-surface bias (aka implicit bias) against interracial couples and, they say, emotional bias (aka disgust) is more predictive of discriminatory behavior than racially based stereotypes are.
The researchers also describe what happens when we dehumanize others—as the participants in these experiments dehumanized the interracial couples. We do fewer nice things and increase our “antisocial behavior” toward dehumanized others. There is less empathy, and more avoidant behavior. We are less likely to help and more likely to use aggression and perpetrate violence against dehumanized targets. We are more accepting of police violence against a black suspect and more accepting of violence against black people in general. We see the dehumanized targets as less evolved and civilized. These statements represent past research findings summarized in the article by the researchers.
The researchers also say that their results indicate the individuals in the interracial couples would likely not be dehumanized if evaluated separately, but there was something about the interracial pairing that elicited both the emotional and dehumanizing responses.
From a litigation advocacy perspective, this is very disturbing and certainly brings to mind our work on when to talk and when to stay quiet about racial bias in court. We are not living in a post-racial society, and basing your case strategy on such a rosy assumption is likely to be hazardous to your client. When race is absent from the relevant facts—but not from extra-evidentiary optics—think carefully about how to proceed. Remember that when the case facts do not make your client’s interracial relationship salient—that is when the bias is most likely to emerge. It’s a tricky and frustrating situation.
Skinner, A., & Hudac, C. (2017). “Yuck, you disgust me!” Affective bias against interracial couples. Journal of Experimental Social Psychology, 68, 68-77. DOI: 10.1016/j.jesp.2016.05.008
It’s time again for a combination post of things that didn’t make the cut for a full post but that we thought interesting (or odd) enough to want to share with you. We hope you enjoy this latest collection of factoids that will make you memorable when (and if) you re-share them.
Hot, hot, hot: And it isn’t a good thing for good behavior
We’ve written about the negative impact of hot, hot, hot weather before and here’s another story supporting the idea that there is a link between summer heat, bad moods, and poor self-control. According to a new study published in the journal Environmental Research, when people reported lacking energy or feeling tired during the heat of the day, they were also more likely to report feeling stressed and angry. Lest you think this is a small-scale study, it looked at the reactions of 1.9 million Americans. The researchers think that even if you live in a very warm climate, you are no better at adapting to the heat than those living in a cooler climate. (This is bad news for those in the southwest.)
However, it looks as though simply looking at pictures of cold weather can help you improve your self-control. All you need to do is look at cold photos and imagine yourself being there (which is good news for those in the southwest, since we sure don’t want to live “there”). Perhaps hot and muggy locales need to post large billboards of icy landscapes and encourage viewers to think about what it would be like to be there rather than in the heat. Hmmm.
And as a helpful aside, the summer of 2016 has been, according to the NASA Earth Observatory, the hottest in 136 years of record-keeping! That’s hot! If you’d like to see the graphic illustrating this post as an animated gif covering 35 years, look here.
Will you learn more in a physics lecture if your instructor is attractive to you?
Apparently so. This research paper attempted to test information from the popular website RateMyProfessor.com, which apparently now asks students to “rate the hotness” of their instructor. (As though the tenure process were not difficult enough—now you have to suffer the indignity of learning how “hot” your students think you are? Wow.) According to research published in The Journal of General Psychology, physics students who thought their instructor was attractive actually learned more, as measured on quizzes following the lectures. The difference was “small but significant”. While you can read the full text of the article here, it was summarized accurately by Christian Jarrett over at BPS Research Digest.
Are pot smokers increasing or are people just responding more honestly to survey questions?
It’s hard to say, but Gallup tells us that 13% of Americans reported being current marijuana users in an August 2016 survey—up from just 7% in 2013. The more often you attend church services, the less likely you are to report using marijuana. Further, one in five adults under the age of 30 reports current use—at least “double the rate seen among each older age group”. Gallup points out that nine different states are voting on marijuana legalization this fall, so the legal landscape could shift significantly. Perhaps Gallup should speak to the Drug Enforcement Administration, which recently announced marijuana would remain a Schedule 1 drug (like heroin and other drugs with “no medicinal value”).
How often do you check your smartphone?
You will have trouble believing this one! According to a recent survey, the average American checks their smartphone about 150 times a day—and in the UK, it’s even higher! We’ve written a lot here about smartphones and our increasing use of and dependence on them—as well as the distractions they cause while walking, working, and serving on juries. Time Magazine recently published an article on smartphone addiction that is worth reading—it’s eye-opening (which is the first time many of us grab our smartphones—even before we get out of bed).
Who owns your tattoo? The answer is apparently not entirely obvious
A recent article in The Conversation tells us that while more than 20% of Americans have at least one tattoo (and 40% of Millennials), your own tattoo could be violating copyright rights, trademark rights, or both—and tattoo-related lawsuits are not uncommon. If you have or plan to get a tattoo, you likely want to read this one!
Identifying liberals and conservatives in voir dire (a shortcut when time is tight?)
This is a ridiculous study out of the UK which concludes that the taller you are, the more likely you are to be conservative. We do not recommend using this in voir dire, but here are a few author quotes:
“If you take two people with nearly identical characteristics – except one is taller than the other – on average the taller person will be more politically conservative,” said Sara Watson, co-author of the study and assistant professor of political science at The Ohio State University.
How big were these differences? “The researchers found that a one-inch increase in height increased support for the Conservative Party by 0.6 percent and the likelihood of voting for the party by 0.5 percent.”
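To put those quoted numbers in perspective, here is a back-of-the-envelope sketch. It assumes the per-inch estimates scale linearly across a larger height gap, which is our simplification for illustration, not a claim the study authors make:

```python
# Reported per-inch effect sizes from the study, in percentage points.
SUPPORT_PER_INCH = 0.6  # increase in Conservative Party support
VOTE_PER_INCH = 0.5     # increase in likelihood of voting Conservative

def predicted_shift(height_diff_inches):
    """Predicted (support, vote likelihood) shift for a given height gap,
    assuming a simple linear extrapolation of the per-inch estimates."""
    return (height_diff_inches * SUPPORT_PER_INCH,
            height_diff_inches * VOTE_PER_INCH)

# A hypothetical 3-inch height gap between two otherwise similar people:
support, vote = predicted_shift(3)
# roughly 1.8 and 1.5 percentage points, tiny effects as the text notes
```

Even stretched over several inches, the predicted shifts stay small, which is part of why we would never try to use this in voir dire.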
And there were gender differences—although they were not statistically significant: “The authors discovered that the link between height and political views occurred in both men and women, but was roughly twice as strong for men.”
The article itself was published in the British Journal of Political Science but there seems to be a version of the paper here. We will not be using this one, as our eyesight is not good enough to detect a one-inch difference in height when potential jurors are seated.
Noelke, C., McGovern, M., Corsi, D., Jimenez, M., Stern, A., Wing, I., & Berkman, L. (2016). Increasing ambient temperature reduces emotional well-being. Environmental Research, 151, 124-129. DOI: 10.1016/j.envres.2016.06.045
We’ve written about CRISPR (aka gene editing) before, and even about Americans’ concerns over the use of emerging technologies, and while this post is sort of about CRISPR—it is also about visual evidence done right.
We often work on cases where jurors will need to understand very complex information. It may be a patent case, a complex business litigation case, or something else that is technically daunting—but jurors often need to understand something very complicated. And often that something is very technologically advanced (and thus intimidating to jurors).
It is almost always a very difficult process for the attorneys in a complex case (in which they have often been buried for years) to see through the many details of a complicated technology and tell a simple (yet accurate) story for jurors. We often test visual evidence in our pretrial research to see what resonates with jurors, what they remember, and what helps them to make sense of abstract and esoteric technology, processes, or patented ideas.
When we see terrific examples of visual evidence (culled from many different areas), we like to share them here. They show that there really is a way to take very, very complex facts and details and make them accessible to those who have no experience whatsoever in the area—and who may be intimidated by even attempting to understand the information.
Here is just such a video tutorial. This video uses cartoon images and plain language to explain the gene editing technique referred to as CRISPR. While the last parts of the video place it clearly in the pro-CRISPR camp, the first parts explain the technology clearly and succinctly. Because it is in a cartoon format (with which we are all familiar from childhood) it is non-threatening. Since it is visually presented, we are able to understand a tremendous amount of technical information without jargon or numbers that make less technical viewers’ eyes glaze over.
If CRISPR can be explained in a few minutes of cartoons, you can explain anything in ways the most naïve juror can understand. All you need is a fabulous visual evidence consultant. We happen to know a few of them!
Last week we published a new post on terrific visual evidence from the political arena that quickly and visually described complex (and huge) data sets. This week (and no, this will not be a regular weekly feature) we mine another (and perhaps unexpected) data source: Twitter. While you may have seen Jimmy Kimmel having movie stars read mean tweets about themselves—this visualization of American intolerance is much less amusing.
Although we are based in Texas (claiming the #3 slot in terms of American derogatory tweets), we travel all over the country doing pretrial research. As we prepare for that travel, we always investigate the demographic data of a planned venue so we can recruit a sample of mock jurors that reflects the community in which we are doing research. We look at online versions of local newspapers to get a sense of what residents have been reading about, talking about, or experiencing first-hand that might have relevance to our case. And sometimes, depending on the case specifics, we do a lot more research than usual to identify themes we might hear from mock jurors that can be instructive for us as we write up reports. However, prior to seeing the information we are about to share with you, we had not done Twitter research. From now on, we might need to add that to the list.
Imagine that you are working on a case that involves litigants who are minorities, or issues of social justice or immigration. You want a sense of the biases present in the area—of the representative opinions about your clients or the other party. You might want to understand the prevalence of biases and their social acceptability within the venue. Are slurs against black people, slurs against Hispanics or Latinos, slurs against women, slurs against the cognitively disabled, slurs against those who are overweight, or just plain nastiness and incivility in general something that people feel able to express openly? How can you know? You guessed it. Twitter.
The results come courtesy of Twitter’s geotagging function, which allows hate speech to be “mapped”. The mining was done by a residential search engine site (Adobo) so that people looking for a new place to live can either steer clear of certain localities or raise their chances of having a next-door neighbor who shares a particular favorite form of bigotry. It could also, though, be a simple place to look at how biases (aka online hate speech) might interact with your specific case facts.
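As a rough illustration of how such a map gets built, here is a minimal sketch. The tweets, state codes, and flagged placeholder terms below are all invented for illustration (real data would come from Twitter’s geo-located tweet fields, and a real analysis would use actual term lists and far larger samples):

```python
from collections import Counter

# Hypothetical sample of geotagged tweets as (state, tweet text) pairs.
# TERM_A and TERM_B stand in for the flagged words a real study would track.
tweets = [
    ("TX", "some ordinary tweet"),
    ("TX", "tweet containing TERM_A"),
    ("OH", "tweet containing TERM_B and TERM_A"),
    ("VT", "another ordinary tweet"),
]
flagged_terms = {"TERM_A", "TERM_B"}

def flagged_counts(tweets, terms):
    """Count, per state, the tweets containing at least one flagged term."""
    counts = Counter()
    for state, text in tweets:
        if any(term in text for term in terms):
            counts[state] += 1
    return counts

counts = flagged_counts(tweets, flagged_terms)
# counts is Counter({"TX": 1, "OH": 1}); VT contributes nothing
```

A real version would also normalize by each state’s total tweet volume—which is exactly the issue with low-tweeting rural states discussed below.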
Some of the maps do not fit our stereotypes of various states—take, for example, the least bigoted states: Wyoming, Montana, Vermont, South Dakota, Idaho, Arkansas, Minnesota, Maine, North Dakota, and Wisconsin. Some of these states are often seen as home to many very bigoted people, and yet there is no mention of them on Adobo’s data visualization maps as exceptionally bigoted states. And therein lies the issue. Some areas of the country, especially rural areas, don’t tweet nearly as much as others.
Twitter participation is relatively rare among adults on the internet (only about 23% of online adults use it, according to the Pew Research Center). So these maps geotagging hate speech may not represent all your neighbors (we’re talking about the 77% of online adults who are not on Twitter). When we evaluate barriers to evidence-based jury decision-making, understanding prevalent social attitudes is crucial. So these Twitter maps are often not accurate enough by themselves—but they are a small window into potentially big problems. And that was Adobo’s goal in publishing the results. Go see the rest for yourself. It’s eye-opening and disheartening, with only a few brighter spots.
FALSE! Alas, even though Microsoft has popularized this notion of a shrinking attention span—it is simply not true. Or at least, there is no proof it is true. And the study the falsehood was based on was not even looking at attention span—it was looking at multi-tasking while browsing the web. To add insult to injury for the authors (who actually are academics), they do not even use the word goldfish in their article. Academics who’ve been misquoted or misinterpreted by the media are shaking their heads around the globe. This distortion of research by the popular press for the sake of sensational stories isn’t new, but for those who do the work, it is pretty disturbing. Reporters often do little fact-checking with the geeks who make the world go ’round, because it’s hard, and it often takes the edge off a catchy story. Once the first misinterpretation is published, the skewed reports drift farther and farther from the research they purportedly rely on. Alas…
Okay. So what happened here? Microsoft apparently commissioned a 2015 non-peer-reviewed study to examine how internet browsing had changed over time—that is, how long do surfers look at a page prior to moving on? Then it was misinterpreted (really misinterpreted) with spurious comparison information added about how adult attention spans were shrinking—an assertion unsupported and unaddressed even by the Microsoft study. This misinformation was picked up by the New York Times and Time Magazine, as well as numerous other mainstream media sites. Each site represented the data as scientific truth stemming from a paper commissioned by Microsoft. The only problem was, it wasn’t true.
The following table is another example of how the work was misinterpreted—it presents the human (and goldfish) attention span as the real focus of the paper, which could hardly be further from the truth. The bottom half of the table (Internet Browsing Statistics) is actually taken from the article Microsoft commissioned to look at how browsing patterns on the internet have changed over time. The top half (Attention Span Statistics), however, is not—it is totally unrelated to the study Microsoft commissioned. And none of it has been validated or otherwise shown to mean anything at all.
(If you have trouble reading this table, here is the original source.)
You can find the text of the complete article commissioned by Microsoft here. Open it as a pdf file and search for “goldfish”. You won’t find it. Nada. The study was not designed to look at the human attention span, nor was it designed to compare human attention spans to that of a goldfish. It was designed to look at how advances in web technology have changed how we surf the web—because Microsoft wants to figure out how to make the most of web surfing.
We are fortunate to have fact-checkers on the web—particularly when it comes to topics like data visualization. PolicyViz does a thorough job of debunking this myth, as does a writer posting on LinkedIn. They both want everyone to STOP comparing people to goldfish! We concur. We would also love to see people using their common sense and questioning sensational claims—“the average attention span of a goldfish”? Really? What is the significance of any of those memory-lapse statistics? Has that always been the case? Is it different now? Why should we care?
From a litigation advocacy perspective, there are two key lessons here. First, pay no attention to comparisons of your jurors to goldfish. Instead, use techniques like chunking your information into 10-minute segments—that factoid actually is supported by research on learning, not just drummed up by a marketing representative. If jurors do not pay attention, it likely isn’t their declining attention spans, but rather that your presentation did not speak to their values, attitudes, and beliefs. Test your presentations pretrial and make sure real people pay attention and understand.
And second, be very aware of how easily people are seduced by juicy factoids based on unproven or false data, just because they are amusing or seem to support some preexisting but uninformed suspicion. Cleverness often sells.