This website provides an overview of a 5-year NSF-funded research project examining the flow of rumors and misperceptions online. It includes information about the project, the research team, publications, and will eventually also include relevant datasets. There is also a Twitter feed associated with the project, which aims to track relevant news and research (#FalseBeliefNews).
An article coauthored with Erik Nisbet and Kathryn Cooper has been published. In it, we tackle the common misperception that conservatives are uniquely resistant to scientifically accurate information. Using an online experiment, we demonstrate that both liberals and conservatives tend to be skeptical of science claims that don’t sit well with their politics, and that reading such claims reduces their trust in the scientific community more generally.
Rachel Neo’s paper, “Examining the Influence of SNS Network Homogeneity on Actual Voting Behavior Via Affective Responses toward In and Out-Group Presidential Candidates As Intervening Variables” was named one of four Top Student Papers in the Political Communication Division of the National Communication Association. Although not related to misperceptions, the paper relied on data collected by this project.
The APSA Political Communication section has awarded Brian Weeks’ paper “Feeling is Believing? The Influence of Emotions on Citizens’ False Political Beliefs” the Timothy Cook Best Graduate Student Paper Award at the 2013 APSA meeting. Brian, who recently defended his PhD, has been a long-time member of this research team, and his paper is part of a larger research program examining the role of discrete emotions in the processing of political misperceptions.
Emotions shape people’s beliefs. People who are angry are more prone to bias, which can make them more susceptible to misperceptions. A recently published study coming out of this project suggests that partisan media are contributing to rising levels of hostility between citizens who disagree politically. The study used survey data collected in the U.S. and Israel to show that more frequent exposure to politically slanted news sites leads to stronger dislike of supporters of the opposing party.
Full article: dx.doi.org/10.1111/hcre.12028
Brian Weeks and I have had a paper accepted at the International Journal of Public Opinion Research that examines how political misperceptions can influence vote choice. We argue that rumor isn’t idle talk, but has real-world consequences. Email one of us if you’d like to see a prepress copy of the manuscript.
Update: The paper is now available online and in print: dx.doi.org/10.1093/ijpor/edu005
Brian Weeks will present his paper, “Feeling is Believing? The Influence of Emotions on Citizens’ False Political Beliefs,” at APSA in Chicago later this week. Here’s the abstract:
Many Americans hold inaccurate beliefs about political candidates and issues, which are troublesome because evidence suggests people often behave in accordance with those misperceptions. Scholars have recently begun to explore why citizens are misinformed, and most extant research uses partisan-based explanations to examine this phenomenon. In the current study, I argue that using party affiliations as the primary explanatory factor is limited in helping to understand why or how citizens are misinformed, and instead make the case for discrete emotions as the mechanism driving false beliefs about politics. Using panel survey data (N = 1,004) collected over three waves during the 2012 presidential election, I show that angry citizens are more likely to hold false beliefs about Barack Obama and Mitt Romney. The data also reveal that partisanship ceases to explain misperceptions about the candidates once the emotions anger and anxiety are accounted for. The findings indicate that false beliefs are driven in part by anger, suggesting future research on misperceptions must account for citizens’ feelings toward political targets.
A new conference paper, on the subject of affective polarization, is now available. In the U.S., affective polarization refers to the increasing hostility felt by both Democrats and Republicans toward those who belong to the opposing party. Although not directly related to misperceptions, this phenomenon has important implications for Americans’ willingness to listen to and engage with the other side, activities that can promote better understanding and more accurate beliefs. This paper examines the polarizing effects of partisan online news media, considering the impact of both pro-party and counter-party exposure.
You can see the slides here:
Our paper exploring the persistence of misperceptions in the face of carefully documented corrections is now accessible via the Journal of Communication’s website.
Garrett, R. K., Nisbet, E. C., & Lynch, E. K. (2013). Undermining the corrective effects of media-based political fact checking? The role of contextual cues and naïve theory. Journal of Communication. doi: 10.1111/jcom.12038
Abstract: Media-based fact checking contributes to more accurate political knowledge, but its corrective effects are limited. We argue that biographical information included in a corrective message, which is often unrelated to the inaccurate claim itself, can activate misperception-congruent naïve theories, increasing confidence in a misperception’s plausibility and inducing skepticism toward denials. Resistance to corrections occurs regardless of initial belief accuracy, but the effect is strongest among those who find the contextual information objectionable or threatening. We test these claims using an online survey-embedded experiment (N=750) conducted in the wake of the controversy over the proposed Islamic cultural center in NYC near the site of the 9/11 attacks, and find support for our predictions. Theoretical and practical implications are discussed.
NOTE: I fixed the SlideShare links.
I gave two talks at CSCW 2013 in San Antonio this week. The first was part of a panel organized by Paul Resnick, which also included Travis Kriplean (creator of the Living Voter Guide), Sean Munson (creator of Balancer), and Talia Stroud (author of Niche News). The second talk presented the paper coauthored with Brian Weeks that has been in the news recently.
Abstract and links to slides follow.
Abstract: Bursting your (filter) bubble
Broadcast media are declining in their power to decide which issues and viewpoints will reach large audiences. But new information filters are appearing, in the guise of recommender systems, aggregators, search engines, feed ranking algorithms, and the sites we bookmark and the people and organizations we choose to follow on Twitter. Sometimes we explicitly choose our filters; some we hardly even notice. Critics worry that, collectively, these filters will isolate people in information bubbles only partly of their own choosing, and that the inaccurate beliefs they form as a result may be difficult to correct. But should we really be worried, and, if so, what can we do about it? Our panelists will review what scholars know about selectivity of exposure preferences and actual exposure and what we in the CSCW field can do to develop and test ways of promoting diverse exposure, openness to the diversity we actually encounter, and deliberative discussion.
Abstract: The promise and peril of real-time corrections to political misperceptions
Computer scientists have responded to the high prevalence of inaccurate political information online by creating systems that identify and flag false claims. Warning users of inaccurate information as it is displayed has obvious appeal, but it also poses risk. Compared to post-exposure corrections, real-time corrections may cause users to be more resistant to factual information. This paper presents an experiment comparing the effects of real-time corrections to corrections that are presented after a short distractor task. Although real-time corrections are modestly more effective than delayed corrections overall, closer inspection reveals that this is only true among individuals predisposed to reject the false claim. In contrast, individuals whose attitudes are supported by the inaccurate information distrust the source more when corrections are presented in real time, yielding beliefs comparable to those never exposed to a correction. We find no evidence of real-time corrections encouraging counterargument. Strategies for reducing these biases are discussed.
And you can still download the full paper here.