Accuracy and Social Motivations Shape Judgements of (Mis)Information


Type

Thesis

Authors

Rathje, Steven

Abstract

Why do people believe in and share misinformation? Some theories focus on social identity and politically motivated reasoning, arguing that people are motivated to believe and share identity-congruent news. Other theories suggest that belief in misinformation is not shaped by motivated reasoning, but is instead shaped by other factors, such as prior knowledge, lack of reflection, or inattention to accuracy.

Integrating multiple perspectives, this thesis argues that the spread of (mis)information is shaped by two (often competing) motivations — accuracy and social motivations — in combination with other factors, such as personality variables and information exposure. Through a variety of methods, including analyses of large-scale social media datasets, online experiments, network analysis, and a digital field experiment, this thesis illustrates how accuracy motivations, social motivations, and other variables shape belief in and the spread of (mis)information.

Chapter 2 takes a big data approach to test whether online content that fulfills political identity motivations, such as out-group derogation and in-group favoritism, tends to receive more engagement online, drawing on eight large-scale datasets containing a total of 2.7 million tweets and Facebook posts. Chapter 3 experimentally manipulates accuracy and social motivations for believing in and sharing true and false news headlines in a series of four online experiments with 3,364 participants. Chapter 4 examines partisan asymmetries in the effectiveness of a popular misinformation intervention, the accuracy nudge. Chapter 5 links survey data to the Twitter data of 2,064 participants to examine how beliefs about the COVID-19 vaccine and politics are associated with following political elites online and interacting with low-quality news sources. Finally, Chapter 6 examines how manipulating participants’ online social networks in a naturalistic setting (i.e., incentivizing people to follow and unfollow specific accounts on Twitter in a randomized controlled trial) influences beliefs about the opposing political party and the sharing of misinformation.

Date

2022-08-28

Advisors

van der Linden, Sander

Keywords

Misinformation, Polarization, Social Media, Intergroup Conflict

Qualification

Doctor of Philosophy (PhD)

Awarding Institution

University of Cambridge

Sponsorship

Gates Cambridge Scholarship (OPP1144)