Why people share conspiracy theories even when they know they are wrong

Co-written by Bella Ren and Maurice Schweitzer

Social media often confronts users with a difficult choice: share unverified content that will generate social engagement, or share content they know is more likely to be true but less likely to be “liked.” For many people, the decision to share conspiracy theories reflects a calculated trade-off.

The spread of conspiracy theories has significantly limited our ability to deal with crises, from climate change to the COVID-19 pandemic. More than 70 million Americans eligible for a vaccine have chosen not to receive one, and about half of them hold mistaken beliefs, such as the belief that the government uses vaccine injections to implant microchips in people.

Beyond beliefs

A growing body of work has begun to advance our understanding of why people believe in conspiracy theories. This research has found that people who feel they lack control over events, and who dislike uncertainty and ambiguity, are more likely to believe in conspiracy theories.

However, many questions remain about why people share conspiracy theories. Although early work assumed that people on social media share content they believe in, our new research reveals that people are often willing to share conspiracy theories they know to be false. In fact, 40 percent of our participants admitted they would be willing to do exactly that.

Why?

It turns out that when people share information, many care about disseminating content that will boost their social engagement. We found that when social rewards (such as the number of “likes” a post received) were high, many people were willing to share conspiracy theories they knew to be false.

We share misinformation for social connections

Compared to factual news, conspiracy theories trigger higher emotional arousal. This makes them particularly attractive for generating social engagement. Consider a choice between sharing one of two articles. One reports that Princess Diana died in a car accident. The other reports that the royal family secretly plotted to assassinate Princess Diana and her lover. One is far more likely to be true. The other is far more engaging: more likely to attract attention and more likely to trigger a reaction.

In one of our experiments, we varied the incentives for the content people shared. In one condition, we paid participants a bonus for sharing posts that would generate more “likes.” In another, we rewarded posts that would generate more comments. In a third, we paid participants a bonus for sharing accurate information.

What content did people share?

When incentivized to share content that would generate social engagement (“likes” and comments), nearly 50 percent of participants shared posts claiming that the moon landing was faked, that a UFO spacecraft is hidden in Area 51, or that COVID-19 was designed as a biological weapon. However, when we rewarded people for sharing accurate information, almost all of them shared articles about the 2021 U.S. Capitol attack, global warming, and racially motivated violence. This study shows that people can separate fact from fiction, and that they expect fake news to generate more likes and comments.

People readily share conspiracy theories

We also found that people are willing to share fake news even without financial incentives. We created a social media platform that simulates how people interact in online social environments. On this platform, participants shared posts and received feedback from others over several rounds. Across our experiments, we manipulated the feedback they received.

Figure: Conspiracy-theory sharing across conditions, before and after receiving social feedback.

Source: Ren, Dimant, Schweitzer

When we gave people on the platform more positive social feedback (i.e., more “likes”) for sharing conspiracy theories, the percentage of people who shared them nearly doubled after just a few rounds of interaction. In other words, people are remarkably sensitive to the social feedback in their environment. Once they discovered that sharing misinformation generated more “likes,” they sacrificed accuracy in pursuit of comments and attention on social media. Of course, our experimental setting differs from real platforms in many ways; for example, the reputational consequences of sharing disinformation were weaker than they would be in the real world. Nonetheless, our results reveal that the decision to share misinformation can be surprisingly sensitive to social incentives.

Taken together, our results identify social motivations as an important driver of the decision to share conspiracy theories. Anticipating greater attention and engagement, individuals may choose to share content they know to be false. Ultimately, by sharing false information, people can change both their own beliefs and those of others. Policymakers may therefore be able to curb the spread of disinformation by changing the incentives for social engagement, for example by shutting down bots that “like” and retweet disinformation and by encouraging official institutions to amplify accurate news.

