Media/Information/Technology

Demand for Deceit: Why Do People Consume and Share Disinformation?

By Dean Jackson

There is a market for information. Some people—journalists, researchers, anyone with a keyboard—produce information. Others are willing to exchange something of value for it, even if that something is mere time and attention. This applies to information that is true and therefore useful for making well-informed decisions; it also applies to information that is not.

Following a string of dramatic instances in which disinformation was spread over social media to purposefully manipulate public perception and political events, the world is more closely scrutinizing disinformation’s role in these markets. Most of this scrutiny has fallen on the “supply” side: who produces disinformation, for what purpose, and how do they disseminate it?

But every transaction has a seller and a buyer. A new paper from the International Forum for Democratic Studies examines “demand” for disinformation: why do people consume, believe, and share demonstrably false information?

The authors write that the answer “is tied to the psychology of information consumption and opinion formation.” They divide the ways individuals engage with disinformation into “passive” and “active” drivers embedded in human psychology.

Passive drivers work without conscious effort on the part of the individual; for instance, they may cause falsehoods to stick merely because they are oft-repeated, and individuals are more likely to share information without verifying it when they are experiencing heightened emotions like anger or disgust. Active drivers, on the other hand, affect the ways in which individuals reason toward a conclusion (often toward one which is compatible with their preexisting attitudes).

This distinction somewhat mirrors academic debates about individual motivations for reading and sharing misleading information online. Despite concerns about emergent “post-truth” politics, some argue that news consumers do care about the accuracy of the information they read and share online—just perhaps not as much as they care about other motives, such as the desire to share a partisan, emotional, or moral signal with others online. In this framing, the desire for accuracy is merely one of many competing incentives to read and share information.

Others contend that individuals more commonly believe misinformation due to “lazy” information processing, “especially in the context of social media, where news items are often skimmed or merely glanced at.” In this framing, most people passively absorb misleading information with too little incentive to critically analyze it.

But what about the effect of algorithms—does social media also encourage users to reach comfortable conclusions by steering them toward information that conforms to their worldview? Evidence suggests that while users with heavily lopsided information diets are in the minority, they tend to be voracious news consumers who are strongly motivated to read, believe, and share “pro-attitudinal” content that affirms their preexisting worldview. As a result, they come to reside in echo chambers that are “deep, but narrow.”

Whether social media users are motivated partisans or merely shallow processors, the demand side of disinformation suggests limits for fact-checking and media literacy. Though they are adapting, fact-checkers still struggle to keep pace with disinformation, penetrate echo chambers, or alter the motives of news consumers. Media literacy may boost the accuracy motive, but motivated partisans are in some cases already quite media literate and therefore better equipped to come up with reasons to reject information that counters their preferred narrative.

More research is clearly needed to understand the motives and incentives today’s information space presents to consumers, and how those motives and incentives are shaped by the platforms they use (this is especially true outside of the United States and Europe). If platform design rewards emotional resonance over accuracy, perhaps different design choices can encourage more critical and prudent consumption and sharing of content. If a subset of users consistently rejects information from high-quality news media in favor of sources that play to grievance and suspicion, more can be done to understand why, and how to reach those users more effectively with countervailing messages.

This train of thought collides most immediately with two unfortunate obstacles. First, changes to the incentive structure of social media platforms may infringe upon their profit motive: the most socially valuable distribution of attention may not lead to the most advertising revenue. Second, current technological trends suggest that the capacity for manipulation of public opinion is likely to grow as synthetic media, artificial intelligence, virtual reality, and other emerging technologies are brought to bear. If the most anti-democratic implications of these technologies are to be avoided, the integrity of the information space must be treated as a first-order priority now.

This post is based upon “Demand for Deceit: How the Way We Think Drives Disinformation,” a January 2020 working paper published by the International Forum for Democratic Studies and written by Samuel Woolley and Katie Joseff.

Dean Jackson is a program officer at the National Endowment for Democracy’s International Forum for Democratic Studies, where he focuses on the intersection between democracy, disinformation, media, and technology. Follow him on Twitter @DWJ88.

The views expressed in this post represent the opinions and analysis of the author and do not necessarily reflect those of the National Endowment for Democracy or its staff.

Image credit: Zenza Flarini / Shutterstock.com