[Image: Illustration of a group of people using social media on their phones. Photo credit: iStock]

When it comes to the erosion of democracy, not all social media are equally harmful

The spread of conspiracy theories, anti-elite attitudes and other beliefs often deemed corrosive to democratic health is common across virtually all social media. But according to a recent paper co-authored by UC Santa Barbara political scientists, certain platforms make such fringe — and sometimes extreme — ideologies more influential.

The study, published in The International Journal of Press/Politics, offers fresh insight into how social media platforms shape the influence of conspiracy theories and populist ideas, with key differences in how they connect like-minded users and strengthen certain kinds of thinking.

“Blurred lines between fact and misinformation are worsening, and concerns are growing about threats to peaceful transitions of power in elections,” said co-author Julien Labarre, a researcher in the Department of Political Science. “American democracy is at a crossroads.”

As part of a group of researchers affiliated with UCSB’s Center for Information Technology and Society, Labarre wanted to know whether social media serve only to connect people who share beliefs in conspiracies and populist attitudes, or whether the platforms also strengthen those beliefs. The researchers divided social media into three groups. The first comprised so-called alt-tech platforms popular among the alt-right and far-right, such as Truth Social, Rumble and Gab, where hate speech and falsehoods are generally tolerated. The second grouped Facebook, Snapchat and WhatsApp, which are “relationship-oriented” in that they often connect people who likely already know one another. The final group comprised the “interest-oriented” apps X/Twitter, Instagram and TikTok, which primarily connect users around shared interests.

[Image: Color photograph of Julien Labarre. Photo credit: Courtesy]

Among other key findings, the study showed that use of alt-tech social media appears to strengthen users’ false beliefs and orientation toward populist governance, and that people who are drawn to alt-tech apps because they find conspiracies and populist ideas appealing are likely to become even more extreme in their beliefs. The study also showed that relationship-oriented apps are associated with stronger beliefs in conspiracies and populist ideas.

“Messages in relationship-oriented apps tend to come from known others,” Labarre said. “These messages from people you know, who are likely to share some of your values, are more influential than messages from strangers, even strangers you agree with.”

[Image: Cover of The International Journal of Press/Politics]

The interest-oriented apps are different, according to the study, which found that their users do not appear any more conspiratorial or populist than they otherwise would be based on ideology, age, education and personality traits. In other words, Labarre explained, people who believe a conspiracy theory may exchange messages with one another on TikTok, but that does not necessarily strengthen their beliefs.

Overall, he said, users on relationship-oriented platforms often form tight-knit, trusted groups that can amplify extremist views, creating a sort of echo chamber; these groups act as incubators where users feel comfortable sharing and reinforcing extreme viewpoints. On the other hand, people on X/Twitter, for example, tend to follow users whose views interest them but who largely remain strangers, and the absence of strong social ties generally reduces the power of such messages to change beliefs.

“We tend to think of alt-tech as a social media hotbed of extremism, and it is,” Labarre said. “But because of their much larger reach, relationship-oriented social media, like Facebook or WhatsApp, may have just as great an influence on recruiting people into false and anti-democratic beliefs.”

Interest-based apps may not necessarily strengthen beliefs, he explained, but the regular circulation of falsehoods and autocratic ideas can remind users of their beliefs and influence them as they make decisions, such as at the ballot box. “The conspiracy theory you are constantly reminded of on X/Twitter is more likely to affect your behavior than the one you rarely think about. And in the case of alt-tech and relationship-based apps, the situation is worse. The flow of content not only reminds you of your beliefs, but is likely to amplify them.” 

Related data from France, Poland and the U.K. revealed similar patterns, he said. “This gives us confidence that what we’ve found is robust — across countries, social networks of like-minded people who know one another and communicate online are part of the problem of democratic erosion.”

Media Contact

Keith Hamm

Social Sciences, Humanities & Fine Arts Writer

keithhamm@ucsb.edu
