Is Facebook Bad for Democracy?

We are living in an era of extreme partisanship. As documented by the Pew Research Center, majorities of people in both parties now express “very unfavorable” views of the other side, with most concluding that the policies of the opposition “are so misguided that they threaten the nation’s well-being.” 79 percent of Republicans approve of Trump’s performance as president, while 79 percent of Democrats disapprove. In many respects, party affiliation has become the lens through which we see the world; even the Super Bowl can’t escape the stink of politics.

There are two ways of understanding these divisions.

The first is to look at the historical parallels. Partisanship, after all, is as American as apple pie and SUVs. George Washington, in his farewell address, warned that the rise of political parties might lead to a form of “alternate domination,” as the parties would gradually “incline the minds of men to seek security... in the absolute power of an individual.” In the election of 1800, his prophecy almost came true, as several states were preparing to summon their militias if Jefferson lost. Our democracy has always been a contact sport.

But there’s another way of explaining the political splintering of the 21st century. Instead of seeing our current divide as a continuation of old historical trends, this version focuses on the impact of new social media. Donald Trump is not the latest face of our factional republic—he’s the first political figure to fully take advantage of these new information technologies.

Needless to say, this second hypothesis is far more depressing. We know our democracy can handle partisan passions. It’s less clear it can survive Facebook.

Why might technology be cratering our public discourse? To answer this question, a new paper in PLOS ONE by a team of Italian researchers at the IMT School for Advanced Studies Lucca and Brian Uzzi at Northwestern looked at 12 million users of Facebook and YouTube. They began by identifying 413 different Facebook pages that could be sorted into one of two categories: Conspiracy or Science. Conspiracy pages were those that featured, in the delicate wording of the scientists, “alternative information sources and myth narratives—pages which disseminate controversial information, usually lacking supporting evidence and most often contradictory of the official news.” (Examples include Infowars, the Fluoride Action Network and the ironically named I Fucking Love Truth.) Science pages, meanwhile, were defined as those having “the main mission of diffusing scientific knowledge.” (Examples include Nature, Astronomy Magazine and Eureka Alerts.)

The researchers then looked at how users interacted with videos appearing on these pages, tracking comments, shares and likes on both Facebook and YouTube between January 2010 and December 2014. As you can probably guess, many users began the study watching videos from only one category, either Conspiracy or Science. (These people are analogous to voters with entrenched party affiliations.) The researchers, however, were most interested in those users who interacted with both categories; these folks liked Neil deGrasse Tyson and Alex Jones. Think of them as analogous to registered Democrats who voted for Trump, or Republicans who might vote for a Democratic congressperson in the 2018 midterms.

Here’s where things get unsettling. After just fifty interactions on YouTube and Facebook, most of these “independents” started watching videos exclusively from one side. Their diversity of opinions gave way to uniformity, their quirkiness subsumed by polarization. The filter bubble won. And it won fast.
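To make that measurement concrete, here is a minimal sketch, in Python, of how one might classify a user from a labeled stream of likes and comments. The field names, the fifty-interaction cutoff and the 95 percent threshold are illustrative assumptions, not the exact definitions used by Bessi and colleagues.

```python
from collections import Counter

def classify_user(interactions, min_interactions=50, threshold=0.95):
    """Return 'science', 'conspiracy', 'mixed', or 'undetermined' for one user.

    `interactions` is a list of category labels ('science' or 'conspiracy'),
    one per like or comment, in chronological order. The cutoff and threshold
    here are illustrative, not the paper's exact parameters.
    """
    if len(interactions) < min_interactions:
        return "undetermined"
    counts = Counter(interactions[:min_interactions])
    total = sum(counts.values())
    for side in ("science", "conspiracy"):
        if counts[side] / total >= threshold:
            return side
    return "mixed"

# Example: a user who samples both sides early on but quickly drifts to one.
history = ["science"] * 2 + ["conspiracy"] * 48
print(classify_user(history))  # -> 'conspiracy'
```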

Why does the online world encourage polarization? The scientists focus on two frequently cited forces. The first, and most powerful, is confirmation bias: the tendency to seek out information that confirms our pre-existing beliefs. It’s much more fun to learn about why we’re right (Fluoride = cancer) than consider the possibility we might be wrong (Fluoride is a safe and easy way to prevent tooth decay). Entire media empires have been built on this depressing insight.

The second force driving online polarization is the echo chamber effect. Most online platforms (such as the Facebook News Feed) are controlled by algorithms designed to give us a steady drip of content we want to see. That’s a benign aspiration, but what it often means in practice is that the software filters out dissent and dissonance. If you liked an Infowars video about the evils of vaccines, then Facebook thinks you might also like their videos about fluoride. (This helps explain why previous research has found that more active Facebook users tend to get their information from a smaller number of news sources.) “Inside an echo chamber, the thing that makes people’s thinking evolve is the even more extreme point of view,” Uzzi said in a recent interview with Anne Ford. “So you become even more left-wing or even more right-wing.” The end result is an ironic affliction: we are more certain than ever, but we understand less about the world.
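To see how a relevance filter can produce that drift, here is a toy sketch of the echo-chamber dynamic: a feed that ranks incoming posts purely by how often the user has liked their topic before. Everything here, from the topic labels to the numbers, is an invented illustration, not Facebook’s actual News Feed algorithm.

```python
import random

random.seed(0)
TOPICS = ["science", "conspiracy"]

def rank_feed(posts, liked_counts):
    """Score each post by how often the user has liked its topic before."""
    total = sum(liked_counts.values()) or 1
    return sorted(posts, key=lambda topic: liked_counts[topic] / total, reverse=True)

# A user who has liked both kinds of content, with a slight initial lean.
liked_counts = {"science": 4, "conspiracy": 6}

for step in range(20):
    candidates = [random.choice(TOPICS) for _ in range(10)]  # incoming posts
    feed = rank_feed(candidates, liked_counts)
    for topic in feed[:3]:            # the user only sees the top of the feed...
        liked_counts[topic] += 1      # ...and tends to like what they are shown

print(liked_counts)  # the slight initial lean becomes near-exclusive exposure
```

Run it and the conspiracy count races ahead of the science count, even though the incoming posts are split roughly evenly; the filter, not the user’s curiosity, decides what gets reinforced.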

This finding jibes nicely with another new paper that directly tested the impact of filtered newsfeeds. In a clever lab experiment, Ivan Dylko and colleagues showed that feeds similar to those on Facebook led people to spend far less time reading articles that contradicted their political beliefs. Dylko et al. end on a somber note: “Taken together, these findings show that customizability technology can undermine important foundations of deliberative democracy. If this technology becomes even more popular, we can expect these detrimental effects to increase.”

The obvious solution to these problems is to engage in more debunking. If people are seeking out fake news and false conspiracies, then we should confront them with real facts. (This is what Facebook is trying to do, as they now include links to debunking articles in the News Feed.) Alas, the evidence suggests that this strategy might backfire. A previous paper by several of the Italian scientists found that Facebook users prone to conspiracy thinking react to contradictory information by “increasing their engagement within the conspiracy echo chamber.” In other words, when people are told they’re wrong, they don’t revise their beliefs. They just work harder to prove themselves right. It’s cognitive dissonance all the way down.

It was only a few generations ago that most Americans got their news from a few old white men on television. We could choose between Walter Cronkite (CBS), John Chancellor (NBC) and Harry Reasoner (ABC). It was easy to assume that Americans wanted this shared public discourse, or at least a fact-checked voice of authority, which is why nearly 30 million people watched Cronkite every night.* But now it’s clear that we only watched these shows because we had no choice—their appeal depended on the monopoly of network television. Once this monopoly disappeared, and technology gave us the ability to curate our own news, we flocked to what we really wanted: a platform catering to our biases and beliefs.

Tell me I’m right, but call it the truth.

Bessi, Alessandro, Fabiana Zollo, Michela Del Vicario, Michelangelo Puliga, Antonio Scala, Guido Caldarelli, Brian Uzzi, and Walter Quattrociocchi. "Users Polarization on Facebook and Youtube." PLOS ONE 11, no. 8 (2016): e0159641.

*The shared public discourse reduced political partisanship. In the 1950s, the American Political Science Association published a report fretting about the lack of ideological distinction between the two parties. The lack of overt partisanship, they said, might be undermining voter participation.