Facebook Users Make Their Own News Bubbles

Subway passengers reading on modern gadgets while waiting for a train.
(Image credit: Iakov Filimonov/Shutterstock)

Facebook users who are the most news-obsessed are also the most likely to interact with a small number of news sources, new research finds.

The study is a look at the architecture of social media polarization — essentially, how people are so effective at sorting themselves into opposing groups and filtering out alternative opinions. Though Facebook has algorithms that feed users content they're likely to enjoy, previous research has found that people's own choices on the social network are a stronger influence on the sorts of opinions those individuals see. (Political conversations on Twitter aren't much different.)

The new research, published today (March 6) in the journal Proceedings of the National Academy of Sciences, reinforces how much those choices matter. The study focused on the activities of 376 million Facebook users between January 2010 and December 2015, as those users interacted with what turned out to be 920 different news outlets.

Facebook engagement

By tracking likes, shares and comments on news stories posted on Facebook, researchers led by Walter Quattrociocchi of the IMT School for Advanced Studies in Lucca, Italy, determined which news sources people were engaging with and for how long.

The most striking finding was that despite the huge number of news sources to choose from, Facebook users typically fixated on just a handful of pages. And the more active a user was in doling out likes, shares and comments, the more likely that person was to focus his or her energy on even fewer sources. The news outlets in the study ranged from Reuters, Human Rights Watch and the Houston Chronicle to niche publications like the Cyprus Expat.

"There is a natural tendency of the users to confine their activity on a limited set of pages," Quattrociocchi and his colleagues wrote. "According to our findings, news consumption on Facebook is dominated by selective exposure."

Each person also looked at a limited constellation of news outlets, the researchers found. User activity clustered within certain subsets of news organizations, and there was very little cross-pollination between these subsets. (Someone sharing a lot of Greenpeace posts is probably not going to be engaging with the conservative outlet The Daily Caller, for example.)

The study, based on a large dataset, is a welcome addition to the research literature on social-media polarization, said Ben Shneiderman, a professor of computer science at the University of Maryland who researches social media.

"It adds further evidence to confirm what we and others have seen, which is the so-called filter bubbles or the partitioned way that people get their information," Shneiderman, who was not involved in the new study, told Live Science.

Confirmation-bias clusters

Users were more cosmopolitan than the news pages themselves, however, at least geographically, the researchers noted. That is, while news pages can "like" one another or pass on each other's content, those page-to-page networks were more geographically constrained than users' networks. Regular users tended to interact with more international, if still polarized, networks of pages, the researchers said.

To see how these user interactions might arise, the researchers created a computer model in which each individual was given a predetermined opinion, represented by a number on a line. The model mimicked confirmation bias, the tendency to elevate information you already agree with while picking apart information that challenges your assumptions, by specifying that pages whose stance differed too much from an individual's opinion number would be rejected. This computer version of confirmation bias produced patterns similar to those seen in the real world on Facebook, indicating how social network polarization might arise, the researchers said.
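The sketch below illustrates that kind of rule with a toy simulation. It is a minimal, hypothetical version of a bounded-confidence-style model, not the authors' actual code: the opinion scale, the tolerance threshold and the interaction rule are all assumptions made for illustration.

```python
# Toy sketch of selective exposure driven by confirmation bias.
# The opinion range, tolerance and interaction rule are illustrative
# assumptions, not the model published in the PNAS paper.
import numpy as np

rng = np.random.default_rng(42)

N_USERS, N_PAGES, N_STEPS = 1000, 50, 200
TOLERANCE = 0.1  # assumed: how far a page's stance may sit from a user's opinion

user_opinion = rng.uniform(0.0, 1.0, N_USERS)  # each user's fixed opinion on a line
page_opinion = rng.uniform(0.0, 1.0, N_PAGES)  # each news page's stance on the same line
interactions = np.zeros((N_USERS, N_PAGES), dtype=int)

for _ in range(N_STEPS):
    # Each step, every user encounters one randomly chosen page...
    pages = rng.integers(0, N_PAGES, N_USERS)
    # ...and "likes" or "shares" it only if it falls inside the tolerance window.
    close_enough = np.abs(user_opinion - page_opinion[pages]) < TOLERANCE
    interactions[np.arange(N_USERS)[close_enough], pages[close_enough]] += 1

# Despite dozens of available pages, each user's activity lands on only a few.
pages_per_user = (interactions > 0).sum(axis=1)
print("mean distinct pages per user:", pages_per_user.mean())
print("pages available:", N_PAGES)
```

Running a simulation like this shows each simulated user ending up engaged with only the handful of pages that fall inside his or her tolerance window, which echoes the concentration of activity the study reports.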

This user-generated confirmation bias could be a stumbling block for companies like Facebook or Google that are trying to stamp out so-called "fake news," the researchers said. The term "fake news" refers to completely false articles posted by businesses that aim to suck Facebook users into advertising-heavy web pages.

"News undergoes the same popularity dynamics as popular videos of kittens or selfies," Quattrociocchi and his colleagues wrote. What's more, the study authors wrote, political and social debates are based on conflicting narratives, and those narratives are resistant to strategies like fact checking. (Though recent research suggests that warning people to be on guard before they run into false information may be effective.)

People "form communities among friends, and their friends are tightly bound to each other but weakly bound to people outside their community," Shneiderman said. "So if there is a news story that is spread within their community, they're likely to believe it, and if there are challenges from without their community, they're likely to not know about it."

Original article on Live Science

Stephanie Pappas
Live Science Contributor

Stephanie Pappas is a contributing writer for Live Science, covering topics ranging from geoscience to archaeology to the human brain and behavior. She was previously a senior writer for Live Science but is now a freelancer based in Denver, Colorado, and regularly contributes to Scientific American and The Monitor, the monthly magazine of the American Psychological Association. Stephanie received a bachelor's degree in psychology from the University of South Carolina and a graduate certificate in science communication from the University of California, Santa Cruz.