Someone scrolls through their news feed and comes across the same article four or five times, shared by most of their friends. It’s as if nothing else exists.
They’re in a filter bubble, a microcosm of information shaped by the social media site’s newsfeed algorithm. These algorithms typically reorder a person’s feed based on how they have interacted with the site before: if someone has liked a lot of cat photos, the algorithm will put cats at the top of their feed.
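Conceptually, such engagement-based ranking can be sketched in a few lines of Python. Everything below (the scoring weights, field names, and topic-affinity idea) is an illustrative assumption, not any network’s actual algorithm:

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    author: str
    topic: str          # e.g. "cats", "news"
    likes: int
    comments: int

@dataclass
class User:
    name: str
    # How often this user has liked posts on each topic in the past.
    topic_likes: dict = field(default_factory=dict)

def score(user: User, post: Post) -> float:
    """Score a post by the user's past engagement with its topic.

    Hypothetical weights: the user's own liking history counts far
    more than the post's overall popularity.
    """
    affinity = user.topic_likes.get(post.topic, 0)
    popularity = post.likes + 2 * post.comments
    return 10.0 * affinity + 0.1 * popularity

def rank_feed(user: User, posts: list[Post]) -> list[Post]:
    # Highest-scoring posts float to the top of the feed.
    return sorted(posts, key=lambda p: score(user, p), reverse=True)

# A user who has liked many cat photos sees cats first.
alice = User("alice", topic_likes={"cats": 40, "news": 3})
feed = rank_feed(alice, [
    Post("bob", "news", likes=500, comments=120),
    Post("carol", "cats", likes=12, comments=2),
])
print([p.topic for p in feed])  # ['cats', 'news']
```

Even this toy version shows the feedback loop: the more cat photos a user likes, the more cats they are shown, and the fewer chances other topics get.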
“Your filter bubble is your own personal, unique universe of information that you live in online. And what’s in your filter bubble depends on who you are and it depends on what you do,” said internet activist Eli Pariser in his 2011 TED Talk. “But the thing is that you don’t decide what gets in. And more importantly, you don’t see what gets edited out.”
Social media sites use a variety of algorithms to customize the content and order of people’s newsfeeds, often based on what each person has liked and commented on. This practice can hide content a user might disagree with, creating a bubble of posts and advertisements tailored to their tastes.
This is the filter bubble, a term coined by Pariser. By nature, filter bubbles reflect only someone’s own interests, so their timeline often paints an inaccurate picture of the real world.
Although the experience varies from person to person and network to network, many people have noticed some sort of rearranging of their feeds. According to Facebook’s ad targeting policy, advertisers can target audiences based on location, gender, and even age.
“I use social media to talk to people, and sometimes for news,” said sophomore Yarah Meijer. “It doesn’t seem to be targeted.”
Although some may not realize it, a wide variety of factors are used to filter someone’s timeline, according to Facebook’s and Twitter’s newsfeed algorithm policies. In addition to targeting based on characteristics of the user, advertisements and content are selected based on the articles someone has clicked in the past, as well as those their followers have clicked.
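A rough sketch of how targeting on user characteristics and click history could combine is below; the criteria, field names, and matching rule are hypothetical, not taken from either site’s published systems:

```python
from dataclasses import dataclass

@dataclass
class AdCampaign:
    name: str
    locations: set       # targeted locations
    age_range: tuple     # (min_age, max_age)
    topics: set          # interests the advertiser wants to reach

@dataclass
class Profile:
    location: str
    age: int
    clicked_topics: set            # topics of articles this user clicked
    followee_clicked_topics: set   # topics the accounts they follow clicked

def matches(ad: AdCampaign, user: Profile) -> bool:
    """Serve the ad only when demographics AND inferred interests line up."""
    lo, hi = ad.age_range
    if user.location not in ad.locations or not lo <= user.age <= hi:
        return False
    # Interest can be inferred from the user's own clicks or their network's.
    interests = user.clicked_topics | user.followee_clicked_topics
    return bool(ad.topics & interests)

ad = AdCampaign("sneaker sale", {"Amsterdam"}, (16, 25), {"sports", "fashion"})
user = Profile("Amsterdam", 17, {"music"}, {"sports"})
print(matches(ad, user))  # True: followees' clicks reveal an interest
```

Note how the last line of `matches` is where the follower effect comes in: a user can be targeted for topics they have never clicked themselves, purely because their network has.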
When it comes to politics, this can skew the slant of the news someone sees in their feed. Articles and advertisements that are biased against a particular group, such as a political party, can heighten tensions.
“A lot of the people I follow tend to be on the liberal side, so they’re Democrats and they usually say derogatory things about Republicans,” said sophomore Marina Gasparini. “[On] accounts that are similar to Snapchat, people are open about political views. But [sites like] Instagram are more permanent, so they’re not as straightforward.”
Oftentimes, this echo chamber of sorts can lead to the spread of misinformation, and critics say that social media networks are to blame. Recently, Facebook has come under fire over policies that allow fake news to spread easily on the site, according to The New York Times.
Critics claimed that inaccurate articles misled voters and may have influenced the outcome of the election. Facebook responded by changing its Audience Network policy to clarify its stance on fake news, leaving some wondering how far the company will go in curbing fake news stories.
“The Onion, which is clearly satire, shouldn’t be filtered out,” said sophomore William Yonts. “But clickbait articles or lies should be, because Facebook is how a lot of people find out about the world.”
According to the Pew Research Center, 66 percent of Facebook’s users get news on the site. Because so many people rely on Facebook for news, the ripple effect of fake news, combined with personalized news feeds, could have significant implications for how media is consumed.
One person may share a falsified article on Facebook, which, because of Facebook’s timeline algorithm, turns up at the top of their followers’ feeds. Those followers re-share the inaccurate information, and soon there is a sizable community of misinformed citizens, as the sketch below illustrates. Right now, the networks themselves aren’t doing enough to curb this, so it falls to users to verify the news they share.
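That ripple effect can be modeled as a toy cascade. Assume, purely for illustration, that each follower who sees a shared story re-shares it with some fixed probability; the friend graph, probability, and function here are all invented for the example:

```python
import random
from collections import deque

def simulate_cascade(friends: dict, seed: str, reshare_prob: float = 0.4) -> set:
    """Breadth-first spread of one shared article through a friend graph.

    friends: maps each user to the people who see their shares.
    Returns the set of users the article reached.
    """
    random.seed(1)  # fixed seed so the demo is reproducible
    reached = {seed}
    queue = deque([seed])
    while queue:
        sharer = queue.popleft()
        for follower in friends.get(sharer, []):
            # Each follower sees the story atop their feed and may re-share it.
            if follower not in reached and random.random() < reshare_prob:
                reached.add(follower)
                queue.append(follower)
    return reached

friends = {
    "a": ["b", "c", "d"],
    "b": ["e", "f"],
    "c": ["f", "g"],
    "d": ["h"],
}
print(simulate_cascade(friends, "a"))
```

Even with a modest re-share probability, a single post can reach much of a small network within a few hops, which is exactly the dynamic critics worry about.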
“So if algorithms are going to curate the world for us, if they’re going to decide what we get to see and what we don’t get to see, then we need to make sure that they’re not just keyed to relevance,” Pariser said. “We need to make sure that they also show us things that are uncomfortable or challenging or important.”