Following his announcement of changes to the Facebook News Feed two weeks ago, CEO and founder Mark Zuckerberg announced Friday an additional change, this time to how news sources are treated on the platform.
According to a post on Zuckerberg’s Facebook page, the social media platform “will now ask people whether they’re familiar with a news source and, if so, whether they trust that source” in its quality surveys. Zuckerberg said the effort is an attempt to help the platform promote “high quality news that builds a sense of common ground.” Users can already provide feedback on posts in their feed, with the option to report a post as a “false news story.”
Community feedback is part of an ongoing 2018 update to Facebook to “make time on Facebook well spent” by promoting active conversation and engagement.
Facebook was a major source of false news stories and Russian-backed propaganda during the 2016 election season. In October, reports suggested roughly 126 million users in the United States may have seen posts and content on the platform created by Russian-government backed trolls around Election Day. The latest news source update is likely another attempt by Facebook to solve its news problem.
Following the announcement, the idea faced criticism on various social media platforms, with many suggesting that bot and troll accounts would skew the data or that the surveys would simply give users the opportunity to reconfirm their own biases.
Here, let me fix that for you:
Facebook Will Ask Users to Confirm Their Biases, Changing Nothing.
— ArtForStrangers (@ArtForStrangers) January 19, 2018
Dr. Itai Himelboim, an associate professor of advertising at the University of Georgia, studies the role of social media in news and political communication and examines information flow.
Himelboim said user feedback may be useful in the fight against fake news proliferating on social media platforms.
“On Facebook, you’re not just seeing news that you follow or like directly. You’re also seeing news articles shared by friends which may come from sources that the user themselves does not trust,” he said. “When it comes to ‘fake news,’ that’s often how fake news spreads.”
In terms of partisanship and user biases, Himelboim said the rating system may not be very effective.
“Clearly there are a lot of people that like information sources they agree with,” he said. “Even if it’s a little out there, if it doesn’t shake your political ideology or how you see the world, you’re more likely to trust it, because that makes sense to you.”
This leads to ‘selective exposure,’ he said, where people are more likely to select news sources that they already agree with.
“That also applies to friendships; you’re more likely to be friends with someone who has similar views as you,” he said. “So it’s already creating silos of information flow.”
Though social media has played a major role in selective exposure, the issue is not new, Himelboim said.
“There was a period in the 70s when most American households watched one of the three major news networks,” he said. “But as cable news and the 24-hour news cycle moved in, that’s when people had the opportunity to expose themselves to news sources they agree with.”
As a result, political discourse and cooperation have become more difficult.
In terms of Facebook’s attempt to use community feedback to decide what news shows up on a user’s feed, Himelboim said it was a slippery slope.
“On one hand, you say, let’s find out what news sources people trust and why,” he said. “On the other hand, there’s a lot of Big Brother here deciding where the line is between fake and partial news, partial all the time and partial some of the time.”