
    Friday 16 December 2016

    Will Facebook's Fake News Warning Become a Badge of Honor?

    Scroll through your Facebook feed, and you may soon come across an angry red warning symbol. “Disputed by 3rd Party Fact-Checkers,” the alert will read, splashed underneath a link posted by a friend. These warnings will soon start popping up across the social-media site for some users, thanks to a new set of features Facebook is testing to battle hoaxes and fake news online.

    How you react will depend on how you feel about Facebook, the media, fact-checking, and the “fake news” phenomenon.

    Maybe you’re a seasoned media connoisseur. Even without the warning label, you’d have known that the link was fake, either because of the fishy URL, or because of the outrageousness of the headline, or because the site itself was riddled with misleading advertisements and other stories engineered to get clicks and views. You scroll on.

    Or maybe, instead, you’re an avid Facebooker, and get most of your news from friends and social media. You read a little bit of everything, and might not pay too close attention to the name and design of a news site if you got to it from Facebook or an emailed link. Maybe you would appreciate a feature that helps you sort through the noise of your feed. Now, an authoritative source—Facebook’s fact checks will come from Snopes, PolitiFact, FactCheck.org, and ABC—is telling you that the news story is false, and that confirmation is all you needed. You move on, perhaps noting the offending domain so that you don’t fall for any of its other hoaxes.

    But perhaps you fall into a third category. You’re intensely suspicious of the mainstream media, which you think is largely controlled by liberals, or corporations, or unseen billionaires. Maybe you think the media has twisted the “fake news” label to apply it to stories they just don’t like, or want to discredit.

    For people in this last category, Facebook’s warning may be infuriating. They may think this fact-checking is a way to stifle some of their favorite sites, and they might even see it as an attack on good old American free speech. If some people see the warning in this light, it loses power. Some may tell their friends to ignore the warning and read the story anyway. (If they want to post the same story on their own Facebook page, though, they'll have to click through another alert that reads: “Before you share this story, you might want to know that independent fact-checkers disputed its accuracy.”)

    It’s not totally clear how many internet users fall into each of these categories, but a report from Pew Research Center released Thursday offers some hints. According to Pew, 64 percent of Americans think fake news creates “a great deal of confusion” around basic facts. A further 24 percent think it creates “some confusion,” and 11 percent say it creates “not much” confusion or none at all.

    That 11 percent of people might bristle to see Facebook labeling stories as fake, if they don’t believe fake news is a problem at all. (But since Pew didn’t tightly define what “fake news” is, some people who think that reputable sources like The New York Times peddle fake news may be counted among those saying fake news causes confusion.)

    There’s a danger that people who are disinclined to trust traditional sources of information will treat Facebook’s warnings as a badge of honor. If fact-checking organizations deem a story questionable, these readers might be more likely to read and share it, rather than less. There's reason to believe this group might think of itself as a counterculture, and take the position that anything that “the man” rejects must have a grain of subversive truth to it.

    But Facebook’s new features will make it harder for fake news to travel far. People will easily and anonymously be able to report a link as fake, flagging it for review by fact-checkers. Stories marked as “disputed” will be less likely to show up in others’ news feeds, and can’t be promoted or made into an ad. The company is even experimenting with monitoring how users interact with a story to determine whether or not it is legitimate: If people who read a certain story are much less likely to share it, Facebook says, that might be a clue that the story is misleading.

    Apart from the plan to monitor sharing habits, all the new features Facebook announced Thursday are driven by humans. The company has had a fraught relationship with using human editors to moderate content on its site. In August, after being accused of suppressing conservative news, it fired all the editors who were in charge of the “trending” module found in the top-right corner of the site. But the inability of machines to match human judgment became immediately clear when the top trending story the following week was factually inaccurate.

    In a post published Thursday, Mark Zuckerberg, Facebook’s CEO, softened his previous stance that Facebook is a completely neutral platform for sharing information. “While we don't write the news stories you read and share, we also recognize we’re more than just a distributor of news,” Zuckerberg wrote. He went on to say that Facebook has “a new kind of responsibility” to keep people informed.

    But the company is walking a narrow line. In a world where some claim that there are no such things as facts, even fact-checking can be contentious. By bringing on outside reviewers rather than hiring its own checkers, Facebook hopes to offload the most difficult judgment calls onto reputable third parties. “With any changes we make, we must fight to give all people a voice and resist the path of becoming arbiters of truth ourselves,” Zuckerberg wrote.

    The changes may make some users angry, and might cause them to dig in their heels. But Facebook seems to be targeting a specific group with this move: those who just need a push in the right direction in order to avoid bad information online. The Pew survey found that 15 percent of Americans are not very confident, or not at all confident, in their ability to recognize fake news on the internet. (Previous studies, such as one from Stanford, tested young internet users on media literacy and found that they had trouble differentiating between hard news, opinion writing, and misinformation.)

    A clear warning on disputed stories would be particularly useful for people who don’t have enough background or expertise to fact-check news stories on their own. And if enough people turn their backs on fake news, not only will financial incentives dry up for its purveyors, but the echo chamber that spreads such stories will shrink, robbing the hoaxes of some of their authority.
