Facebook has been steadily ramping up its battle against false news on the platform with a number of measures undertaken in recent months, the professor notes. Facebook first hired a former CNN anchor to lead a newly established news partnerships team, as well as a New York Times veteran to head the news products division. Then came several technical changes and extra features on the site.
One change is the designation of articles that Facebook deems misleading or fraudulent with a “disputed” label.
“But how effective are these measures?” Nyhan asks.
Apparently, not very. The professor details two studies showing that such labeling is not an effective way to thwart deceptive or fake content.
An experiment by two Yale professors, Gordon Pennycook and David Rand, for example, found that prior exposure to false information makes people more likely to judge that information as accurate. Even when a designation such as "disputed" is attached, people often still treat the information as legitimate, they found.
In another project, the two Yale scholars found that stories without the label tend to be perceived as more accurate than they otherwise would be.
“If Facebook is seen as taking responsibility for the accuracy of information in its news feed through labeling, readers could start assuming that unlabeled stories have survived scrutiny from fact checkers (which is rarely correct — there are far too many for humans to check everything),” Nyhan wrote.
Readers in his own test seemed more wary of accurate headlines and news articles when given reminders to "think critically" and "remain skeptical," he wrote.
Facebook has partnered with “fact-checking” organizations like PolitiFact and Snopes to help with news analysis capabilities. Snopes employs liberals and leftists almost exclusively, according to a Daily Caller investigation from December.
Facebook itself has faced similar accusations of fairly uniform internal viewpoints on policy and personal politics.
Nyhan argues that Facebook should be more open and honest about its fact-checking endeavors, given the risk of confusing readers as well as of inappropriately suppressing or censoring particular information.
“As the studies above suggest, changing human beliefs is far harder than providing input to a computer program,” Nyhan wrote. “But it seems we should be cautious about placing too much trust in a private algorithm that works most effectively by suppressing information, especially without further evaluation of its effects.”
Disclaimer: News articles on this site may contain opinions of the author, and if opinion, may not necessarily reflect the views of the site itself or the views of the owners of NewsLI.com, Long Island Media Inc., or Long Island Exchange®. For more information on our editorial policies please view our terms of service.