Facebook users clicking on a link to read a story about Michelle Obama’s encounter with a 10-year-old girl whose father was jobless were in for a shock, the Boston Globe reported yesterday.
Facebook responded to the click by offering what it called “related articles.” These included one alleging a Secret Service officer had found the president and his wife having “S*X in Oval Office,” and another that said “Barack has lost all control of Michelle” and was considering divorce.
Facebook did not try to defend the clearly false content, said the paper. A spokeswoman said there was a simple explanation for why the stories were pushed on readers: algorithms.
The stories, in other words, said the Globe, were apparently selected by Facebook “based on mathematical calculations that rely on word association and the popularity of an article. No effort is made to vet or verify the content.”
A Globe reporter came across the Michelle Obama links by clicking on an Associated Press story that had been posted on Facebook by the Globe. That story was legitimate; it told how Michelle Obama accepted a resume on behalf of the jobless father of a 10-year-old girl who met the presidential spouse at the White House.
As soon as the link to that story was clicked, however, Facebook offered what it called three related articles, said the Globe.
The link to a story about the first couple’s supposed encounter in the Oval Office led to an article that was clearly fake and was “filled with language not suitable for a family newspaper,” said the Globe. The link to the story saying that the president had “lost all control” of his wife quoted a supposed insider saying the first couple were “considering divorce.”
A third link, to a story saying that the president’s wife “has no dignity,” was a piece of commentary.
The White House declined comment on the portrayal of the Obama family.
Experts told the Globe that Facebook should immediately suspend its practice of pushing so-called related articles to unsuspecting users unless it can come up with a system to ensure that they are credible.
“They have really screwed up,” said Emily Bell, director of Columbia Journalism School’s Tow Center for Digital Journalism. “If you are spreading false information, you have a serious problem on your hands. They shouldn’t be recommending stories until they have got it figured out.”
The incident is important, Bell said, because it illustrates the danger of having a company such as Facebook become one of the world’s most widespread purveyors of news and information.
The website relies on the idea that people trust stories posted by friends. But this recent practice, announced last December, is a departure from that ethos because no human being, much less a friend, vets related articles that are posted as a result of Facebook’s algorithms.
Facebook last month announced that it is creating its version of a news service, called FB Newswire, based on social media information that it promises to verify with its partner, Storyful. These verified stories would be offered to news organizations around the world, further expanding Facebook’s influence on the way people get their news.
Storyful said on its website that it would ensure that stories are verified before they are posted on the news service, pledging that it would be “debunking false stories and myths.”
The Globe said that only underscores questions about why Facebook does not similarly try to verify or debunk stories that it pushes to readers as related articles.
Asked to respond, a Facebook official made clear that the company does not apply the same fact-checking standard when offering readers related stories on their news feed, such as the ones about the Obamas.
“These news feed units are designed to surface popular links that people are sharing on Facebook,” Facebook spokesman Jessie Baker said via e-mail. “We don’t make any judgment about whether the content of these links are true or false, just as we don’t make any judgment about whether the content of your status updates are true or false.”
Nicholas Diakopoulos, a fellow at Columbia’s Tow Center who has studied the way major websites rely on data to disseminate information, said that it is not a defense for Facebook to say that it relies on algorithms when posting “related stories.”
He said that humans devise the algorithms and are responsible for their quality. An algorithm, for example, can be designed to accept stories only from a list of trusted sources. By allowing related articles from obscure and unreliable sources, Diakopoulos said, Facebook is offering its huge platform but ceding control of the content.
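The trusted-sources approach Diakopoulos describes can be illustrated with a minimal sketch. Everything here is hypothetical (the domain whitelist, the function names, the candidate data); it is not Facebook’s actual system, only a demonstration that a popularity ranker can vet sources with a few extra lines:

```python
from urllib.parse import urlparse

# Hypothetical whitelist of trusted news domains (illustrative only).
TRUSTED_DOMAINS = {"apnews.com", "bostonglobe.com", "reuters.com"}

def domain_of(url):
    """Extract the host from a URL, dropping any 'www.' prefix."""
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

def related_articles(candidates, limit=3):
    """Rank candidate links by share count, keeping only trusted sources.

    `candidates` is a list of (url, share_count) pairs. A purely
    popularity-based ranker would skip the whitelist check and happily
    surface the most-shared link regardless of where it came from.
    """
    vetted = [(url, shares) for url, shares in candidates
              if domain_of(url) in TRUSTED_DOMAINS]
    vetted.sort(key=lambda pair: pair[1], reverse=True)
    return [url for url, _ in vetted[:limit]]

links = [
    ("https://www.apnews.com/michelle-obama-resume", 1200),
    ("https://tabloid.example/oval-office-rumor", 9800),  # most popular, but untrusted
    ("https://bostonglobe.com/first-lady-visit", 800),
]
print(related_articles(links))
```

Here the tabloid link is the most-shared candidate, yet it never reaches readers because its domain is not on the whitelist; the design choice, as Diakopoulos notes, sits with the humans who wrote the filter, not with the algorithm itself.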
Google takes a different approach in compiling articles for its Google News service. In addition to using algorithms, the company said it requires news organizations to meet rigorous standards for inclusion.
Source: thedrum.com