[SAN FRANCISCO] Facebook Inc's methods for determining trending news topics rely heavily on human input and company rules, according to documents released by the social network, contradicting some of its earlier statements that the system is mostly machine-based.
Tom Stocky, the Facebook executive in charge of the team, said on Monday that the company doesn't "insert stories artificially" into the trending topics feature; the documents, however, say that news topics can be inserted by human editors.
Mr Stocky also said the guidelines don't permit the prioritization of one news outlet over another; the documents show Facebook checks potential topics against 10 news organizations to determine whether a trending topic is legitimate.
Facebook released the internal documents after they were cited in a report by the Guardian. The company has been embroiled in a dispute over whether it suppresses conservative viewpoints in the trending topics feature, set off by a Gizmodo report this week alleging internal bias.
Though the company denied the report, which cited anonymous former contract workers, scrutiny of the social network's process has intensified. US Senator John Thune has sent chief executive officer Mark Zuckerberg a letter inquiring about the allegations and requesting answers by later this month.
The guidelines serve as "checks and balances" to help Facebook display stories that are actually newsworthy, rather than merely popular on its site, helping it build a better product, said Justin Osofsky, vice president of Facebook's global operations.
"Facebook does not allow or advise our reviewers to systematically discriminate against sources of any political origin, period," Mr Osofsky said in a statement. "The intent of verifying against news outlets is to surface topics that are meaningful to people and newsworthy."
Still, the documents make clear that Facebook's system is more human than advertised, making the company susceptible to the same potential biases and criticism about its judgment as any news organization.