CNN News: Facebook's News Feed Is Partly Human-Curated, Embroiled in Political Bias Controversy
AZUZ: You know when you're on Facebook and stories show up in your trending feed? The company says that the employees who review those stories are not allowed to prioritize one political view over another.
Here's why the company is talking about this. Last week, the technology news site Gizmodo published a report that accused Facebook of promoting a liberal news bias. According to the report, reviewers of the site's trending news section allegedly kept stories about conservative subjects from appearing in people's feeds, even though they were actually trending among users.
Facebook CEO Mark Zuckerberg says his company is investigating the report and that it has found no evidence that this happened. If it does find evidence, Facebook will take steps to address it.
LAURIE SEGALL, CNN SENIOR TECH CORRESPONDENT: Ever wonder how a topic becomes trending on Facebook? You know, the "Hot Topics" column to the right of your news feed?
Well, here's how it works: according to Facebook's head of search, these topics are surfaced by an algorithm. The idea is to flag the most popular conversations happening on Facebook. And there's also a human element: the topics are reviewed by a team of curators who make sure they're actually trending in the real world.
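To make that two-step process concrete, here is a minimal sketch in Python of what an algorithmic surfacing step could look like. Everything here is an assumption for illustration: Facebook has not published its actual ranking code, and the function names, inputs, and popularity scoring are hypothetical.

```python
# Illustrative sketch only: Facebook's real trending algorithm is not public.
# Assumed input: an iterable of (topic, timestamp) pairs from recent posts.
from collections import Counter

def surface_trending(posts, top_n=10):
    """Return the most-mentioned topics as trending candidates.

    A real system would presumably also weight recency, growth velocity,
    and engagement; here we simply count mentions.
    """
    counts = Counter(topic for topic, _ in posts)
    return [topic for topic, _ in counts.most_common(top_n)]

# The surfaced candidates would then go to human curators for review.
candidates = surface_trending([
    ("eurovision", "09:00"), ("nba-finals", "09:01"),
    ("eurovision", "09:02"), ("moon-hoax", "09:03"),
])
```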
So, here are the guidelines: curators are supposed to weed out hoaxes, spam and duplicate topics. A former curator said there's transparency in these decisions: employees actually have to write out the reasons they blacklist a topic. I'm also told that curators may surface a conversation that isn't yet trending, such as during a breaking news event.
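Those guidelines suggest a review step roughly like the following sketch, again with hypothetical names and data; the hoax and spam lists, the logged blacklist reasons, and the breaking-news injection are assumptions modeled on the description above, not Facebook's actual code.

```python
# Hypothetical curation step modeled on the guidelines described above.
KNOWN_HOAXES = {"moon-hoax"}          # assumed hoax list
KNOWN_SPAM = {"win-a-free-phone"}     # assumed spam list

blacklist_log = []  # each rejection is written down, for transparency

def curate(candidates, breaking_news=()):
    """Filter hoaxes, spam, and duplicates; log a reason for each rejection."""
    approved, seen = [], set()
    for topic in candidates:
        if topic in KNOWN_HOAXES:
            blacklist_log.append((topic, "known hoax"))
        elif topic in KNOWN_SPAM:
            blacklist_log.append((topic, "spam"))
        elif topic in seen:
            blacklist_log.append((topic, "duplicate topic"))
        else:
            seen.add(topic)
            approved.append(topic)
    # During a breaking news event, curators may surface a topic
    # that isn't yet trending on its own.
    for topic in breaking_news:
        if topic not in approved:
            approved.insert(0, topic)
    return approved

trending = curate(
    ["eurovision", "moon-hoax", "nba-finals", "nba-finals"],
    breaking_news=["earthquake-update"],
)
# trending == ["earthquake-update", "eurovision", "nba-finals"]
# blacklist_log records ("moon-hoax", "known hoax") and
# ("nba-finals", "duplicate topic")
```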
So, here's the controversy. According to a Gizmodo report, former Facebook contractors say curators routinely suppressed stories about conservative issues. In a statement, Facebook says no, it doesn't prioritize one viewpoint over another.
But is the human element inadvertently biasing what we see on Facebook?
People consume news from Facebook. The company has this power to shape what we see and also what we don't see.
So, this report raises an important question. Could the folks on the team have subconsciously or subtly steered the feed away from conservative stories? And if so, what responsibility does Facebook have?