Leaked Documents Reveal Facebook’s Biased, Convoluted Censorship Policies

Monday, December 31, 2018
By Paul Martin

by Tyler Durden
ZeroHedge.com

Facebook’s thousands of moderators have been relying on an outdated, inaccurate and biased “maze of PowerPoint slides” to police global political speech, according to a trove of 1,400 internal documents obtained by the New York Times.

Moderators say they often rely on Google Translate to read posts, while facing pressure to make decisions on acceptable content within a matter of seconds, according to the report.

The guidelines – which are reportedly reviewed every other Tuesday morning by “several dozen Facebook employees who gather over breakfast” – are filled with “numerous gaps, biases and outright errors,” according to the Times.

Moderators were once told, for example, to remove fund-raising appeals for volcano victims in Indonesia because a co-sponsor of the drive was on Facebook’s internal list of banned groups. In Myanmar, a paperwork error allowed a prominent extremist group, accused of fomenting genocide, to stay on the platform for months. In India, moderators were mistakenly told to flag for possible removal comments critical of religion. -NYT

The guidelines, set by “mostly young engineers and lawyers,” must be interpreted by Facebook’s fleet of mostly outsourced moderators – largely unskilled workers, “many hired out of call centers.”

Moderators express frustration at rules they say don’t always make sense and sometimes require them to leave up posts they fear could lead to violence. “You feel like you killed someone by not acting,” one said, speaking on the condition of anonymity because he had signed a nondisclosure agreement. -NYT

Facebook executives say they are working diligently to rid the platform of “dangerous” content.

“It’s not our place to correct people’s speech, but we do want to enforce our community standards on our platform,” said a Facebook senior News Feed engineer. “When you’re in our community, we want to make sure that we’re balancing freedom of expression and safety.”

The company’s head of global policy management, Monika Bickert, meanwhile, said that the company’s primary goal was to prevent harm – though perfection “is not possible.”

“We have billions of posts every day, we’re identifying more and more potential violations using our technical systems,” said Bickert. “At that scale, even if you’re 99 percent accurate, you’re going to have a lot of mistakes.”
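To put that claim in perspective: Facebook doesn’t give exact volumes, but taking “billions of posts” at, say, two billion a day as an illustrative figure, a 1 percent error rate would work out to roughly 20 million mistaken moderation decisions every single day – the scale of “a lot of mistakes” Bickert is alluding to.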

