Facebook secretly tracking “trust ratings” for users with “social score” algorithm right out of communist China

Thursday, August 23, 2018
By Paul Martin

by: Ethan Huff
Wednesday, August 22, 2018

Under the guise of combating “fake news,” social media giant Facebook has unveiled a new censorship scheme that allows the platform to reward or penalize users based on what it deems to be the overall “trustworthiness” of their browsing, liking, and sharing habits.

Similar to a social engineering program already being used throughout communist China to control people’s speech both online and off, Facebook says its new “trust ratings” system will track the online behavior of its users and assign secret ratings that ultimately determine whether or not users are “visible” online.

So-called “malicious actors,” as defined by Facebook, might have their content blanked out from other people’s Facebook timelines, for instance. Users who are determined by Facebook to be purveyors of “fake news” can expect to be similarly censored, or taken less seriously when reporting content to Facebook moderators – and all of this in the name of improving the “credibility” of the Facebook experience.

“One of the signals we use is how people interact with articles,” stated Tessa Lyons, head of Facebook’s anti-fake news initiative.

“For example, if someone previously gave us feedback that an article was false and the article was confirmed by a fact-checker, then we might weight that person’s future false news feedback more than someone who indiscriminately provides false news feedback on lots of articles, including ones that end up being rated as true.”

Is Facebook trying to become the communist China of online communication?

While Lyons says the Facebook trust ratings system “isn’t meant to be an absolute indicator of a person’s credibility,” it is being actively used as a metric in determining the overall “risk” of each Facebook user’s actions online.

What this means for the average Facebook user remains unclear, however. A company spokesperson told The Sun that the trust ratings system isn’t a “centralized ‘reputation’ score,” but rather a safety net for protecting the platform against users who might try to indiscriminately flag news they don’t like as fake in an attempt to “game the system.”

“The reason we do this is to make sure that our fight against misinformation is as effective as possible.”
