Facebook's policies for moderating hate speech were exposed Wednesday in what appears to be the most in-depth investigation yet into how and why those decisions are made, ProPublica reported.
One of the most cringeworthy parts shows a slide asking, "Which of the below subsets do we protect?" The options are female drivers, black children, and white men. The section for white men shows a picture of the Backstreet Boys.
The answer? White men.
The policies, according to ProPublica, directed moderators to delete hate speech against white men because they fall under a so-called "protected category," while the other two examples in the above slide fall into "subset categories," meaning attacks on them were allowed.
Looks bad -- quite bad.
Facebook's moderation policies form a complicated system in which "protected categories" are based on race, sex, gender identity, religious affiliation, national origin, ethnicity, sexual orientation, and serious disability/disease, according to ProPublica. Black children don't count as a protected category because Facebook does not protect age, and female drivers don't count because Facebook does not protect occupation.
According to Facebook, its policies aren't perfect.
“The policies do not always lead to perfect outcomes,” Monika Bickert, head of global policy management at Facebook, told ProPublica. “That is the reality of having policies that apply to a global community where people around the world are going to have very different ideas about what is OK to share.”
That's similar to the explanation Facebook put forth in a blog post Tuesday as part of its "Hard Questions" series. When asked about the ProPublica report, Facebook pointed Mashable to that post.
The good news is that Facebook is trying to do better. That includes being more transparent about its practices, though only after reports like ProPublica's and The Guardian's recent "Facebook Files" series.
At least Facebook has come a long way. ProPublica revealed that back in 2008, when the social network was four years old, its censorship rulebook was a single page, with one glaring catch-all rule:
"At the bottom of the page it said, ‘Take down anything else that makes you feel uncomfortable,’” said Dave Willner, who had joined Facebook’s content team in 2008, toldProPublica.
Willner went on to create a 15,000-word rulebook that is still partly in use at the company today. And yet there remain many problematic areas in how the network polices itself. The Guardian's Facebook Files revealed numerous issues, such as how Facebook handles bullying on the site, along with other gray areas.