Leak: Secret Facebook rules on what violence, self-harm and child abuse can be posted

Facebook allows users to livestream self-harm, post videos of violent deaths and photos of non-sexual child abuse, but comments which threaten to harm President Donald Trump are to be deleted, according to Facebook’s secret rule books for monitoring what its 2 billion users can post.

The Guardian obtained leaked copies of more than 100 internal Facebook manuals and documents that tell moderators how to handle content involving violence, sex, hate speech, terrorism, nudity, self-harm, revenge porn and even more controversial material, such as cannibalism.

The giant social network has increasingly come under fire for how it handles disturbing content and for depending too heavily on users to report it. At the beginning of May, Facebook CEO Mark Zuckerberg announced the company would hire 3,000 more people, on top of its existing 4,500 moderators, "to review the millions of reports we get every week."

The leaked internal guidelines were given to Facebook moderators "within the last year," the Guardian said. The documents show the fine line Facebook walks in deciding what content to censor without being accused of squashing free speech.

Ten seconds: that's about how long Facebook moderators have to decide whether content should be removed, according to the Guardian. The internal manuals give moderators examples of what to censor when it comes to graphic violence, animal abuse, credible threats of violence, non-sexual child abuse and more.

Credible threats of violence

Leaked documents show that the following call for violent action is allowed: "To snap a b*tch's neck, make sure to apply all your pressure to the middle of her throat." But the comment "Someone shoot Trump" is not allowed and should be deleted, since Trump is a head of state.

Self-harm

Facebook, which reportedly received over 5,000 reports of potential self-harm in a single two-week period, says it is OK for users to livestream attempts to self-harm. According to an internal policy update, moderators were told: "We're now seeing more video content – including suicides – shared on Facebook. We don't want to censor or punish people in distress who are attempting suicide."

However, Facebook will try to get outside agencies to perform a "welfare check" while a person is attempting suicide. Once there is no longer a chance of helping the person, the video is removed.

Graphic violence

Facebook believes videos of violent deaths help create awareness. The footage should be marked as disturbing and "hidden from minors," but not automatically deleted, since such videos can "be valuable in creating awareness for self-harm afflictions and mental illness or war crimes and other important issues."

Images of animal abuse are allowed for awareness purposes, but “extremely disturbing” photos of animal mutilations and videos of torturing animals are to be marked as disturbing. If the violence against animals is sadistic or celebratory, then it is not allowed and is deleted.

Child abuse

Facebook allows videos of child abuse to be posted as long as the abuse is non-sexual and the content is marked as "disturbing." Videos or photos of child abuse shared with sadism and celebration are removed. Imagery of child abuse is allowed unless the child is naked.

Nudity

Nudity is allowed if it falls under "newsworthy exceptions" or qualifies as "handmade art." Digitally created art showing sexual activity, as well as revenge porn, is not allowed. Facebook also allows videos of abortions as long as there is no nudity in the footage.

Facebook won’t confirm if the documents obtained by the Guardian are authentic, but Facebook released the following statement:

Keeping people on Facebook safe is the most important thing we do. In addition to investing in more people, we’re also building better tools to keep our community safe. We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help.
