
Facebook's guidelines for its content moderators were recently revealed by the British news outlet The Guardian. According to the guidelines, live streams of acts of self-harm should not be deleted from the social media platform because the tech giant “doesn’t want to punish people in distress.”

Report on how Facebook moderates different types of content

The report released by the U.K.-based news outlet details how the social networking site moderates graphic content, including terrorism, pornography, violence, racism, and hate speech. The social media giant has not yet responded to a request for comment and has remained silent on the alleged major leak, which the newspaper dubbed the “Facebook Files.”

One of the documents explains, according to the report, “Experts have told us what’s best for these people’s safety is to let them livestream as long as they are engaging with viewers. Removing self-harm content from the site may hinder users’ ability to get real-world help from their real-life communities.” The document further says that the footage will be removed once there is no opportunity to help the person.

The hundreds of files reportedly seen by the UK-based newspaper contain guidelines for dealing with self-harm, hate speech, violence, and other sensitive content. They show how the social media company allows users to livestream attempts at self-harm because it “doesn’t want to censor or punish people in distress who are attempting suicide.”

What Facebook does when it detects someone attempting suicide

When the social media giant detects that someone is about to attempt or is attempting suicide, it tries to contact agencies to carry out a welfare check. The moderators, notes the report, are instructed to “delete all videos depicting suicide unless they are newsworthy, even when these videos are shared by someone other than the victim to raise awareness” because of the risk of suicide contagion.

Suicide contagion is the phenomenon in which a person who sees a suicide (in Facebook’s case, on Live Video) becomes more likely to consider suicide themselves. Facebook Chief Executive Officer Mark Zuckerberg announced earlier this month that the social media giant would hire 3,000 additional people to review live videos and delete extremely inappropriate content, such as suicide and murder videos.

According to CNET, Monika Bickert, Facebook’s head of global policy management, said, “Keeping people on Facebook safe is the most important thing we do. In addition to investing in more people, we’re also building better tools to keep our community safe.” She added, “We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help.”
