NEW YORK – The supposed Facebook guidelines to moderators released by The Guardian say that the livestreams of acts of self-harm should not be removed because the social network “doesn’t want to punish people in distress.”
Facebook has not confirmed the authenticity of the alleged major leak published by the newspaper, which dubbed the release the ‘Facebook Files.’ The documents supposedly detail how the social media giant moderates graphic content, including violence, hate speech, terrorism, pornography and racism.
Amongst the hundreds of files reportedly seen by The Guardian are guidelines for dealing with self-harm, which show that the company will allow users to livestream attempts at self-harm because it “doesn’t want to censor or punish people in distress who are attempting suicide.”
“Experts have told us what’s best for these people’s safety is to let them livestream as long as they are engaging with viewers,” one of the documents reportedly explains. “Removing self-harm content from the site may hinder users’ ability to get real-world help from their real-life communities.”
However, the footage will be removed “once there’s no longer an opportunity to help the person.”
Facebook also tries to contact agencies to carry out a “welfare check” when it detects that someone is attempting, or about to attempt, suicide.
Because of the risk of suicide “contagion” – the copycat effect where some people who see suicide are more likely to consider suicide themselves – moderators are instructed to “delete all videos depicting suicide unless they are newsworthy, even when these videos are shared by someone other than the victim to raise awareness.”
Facebook Live, which allows people to livestream video of whatever they wish, has seen several high-profile acts of violence since its release. The social network has also come in for criticism for failing to deal with a spike in violent crimes, such as murder and rape, either streamed live or uploaded to the site.
Facebook to recruit 3,000 people to monitor content in wake of murder & rape videos. https://t.co/Wf3lFJLWag
— RT UK (@RTUKnews) May 4, 2017
Earlier this month, Facebook CEO Mark Zuckerberg announced the company would hire 3,000 additional people to monitor live videos and remove extremely inappropriate content, such as murder and suicide videos. Zuckerberg said the new staff will help Facebook “get better” at removing content like “hate speech and child exploitation.”
Facebook has issued a statement on the leaks, although it has yet to comment on the authenticity of the details in the documents themselves.
“Keeping people on Facebook safe is the most important thing we do,” Monika Bickert, Facebook’s head of global policy management, said, according to CNET. “In addition to investing in more people, we’re also building better tools to keep our community safe.”
“We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help.”