Facebook Moderation Guidelines Leaked, Show How It Reviews Hate Speech, Extremist Content

Highlights
  • Challenges such as revenge porn have overwhelmed Facebook moderators
  • It reviews more than 6.5 million reports of potentially fake accounts a week
  • It confirmed that it was using software to intercept graphic content

Leaked Facebook documents show how the social media company moderates issues such as hate speech, terrorism, pornography and self-harm on its platform, the Guardian reported, citing internal guidelines seen by the newspaper.

New challenges such as "revenge porn" have overwhelmed Facebook's moderators who often have just ten seconds to make a decision, the Guardian said. The social media company reviews more than 6.5 million reports of potentially fake accounts a week, the newspaper added.

Many of the company's content moderators have concerns about the inconsistency and peculiar nature of some of the policies. Those on sexual content, for example, are said to be the most complex and confusing, the Guardian said.

Facebook had no specific comment on the report but said safety was its overriding concern.


"Keeping people on Facebook safe is the most important thing we do. We work hard to make Facebook as safe as possible while enabling free speech. This requires a lot of thought into detailed and often difficult questions, and getting it right is something we take very seriously", Facebook's Head of Global Policy Management Monica Bickert said in a statement.

Facebook confirmed that it was using software to intercept graphic content before it was posted to the site, but said the technology was still in its early stages.

The leaked documents included internal training manuals, spreadsheets and flowcharts, the Guardian said.

The newspaper gave the example of a Facebook policy that allowed people to live-stream attempts at self-harm because the company "doesn't want to censor or punish people in distress."

Facebook moderators were recently told to "escalate" to senior managers any content related to "13 Reasons Why," the Netflix original drama series based on the suicide of a high school student, because the company feared it could inspire copycat behavior, the Guardian reported.

Reuters could not independently verify the authenticity of the documents published on the Guardian website.

© Thomson Reuters 2017
