The “Facebook Files” story overview

One of today’s trending stories puts the spotlight on Facebook’s methodology for handling content related to sex, terrorism and violence. The story focuses on a recent article from The Guardian titled “Facebook moderators: a quick guide to their job and its challenges”, part of the “Facebook Files” series, which relies on leaked information about the secret rules and guidelines used by Facebook’s moderators: “more than 100 internal training manuals, spreadsheets and flowcharts that give unprecedented insight into the blueprints Facebook has used to moderate issues such as violence, hate speech, terrorism, pornography, racism and self-harm”.

TrustServista is currently analyzing this story, revealing how it began and which elements other publications focus on.

The centerpiece of this story is The Guardian article mentioned above (published on 21 May 2017), which is heavily referenced by other publications picking up the information: The Telegraph, the New York Post, the BBC, the NY Daily News, The Independent and Russia Today.

Currently, most of the recent stories focus on how Facebook allows videos depicting self-harm, violent death or abortion. The New York Post offers an insightful piece on this angle, citing the article from The Guardian as Patient Zero, which TrustServista determined with precision despite the high number of unrelated referenced articles.

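For readers unfamiliar with the “Patient Zero” concept used above, the sketch below illustrates, in a very simplified way, how the origin of a story could be traced: follow the reference links between articles and keep the earliest publication found. This is only an illustrative assumption, not TrustServista’s actual algorithm; the article URLs, dates and the patient_zero helper are hypothetical.

```python
from datetime import datetime

# Hypothetical, simplified data: each article lists the URLs it references
# and its publication date. "Patient Zero" is taken here to be the earliest
# article reachable by following reference links from a starting article.
articles = {
    "nypost.com/facebook-moderation": {
        "published": datetime(2017, 5, 22),
        "references": ["theguardian.com/facebook-files"],
    },
    "theguardian.com/facebook-files": {
        "published": datetime(2017, 5, 21),
        "references": [],
    },
}

def patient_zero(start_url: str) -> str:
    """Walk the reference graph from start_url and return the earliest source found."""
    seen, stack = set(), [start_url]
    earliest = start_url
    while stack:
        url = stack.pop()
        if url in seen or url not in articles:
            continue
        seen.add(url)
        if articles[url]["published"] < articles[earliest]["published"]:
            earliest = url
        stack.extend(articles[url]["references"])
    return earliest

print(patient_zero("nypost.com/facebook-moderation"))
# -> theguardian.com/facebook-files
```
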
Other related stories have merged with this one, such as “Facebook misses Thai deadline to remove critical content” (BBC) and “EU fines Facebook 110 million euros over misleading WhatsApp data” (Reuters).

What is interesting in this story is that one Russia Today article analyzed here has a Patient Zero dating back 18 days (https://twitter.com/RTUKnews/status/860232569224298496), which shows Facebook’s intent to tackle rape and murder videos, with Facebook CEO Mark Zuckerberg declaring that self-harm videos are also a concern for the social media platform:

“Over the last few weeks, we’ve seen people hurting themselves and others on Facebook – either live or in video posted later […] If we’re going to build a safe community, we need to respond quickly.  Over the next year, we’ll be adding 3,000 people to our community operations team around the world — on top of the 4,500 we have today — to review the millions of reports we get every week, and improve the process for doing it quickly.”