Facebook's first transparency report shows that the majority of offending content was removed before being reported



Facebook released its first-ever transparency report, detailing the amount of content it identified as violating its Community Standards between October 2017 and March 2018. According to the data, Facebook took action against the majority of offending content before users reported it.

The report is part of the Community Standards initiative first announced in April. Violations are divided into six categories: graphic violence, nudity and sexual activity, terrorist propaganda (ISIS, al-Qaeda and their affiliates), hate speech, spam, and fake accounts.

Facebook says it uses a combination of machine-learning automation and human reviewers to identify content that violates its Community Standards. The company has said several times that it plans to hire at least 10,000 safety and security professionals by the end of 2018 to work on this initiative.

Facebook's transparency report breaks down, for each category, the number of content violations it took action on during the fourth quarter of 2017 and the first quarter of 2018, along with the share of that content identified before users reported it. It also lists the prevalence of violating content in the graphic violence and the nudity and sexual activity categories, as well as the prevalence of fake accounts. Here is an overview of the data:

How much content, or how many Facebook accounts, did it take action on?

  • Graphic violence – Q4 2017: 1.2 million | Q1 2018: 3.4 million
  • Nudity and sexual activity – Q4 2017: 21 million | Q1 2018: 21 million
  • Terrorist propaganda (ISIS, al-Qaeda and their affiliates) – Q4 2017: 1.1 million | Q1 2018: 1.9 million
  • Hate speech – Q4 2017: 1.6 million | Q1 2018: 2.5 million
  • Spam – Q4 2017: 727 million | Q1 2018: 837 million
  • Fake accounts – Q4 2017: 694 million | Q1 2018: 583 million

Percentage identified before users reported the content or accounts

  • Graphic violence – Q4 2017: 72% | Q1 2018: 86%
  • Nudity and sexual activity – Q4 2017: 94% | Q1 2018: 96%
  • Terrorist propaganda (ISIS, al-Qaeda and their affiliates) – Q4 2017: 97% | Q1 2018: 99.5%
  • Hate speech – Q4 2017: 24% | Q1 2018: 38%
  • Spam – Q4 2017: 100% | Q1 2018: 100%
  • Fake accounts – Q4 2017: 98.5% | Q1 2018: 99.1%

Prevalence of content in violation of Facebook's community standards

  • Graphic Violence – Q4 2017: 0.16% to 0.19% | Q1 2018: 0.22% to 0.27%
  • Nudity and sexual activity – Q4 2017: 0.06% to 0.08% | Q1 2018: 0.07% to 0.09%
  • Terrorist propaganda (ISIS, al-Qaeda and their affiliates) – Data unavailable
  • Hate speech – Data unavailable
  • Spam – Data unavailable
  • Fake accounts – Facebook estimates that fake accounts represented about 3 to 4% of monthly active users (MAU) on Facebook in both the fourth quarter of 2017 and the first quarter of 2018.
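Facebook has described prevalence as an estimate of how often violating content is actually viewed, so the ranges above translate directly into views per 10,000. Here is a minimal Python sketch of that conversion using the Q1 2018 figures above (the dictionary and variable names are my own illustration, not Facebook's methodology):

```python
# Illustrative conversion of the prevalence ranges above into
# "views per 10,000" terms. Assumption: prevalence is the estimated
# share of all content views that contained violating material,
# which is how Facebook has described the metric.
prevalence_q1_2018 = {
    "graphic violence": (0.0022, 0.0027),            # 0.22% to 0.27%
    "nudity and sexual activity": (0.0007, 0.0009),  # 0.07% to 0.09%
}

for category, (low, high) in prevalence_q1_2018.items():
    print(f"{category}: roughly {low * 10_000:.0f} to {high * 10_000:.0f} "
          f"of every 10,000 views")
```

On those estimates, roughly 22 to 27 of every 10,000 content views in Q1 2018 contained graphic violence.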

In every category except hate speech, Facebook took action against most of the offending content before users reported it, and in almost all of those categories more than 90% of the content was removed without being reported. Hate speech was the clear outlier: in both quarters, less than 40% of the content identified as hate speech (38% in the first quarter of 2018 and 24% in the fourth quarter of 2017) was acted on before being reported. In other words, well over half of the hate speech violations identified on the platform were reported by users rather than caught by Facebook's own systems.
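As a quick check on that arithmetic, here is a minimal Python sketch that derives the implied user-reported share in each category from the Q1 2018 figures above (the data structure and names are mine, assembled from the report's published numbers, not Facebook's own code):

```python
# Q1 2018 figures from the lists above: total content/accounts actioned
# and the share Facebook flagged proactively (before any user report).
actioned_q1_2018 = {
    "graphic violence": (3.4e6, 0.86),
    "nudity and sexual activity": (21e6, 0.96),
    "terrorist propaganda": (1.9e6, 0.995),
    "hate speech": (2.5e6, 0.38),
    "spam": (837e6, 1.00),
    "fake accounts": (583e6, 0.991),
}

for category, (total, proactive_rate) in actioned_q1_2018.items():
    # Whatever was not flagged proactively had to come from user reports.
    user_share = 1 - proactive_rate
    print(f"{category}: ~{total * user_share:,.0f} items "
          f"({user_share:.0%}) surfaced via user reports")
```

For hate speech this works out to roughly 1.55 million of the 2.5 million items actioned in Q1 2018, or about 62%, arriving via user reports.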

Facebook notes at the beginning of the report that it is still refining the internal methodology it uses to measure these efforts and expects the numbers to become more accurate over time.


About the author


Amy Gesenhues is the general assignment reporter for Third Door Media, covering the latest news and updates for Marketing Land and Search Engine Land. From 2009 to 2012, she was an award-winning syndicated columnist for a number of daily newspapers from New York to Texas. With more than ten years of marketing management experience, she has contributed to a variety of traditional and online publications, including MarketingProfs.com, SoftwareCEO.com, and Sales and Marketing Management Magazine. Read more articles from Amy.

