
On Wednesday, Facebook released the fourth edition of its Community Standards Enforcement Report, detailing how the company moderates content on its main app and on Instagram. For the first time, Facebook included Instagram in the report.

The report covers ten policy areas and highlights the removal of a wide variety of content from both platforms, including child nudity and abuse, adult nudity and sexual activity, bullying and harassment, hate speech, fake accounts, regulated goods (drugs and firearms), suicide and self-injury, violent and graphic content, and terrorist propaganda.

From Facebook alone, 11.6 million pieces of content related to child nudity and child sexual abuse were taken down, while over 754,000 pieces of a similar nature were removed from Instagram during the period covered by the report.

Another point highlighted in the report was the growing number of fake accounts on the social network.

Facebook announced that it had removed over 3.2 billion fake accounts between April and September, along with taking action on 11.4 million hate speech posts in the same period.

The social media giant added that it had removed around 5.4 billion fake accounts and 15.5 million hate speech posts in total since January.

“Over the past two quarters, we have improved our ability to detect and block attempts to create fake, abusive accounts. We can estimate that every day, we prevent millions of attempts to create fake accounts using these detection systems,” Facebook said on Wednesday. 

To help educate its users, Facebook also published a new enforcement page that shows examples of “how our Community Standards apply to different types of content” and where the company draws the line.
