Facebook, Instagram file compliance report on content moderation

On Friday, Facebook followed Google and Koo in filing its first monthly compliance report in India under the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.

The report, which covers the period from May 15 to June 15 across Facebook and Instagram, says that Facebook proactively took action against 25 million pieces of content classified as spam and 2.5 million pieces of violent and graphic content, with a 99.9% proactive rate. The platform also removed 1.8 million pieces of content under its policies on adult nudity and sexual activity, with a 99.6% proactive rate.

The total number of content pieces against which the social media platform took action stood at 5.05 million.

Action taken can range from removing content from the platform to covering disturbing images or videos with a warning.

In its report, Facebook defines the proactive rate as the percentage of content or accounts it flagged on its own, using machine learning technology, before users reported it. “We use this metric as an indicator of how effectively we detect violations,” the report said.
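As a hypothetical illustration (these figures are not from the report): if Facebook took action on 1,000 pieces of spam and 999 of them were flagged by its own systems before any user reported them, the proactive rate for that category would be 999 out of 1,000, or 99.9%.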

In a statement issued to the media, a Facebook spokesperson said, “Over the years, we have consistently invested in technology, people and processes to further our agenda of keeping our users safe and secure online and enable them to express themselves freely on our platform.”

The spokesperson further added, “We use a combination of Artificial Intelligence, reports from our community and review by our teams to identify and review content against our policies. We’ll continue to add more information and build on these efforts towards transparency as we evolve this report.” 

For its group company Instagram, the total number of content pieces against which action was taken stood at 2.03 million. These included 699,000 pieces of content portraying suicide and self-injury and 668,000 pieces of violent and graphic content, as classified under the company’s community guidelines.

The company has said that it will publish a separate report on July 15 with details of complaints received from users and the action taken on them. These reports will be published with a periodicity of 30-45 days.
