Social media platform Koo proactively moderated 54,235 pieces of content between June 1 and June 30, 2021, according to its first compliance report.
All major social media intermediaries are mandated to file a monthly report of complaints received and action taken, as well as content removed through automated monitoring, under the Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021.
However, Koo did not respond to a query by TechCircle on whether it had used automation for proactive moderation of content. The platform removed 1,996 of the posts, or “Koos”, while other action was taken against the remaining content identified for moderation.
On Wednesday, Google filed its first compliance report for India for the month of April 2021, and Facebook said it will publish compliance reports for its platform and its group company WhatsApp by July 15.
A statement issued by Koo said that the platform received 5,502 complaints from its community, of which 22.7% of the reported content was removed, while other actions, including overlaying or blurring images, ignoring the information, or appending warning tags, were taken on the rest. However, the platform did not specify how many complaints were received for languages other than English.
“As Koo gains traction across India, we will ensure that Koo respects the law of the land and meets the requirements, enabling every country to define its own digital ecosystem. This Compliance Report is one step in that direction,” said Aprameya Radhakrishna, CEO of Koo, in the statement.
As part of the intermediary guidelines that came into effect in May, all social media platforms with a registered user base of over 50 lakh are mandated to file a monthly compliance report. Launched in 2020 by Radhakrishna and Mayank Bidawtka, Tiger Global-backed Koo claims to have 60 lakh users on its platform.