Structure of social media grievance committee key to effectiveness, say experts

Photo Credit: Pixabay

The Indian government’s plan to set up a government-appointed committee that can overrule content takedown, blocking or removal decisions made by social media platforms could lead to selective moderation on platforms and curb dissent, tech policy experts warned.

The committee, called the Grievance Appellate Committee (GAC), was proposed on June 6 in amendments to the country’s Information Technology (Intermediary Guidelines and Digital Media Ethics) Rules, 2021 (IT Rules, 2021). The government released a draft of the amendments for public consultation earlier this month and is expected to finalize them next month.

The objective of the grievance committee is to allow users to challenge decisions by social media companies to de-platform them or take down their posts. Currently, they can only appeal to the platforms or take matters to court. 

Experts also warned that such a committee could lead to a substantial increase in review requests and add to the moderation burden. They also said that the committee, if appointed, would need representatives from the platforms themselves and not only government officials. The draft rule says members of the committee will be appointed by the government.

“It cannot be a committee that only has representatives from big tech platforms or only has government-appointed officials. Both are problematic. It has to be a solution that has adequate representation. It also cannot be an administrative process,” said Isha Suri, senior researcher at the Centre for Internet and Society (CIS).

Akash Karmakar, partner at the Law Offices of Panag & Babu, said that the idea of setting up a committee would curb the “skewed power that social media intermediaries have”, but could “open the floodgates for abuse by selectively curbing dissent and gagging criticism against the government.”

“The current composition of the committee does not ensure this so its independence is questionable,” he said.

Platforms like Twitter have faced flak for taking down posts from politicians, including those from the ruling party. Facebook, too, has been accused of favouring the ruling party in its content moderation decisions.

Karmakar said that the committee has a better shot at being non-partisan if it’s appointed by an intermediary, and has independent members with no conflicts of interest.

Further, Trishee Goyal, a research fellow at the Vidhi Centre for Legal Policy, pointed out that since a user will be able to approach this committee over any action taken by a platform, the number of complaints could rise significantly. She said that clarity will be needed on the caseload the committee can take on.

To be sure, Facebook’s Oversight Board is already an example of a committee that functions outside the platform’s regular moderation mechanisms. The Oversight Board takes on cases that have a broad impact on the platform rather than isolated incidents. A government-appointed committee could follow a similar model.

That said, the biggest positive impact of the GAC could be in bringing algorithmic transparency. Platforms like Facebook and Twitter have often been accused of bias because they are unable to explain decisions made by their moderation algorithms, which are based on artificial intelligence (AI). Experts said that the GAC could overturn such decisions, which in turn would push platforms to better align their own policies with the law.

“The decisions of the grievance committee will considerably widen the reference library of the social media intermediaries in understanding what content is unlawful and what is not,” said Goyal. Experts also said that this could be especially important for creators, who often have content taken down algorithmically for copyright infringement even when they haven’t broken any law.

For instance, social media algorithms often take down remixed songs and videos under copyright rules, even though remixes are allowed by law. CIS’ Suri said that “allowing people to review the process these companies follow for content moderation is bound to be more effective than having a set of human beings overwhelmed with a backlog of cases.”

