
Social media governance needs a tailor-made approach in India

In an increasingly interconnected world driven by social platforms, the need for a cleaner, safer, and more reliable online space cannot be overstated. Social media dominates lives and lifestyles in an unprecedented manner. From engaging with friends and family and keeping up with news and events to expressing ourselves on topics of interest, social media caters to a wide range of needs in today's world.

The scale at which social media has been adopted globally is striking. As of January 2022, there were 4.62 billion social media users worldwide, almost 58.4% of the global population. India alone has over 448 million social media users, who spend an average of 2.25 hours each day on various platforms. Platforms are flooded with enormous volumes of user-generated content (UGC) every minute, and this UGC spans videos, text, memes, GIFs, audio, and more.

This explosion in content from an ever-growing network of users necessitates a framework of enhanced trust and safety to enable and protect those who consume it. Even though social platforms connect people and generate significant value, they can be misused for online abuse, bullying, misinformation, hate speech, profanity, and obscenity, all of which severely hamper trust and safety.

To boost user confidence and strengthen online safety, content moderation, done proactively and in accordance with the law of the land, can benefit users and platforms alike.

Tailor-made and Neutral 

Content moderation in India calls for a tailor-made approach, one that recognizes the country's distinct socio-cultural nuances. With 22 official languages and 6,000+ dialects, India is a mosaic of linguistic pluralism. The standardized algorithms of global tech giants can be incompatible with these local flavors, so a customized content moderation approach that mirrors Indian society is needed.

Of late, India has seen the rise of indigenous platforms that enable local-language expression online. Given the variety of languages in India, content must also be moderated in Indic languages.

The mechanism to moderate content needs to be neutral and unbiased, and should stand independent of the stated policies of a platform. India has a rich history of published judicial pronouncements that forms the backbone of its legal system. This body of precedent can be leveraged to create India-specific content moderation policies in which user safety in the local context and the sovereignty of the country are given paramount importance.

Synergy of Humans and Machines 

Once a structured framework is designed, platforms are best positioned to combine human talent with the power of AI/ML to put it into practice. Humans and machines each bring their own strengths and limitations.

Skilled moderators, who keep a platform clean, are exposed to huge volumes of harmful content, which can affect them emotionally and psychologically. This exposure can be curbed by using machines. However, machines cannot accurately decode certain region-specific or language-specific nuances that carry different meanings in different contexts. For example, the Hindi word 'sala' could be derogatory, or could simply mean 'brother-in-law', depending on the context.

Thus, excessive reliance on machines can lead to needless deletions and takedowns, which creators must then appeal to have reinstated. A balance of human intervention and machines is the way forward, where each can learn and evolve in a collective effort to strengthen transparency.
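To make this concrete, here is a minimal sketch of how such a hybrid pipeline might route posts. The classifier score, the thresholds, and the list of context-sensitive terms are illustrative assumptions, not any platform's actual system:

```python
# Minimal sketch of a hybrid human/machine moderation queue.
# The thresholds and the context-sensitive word list are
# illustrative assumptions, not a real platform's configuration.

AUTO_REMOVE_THRESHOLD = 0.95   # machine is confident: act automatically
HUMAN_REVIEW_THRESHOLD = 0.60  # machine is unsure: escalate to a person

# Words whose meaning flips with context; never auto-remove on these alone.
CONTEXT_SENSITIVE_TERMS = {"sala"}

def route_post(text: str, toxicity_score: float) -> str:
    """Decide what happens to a post, given a model's toxicity score (0-1)."""
    words = set(text.lower().split())
    if words & CONTEXT_SENSITIVE_TERMS:
        # Region- or language-specific terms always get human judgment.
        return "human_review"
    if toxicity_score >= AUTO_REMOVE_THRESHOLD:
        return "auto_remove"
    if toxicity_score >= HUMAN_REVIEW_THRESHOLD:
        return "human_review"
    return "publish"

if __name__ == "__main__":
    print(route_post("clearly abusive message", 0.98))  # auto_remove
    print(route_post("arre sala, kya baat hai", 0.70))  # human_review
    print(route_post("good morning everyone", 0.05))    # publish
```

The key design choice is that context-dependent words like 'sala' are never acted on by the machine alone; they always reach a human reviewer, while the machine handles the clear-cut cases at either end of the confidence scale.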

User Awareness  

Apart from content moderation, initiatives that enhance user awareness of safe practices and responsible behavior are the need of the hour. Many users, especially on multilingual platforms, are experiencing social media for the first time: the English-centric design of global platforms had long limited the participation of native-language speakers. Platforms have a role to play in educating users on their responsibilities as stakeholders.

For instance, community guidelines that list which content is permissible or prohibited should be freely available to users in multiple languages and should be country-specific. Similarly, platforms should implement a mechanism that rewards users for flagging fake content while penalizing those who falsely label content as 'fake'.
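One way such a mechanism could work is a simple reputation score per flagger, adjusted once a moderator confirms or rejects each flag. The point values, field names, and trust threshold below are assumptions for illustration, not an existing scheme:

```python
# Illustrative sketch of a flagger-reputation mechanism; the point
# values and threshold are hypothetical, chosen only for demonstration.

from dataclasses import dataclass

@dataclass
class FlaggerReputation:
    score: int = 0          # accumulated trust for this user
    correct_flags: int = 0
    false_flags: int = 0

    def record_verdict(self, flag_was_correct: bool) -> None:
        """Update reputation once a moderator confirms or rejects a flag."""
        if flag_was_correct:
            self.correct_flags += 1
            self.score += 10   # reward accurate reporting
        else:
            self.false_flags += 1
            self.score -= 25   # penalize false 'fake' labels more heavily

    @property
    def trusted(self) -> bool:
        """Trusted flaggers' reports could be fast-tracked for review."""
        return self.score >= 50

rep = FlaggerReputation()
rep.record_verdict(flag_was_correct=True)
rep.record_verdict(flag_was_correct=False)
print(rep.score, rep.trusted)  # -15 False
```

Weighting a false flag more heavily than the reward for a correct one discourages weaponized reporting, while the trust threshold lets a platform fast-track reports from consistently accurate users.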

I believe content moderation, together with user education, can strengthen the social media ecosystem, spur innovation, and propel the connected world to new levels.

Rajneesh Jaswal


Rajneesh Jaswal heads the Legal and Policy teams, the content moderation practice, and compliance efforts at Koo India.

