Meta brings initiative to help women remove non-consensual images; expands Safety Hub

3 Dec, 2021

Meta (formerly Facebook) has joined a women's safety initiative that seeks to flag and automatically remove non-consensual intimate images (NCII) from its platforms.

Karuna Nain, director of global safety policy at Meta, announced that the platform is now a part of StopNCII.org, an international service operated by the UK-based Revenge Porn Helpline.

The tool allows women to create a case for images that they feel violate their privacy and were published without their consent.

Once a case is created, StopNCII.org generates anonymised hashes, or unique digital fingerprints, of the image being flagged by the user.

Based on these hashes, the tool scans for matches across partner platforms (of which Meta is now one) and, if a match is found, automatically removes the image from public access. StopNCII.org claims a 90 percent removal rate so far on cases created through it by women, and says it has removed over 2 lakh (200,000) individual NCIIs since 2015.
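The matching flow described above can be sketched in a few lines. This is a simplified illustration, not StopNCII.org's actual code: it uses a cryptographic hash (SHA-256), which only matches byte-identical files, whereas production systems typically use perceptual hashing so that resized or re-encoded copies still match. All names and data here are hypothetical.

```python
import hashlib

def hash_image(image_bytes: bytes) -> str:
    # Simplified stand-in for the anonymised hash: only this
    # fingerprint is shared with partner platforms, never the image.
    return hashlib.sha256(image_bytes).hexdigest()

# Hypothetical hash list received from the case database
flagged_hashes = {hash_image(b"reported-image-bytes")}

def should_remove(uploaded: bytes) -> bool:
    # Platform-side check: hash the upload and compare it against
    # the shared list of flagged fingerprints.
    return hash_image(uploaded) in flagged_hashes

print(should_remove(b"reported-image-bytes"))   # True
print(should_remove(b"unrelated-image-bytes"))  # False
```

Because only hashes are exchanged, a platform can detect a flagged image at upload time without the victim's image ever leaving their device.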

Nain further said that to promote the initiative and strengthen women's safety on the platform, the company has partnered with organisations such as Social Media Matters, the Centre for Social Research, and the Red Dot Foundation in India.

The initiative comes after Meta announced on November 10, 2021, that it had identified about 15 in every 10,000 pieces of content on the platform as related to bullying and harassment.

It also claimed to have ‘proactively’ removed almost 60 percent of such content. 

Meta also expanded its Women's Safety Hub in India, making it available in 12 Indian languages. The move allows non-English-speaking women on platforms such as Facebook and Instagram to find support resources by searching in the local languages they are most familiar with.


Meta also announced Bishakha Datta, executive editor of Point of View, and Jyoti Vadehra, head of communications at the Centre for Social Research, as the first Indian members of its Global Women's Safety Expert Advisors Group. The group is a 14-member panel engaged in shaping policies and products that better support women.