Facebook will stop using facial recognition, but Meta won’t

5 Nov, 2021

Facebook is almost fully abandoning facial recognition, but its parent company Meta isn’t. On November 2, the world’s largest social media network said it’s going to stop using facial recognition technology (FRT) systems on its platform and delete facial recognition templates for billions of people. However, Meta spokesperson Jason Grosse told Recode that the move doesn’t apply to its upcoming metaverse products.

The social media firm rebranded to Meta on October 29 when chief executive Mark Zuckerberg announced that the company is shifting its focus to building a future metaverse. “The next platform will be even more immersive — an embodied internet where you’re in the experience, not just looking at it. We call this the metaverse, and it will touch every product we build,” Zuckerberg said in a letter following the Facebook Connect event.

According to the November 2 announcement, Facebook will stop using the algorithm that allows the platform to automatically recognize when people appear in photos and suggest tags. The same algorithm, called DeepFace, will still be used for metaverse products. At Facebook Connect, the company showed examples of how it plans to build digital avatars of users, which it intends to make as realistic as possible.

Echoing Facebook’s announcement last week, Grosse told Recode that the company believes in FRT’s usefulness and will continue to explore how it can be used. “For any potential future applications of technologies like this, we’ll continue to be public about intended use, how people can have control over these systems and their personal data, and how we’re living up to our responsible innovation framework,” he added.

The social media company isn’t the first to rethink its strategy on facial recognition. E-commerce giant Amazon and IBM have taken similar stances.

Amazon announced a one-year moratorium on police use of its FRT system, called Rekognition, last year, and extended that moratorium indefinitely in May this year. IBM, on the other hand, has abandoned the facial recognition business altogether.

“IBM no longer offers general purpose IBM facial recognition or analysis software. IBM firmly opposes and will not condone uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms, or any purpose which is not consistent with our values and Principles of Trust and Transparency,” Arvind Krishna, chief executive of IBM, wrote in a letter to the US Congress on June 8, 2020.

It’s also worth noting that Meta’s back-and-forth on FRT isn’t unique. While Amazon said it would stop selling Rekognition to police forces, its statement didn’t clarify whether this applied to all branches of government, and the company continues to sell the platform to private companies.

Many experts, researchers and academics have said that facial recognition systems have built-in racial biases and are often inaccurate. A 2019 study by the National Institute of Standards and Technology (NIST), which found that some algorithms were up to 100 times better at identifying white faces than faces of colour, is often cited as proof of that bias.

Both Amazon’s and IBM’s announcements last year came after the death of George Floyd, an African-American man whose killing raised questions about police reform and racial bias.

FRT has also been used in India, where police forces including the Punjab Police and the Delhi Police deploy it. In 2019, the Delhi Police used FRT to identify “habitual protestors” and “rabble-rousers and miscreants” when widespread protests broke out in the country’s capital against the government’s Citizenship Amendment Act (CAA).

In March 2020, Minister of Home Affairs Amit Shah told Parliament that the government had identified 1,100 people using FRT during the Delhi riots, which broke out amid the protests against the CAA.