
AI profiles predict crimes speedily but can discriminate against poorer sections of society


On March 1, more than 40 civil society organisations called on the Council of the European Union (EU), the European Parliament, and all EU member state governments to prohibit predictive and profiling artificial intelligence (AI) systems in law enforcement and criminal justice under the Artificial Intelligence Act (AIA).

They argue that predictive and profiling AI systems in law enforcement disproportionately target the most marginalised in society, infringe on liberty and fair trial rights, and reinforce structural discrimination.

“Age-old discrimination is being hard-wired into new age technologies in the form of predictive and profiling AI systems used by law enforcement and criminal justice authorities. Seeking to predict people’s future behaviour and punish them for it is completely incompatible with the fundamental right to be presumed innocent until proven guilty. The only way to protect people from these harms and other fundamental rights infringements is to prohibit their use,” said Griff Ferris, Legal and Policy Officer at Fair Trials, in a statement.


The use of AI in predictive policing essentially involves advanced algorithms analysing vast amounts of data to draw insights and identify patterns that can help authorities predict and prevent crimes. In the US, for instance, AI-powered predictive policing flags hotspots, areas where crime is more likely, and directs more police officers to them.
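As a rough illustration of how such hotspot mapping can work, the sketch below clusters recent incident coordinates with a density-based algorithm. The coordinates, thresholds, and library choice are assumptions made for the example; it does not describe any deployed policing system.

```python
# Illustrative only: a toy hotspot finder using density-based clustering.
# This is NOT any vendor's or police force's actual system; the incident
# coordinates below are invented.
import numpy as np
from sklearn.cluster import DBSCAN

# Hypothetical (latitude, longitude) pairs of recent incident reports.
incidents = np.array([
    [28.6315, 77.2167], [28.6320, 77.2170], [28.6317, 77.2165],  # dense cluster
    [28.5355, 77.3910], [28.5350, 77.3915],                      # smaller cluster
    [28.7041, 77.1025],                                          # isolated report
])

# eps of ~0.002 degrees is roughly a few hundred metres; min_samples sets how
# many nearby reports are needed before an area counts as a "hotspot".
labels = DBSCAN(eps=0.002, min_samples=2).fit_predict(incidents)

for cluster_id in set(labels) - {-1}:          # label -1 marks noise (no hotspot)
    members = incidents[labels == cluster_id]
    print(f"Hotspot {cluster_id}: centre ~ {members.mean(axis=0)}, {len(members)} reports")
```

In practice, the output of a step like this would only prioritise where officers are sent; it says nothing about any individual, which is part of why critics worry that over-policed areas simply generate more data and get policed further.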

India is not far behind. In 2015, for instance, the Delhi Police partnered with the Indian Space Research Organisation's Advanced Data Processing Research Institute (ISRO-ADRIN) to develop the Crime Mapping, Analytics and Predictive System (CMAPS), a web-based application deployed at Delhi Police Headquarters and accessible via a browser from all police stations and districts of Delhi.

CMAPS generates crime-reporting queries and can identify crime hotspots by auto-sweeping the Dial 100 database every one to three minutes, replacing a Delhi Police crime-mapping tool that relied on manual data gathering every 15 days. It performs trend analysis, compiles crime and criminal profiles, and analyses the behaviour of suspected offenders, all with accompanying graphics. CMAPS also has a security module for VIP threat rating, based on the vulnerability of the potential target and the security deployed, as well as advanced predictive analysis, among other features.
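To make the "auto sweep" idea concrete, here is a minimal sketch of a periodic poll over a local call-log database, assuming an invented table layout and a two-minute interval. None of the names or queries reflect the actual CMAPS or Dial 100 implementation.

```python
# Illustrative only: how a periodic "auto sweep" of an emergency-call log
# might be structured. The database file, table, and columns are invented.
import sqlite3
import time

def sweep_recent_calls(db_path: str, since_minutes: int = 3):
    """Pull calls logged in the last few minutes from a local call-log database."""
    conn = sqlite3.connect(db_path)
    rows = conn.execute(
        "SELECT lat, lon, category FROM calls "
        "WHERE received_at >= datetime('now', ?)",
        (f"-{since_minutes} minutes",),
    ).fetchall()
    conn.close()
    return rows

if __name__ == "__main__":
    while True:
        calls = sweep_recent_calls("dial100.db")
        print(f"Fetched {len(calls)} new calls; refreshing hotspot map...")
        # New coordinates would be appended to the incident set and the
        # clustering step from the earlier sketch re-run here.
        time.sleep(120)  # re-sweep every two minutes, within the 1-3 minute window
```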

Loading...

Likewise, the police in Himachal Pradesh have installed hundreds of CCTV cameras to form a 'CCTV Surveillance Matrix', and similar moves have been made by the state governments of Telangana and Jharkhand.

That said, the findings are not admissible in the courts, at least not yet. “CMAPS basically points the police to where crime is happening, etc. (But) I don’t think the CMAPS data itself can be used in court under the Evidence Act. It helps the police find crime, and then they’ll have to prove that the crime has been committed using traditional non-predictive methods,” explained Arindrajit Basu, Research Lead at the Centre for Internet and Society.

Gurgaon-based Staqu, which built the Punjab Artificial Intelligence System (PAIS) for the Punjab Police, has applied for a tender floated by Lucknow Smart City to deploy a new voice-based surveillance system. For this, the company’s JARVIS AI system will use microphones attached to CCTV cameras to recognise specific sounds, such as a gunshot or a scream, and alert the authorities automatically. The system uses convolutional neural networks (CNNs) to identify different sounds in a scene. CNNs are typically used for image and video recognition, but in this case they are being used to discern patterns in sounds.
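The general pattern described here, converting raw audio into a spectrogram and classifying it with a CNN, can be sketched as follows. This is a generic, hypothetical example; the class labels, model size, and sample rate are assumptions, and it is not Staqu's JARVIS.

```python
# Illustrative only: a generic audio-event classifier (audio -> mel spectrogram
# -> small CNN). All labels, layer sizes, and the 16 kHz rate are assumptions.
import torch
import torch.nn as nn
import torchaudio

LABELS = ["background", "gunshot", "scream"]  # hypothetical event classes

class SoundEventCNN(nn.Module):
    def __init__(self, n_classes: int = len(LABELS)):
        super().__init__()
        # Treat the mel spectrogram as a 1-channel "image" for a small CNN.
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, spectrogram: torch.Tensor) -> torch.Tensor:
        x = self.features(spectrogram)
        return self.classifier(x.flatten(1))

# Turn a one-second clip of raw audio (16 kHz assumed) into a mel spectrogram.
to_mel = torchaudio.transforms.MelSpectrogram(sample_rate=16000, n_mels=64)
clip = torch.randn(1, 16000)          # stand-in for a microphone buffer
spec = to_mel(clip).unsqueeze(0)      # shape: (batch=1, channel=1, mels, time)

model = SoundEventCNN()               # untrained; real systems train on labelled audio
probs = model(spec).softmax(dim=-1)
print(dict(zip(LABELS, probs.squeeze().tolist())))
```

A trained version of such a model would run continuously on the microphone feed and raise an alert whenever the probability of an event class crosses a threshold, which is the behaviour the tender describes.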


Automated systems like these make immense sense for a country like India, which ranks 67th out of 135 countries on the Numbeo Crime Index 2021. Moreover, given the lakhs of criminal offences registered in India, it is not practical to expect police officers to manually examine each case.

In countries with populations as large as India's, surveillance technologies are often seen as the only way for police forces to operate at scale. Systems like these rely on the fact that many of the offenders the police identify are repeat offenders, experts said. The PAIS system, for instance, can flag an individual with a criminal record in a matter of seconds, Atul Rai, the chief executive of Staqu, told Mint earlier.

Speaking to Mint on February 11, Pam Dixon, founder and executive director of the World Privacy Forum — a public interest research group, cautioned that "...these kinds of monitoring systems need to be transparent and should clearly say what words and sounds are being listened for. The policies for these systems need to be in place before they are built and used”.


Fortunately, AI-powered systems are available to the police in India only as an investigative tool and not as proof of a crime. India is yet to get a data protection law, which is why privacy experts reiterate that prohibiting predictive and profiling AI systems in law enforcement and criminal justice makes sense. They add that governments will have to balance the state's security needs with regulations that prevent the misuse of such systems.

To be sure, experts pointed out that the absence of laws regulating such surveillance does not necessarily mean it is legal. Legal jurisprudence holds that the country must have a law regulating technologies such as facial recognition (FR) before they can be applied to policing citizens or other uses.

