India: Police in the Indian city of Lucknow have identified nearly 200 harassment hotspots that women visit often and from which most complaints are received. They now plan to install facial recognition technology at these spots to monitor women’s expressions, in the hope of preventing street harassment.
Digital rights experts warned on Friday that this will lead to intrusive policing and privacy violations.
“We will set up five AI-based cameras which will be capable of sending an alert to the nearest police station,” said police commissioner D.K. Thakur, referring to the artificial intelligence-based technology.
“These cameras will become active as soon as the expressions of a woman in distress change,” he told reporters this week, without giving further details on which expressions would trigger an alert.
With plans for nationwide systems to modernise the police force and its information gathering and criminal identification processes, facial recognition technology is increasingly being deployed in airports, railway stations and cafes across India.
Technology analysts and privacy experts are skeptical, though. They say the benefits are unclear, and that the technology could breach people’s privacy or lead to greater surveillance, since there is very little clarity on how it works, how the data is stored, and who can access that data.
“The whole idea that cameras are going to monitor women’s expressions to see if they are in distress is absurd,” said Anushka Jain, an associate counsel at digital rights non-profit Internet Freedom Foundation.
“What is the expression of someone in distress – is it fear, is it anger? I could be talking to my mother on the phone and get angry and make a face – will that trigger an alert and will they send a policeman?”
A more feasible solution would be to increase the number of police patrols, Jain said, adding that the technology is untested and could lead to over-policing and the harassment of women who trigger alerts.
According to official data, Uttar Pradesh, where Lucknow is located, is the least safe Indian state for women, with the highest number of reported crimes against women in 2019.
Police often turn away women who come to register complaints, or fail to take action on them, said Roop Rekha Verma, a women’s rights activist in Lucknow.
“And they want us to believe they will take action by watching our facial expressions,” she said.
Though backlash against facial recognition technology is growing in the United States and in Europe, Indian officials have said it is needed to bolster a severely under-policed country, stop criminal activity and find missing children.
But digital rights activists point out that without a data protection law, facial recognition technology is problematic; moreover, it threatens the right to privacy, which the Supreme Court declared a fundamental right in a landmark 2017 ruling.
“The police are using the technology to solve a problem without considering that this will simply become a new form of surveillance, a new form of exercising power over women,” said Vidushi Marda, a researcher at human rights group Article 19.
“AI is not a silver bullet, and no amount of ‘fancy’ tech can fix societal problems,” she said.