Hindutva and Facial Recognition Software

For several months, police forces in India have been trialling facial recognition technology on protesters and citizens across the country. The earliest reported use of such technology in recent months goes back to February, when the Citizenship (Amendment) Act (CAA) sparked a wave of unrest across parts of India.

In New Delhi and in Uttar Pradesh – as well as several other northern states – protests over the CAA have swelled since the beginning of the year. Those protests have taken various forms, including demonstrations that turned violent after aggression from police and private security forces.

Modi’s Hindutva agenda – which has only intensified since he was reëlected last May – has caused many rights groups and activists around the country to fear for the safety of minorities in India. Those fears were borne out when the CAA was passed, and the Rashtriya Swayamsevak Sangh (RSS) picked up steam and continued to gain funding from abroad. Even a visit from Trump – which should have brought with it increased scrutiny of domestic affairs from an international media that often has very little interest in the complexities of Indian politics – only served to bolster the nationalist and quasi-fascist claims of the RSS.

The RSS’s chokehold on freedom of expression in India is a cause for concern for protesters who have opposed the violent crackdown on dissent. In February, the Indian Express revealed that facial recognition technology had been trialled by various police forces on protesters around the country.

Automated Facial Recognition Systems (AFRS) have become an increasing flashpoint in discussions around civil liberties and freedom of expression around the world, and India is no exception. The technology is controversial: it uses artificial intelligence to map the landmarks of a face and compare them against faces already held on the system. Often, databases from police records, voter registration rolls and even images scraped from the internet have been used to train the facial recognition algorithms now in use by private companies and state governments.
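The comparison step described above can be sketched in a few lines. This is a minimal, illustrative example only, assuming faces have already been reduced to fixed-length embedding vectors by some upstream model; the function name, the toy vectors and the distance threshold are all hypothetical and not drawn from any deployed system.

```python
import numpy as np

def match_face(probe, gallery, threshold=0.6):
    """Compare a probe face embedding against a gallery of known embeddings.

    Returns the index of the closest gallery entry if its Euclidean
    distance falls below the threshold, or None (a 'no match') otherwise.
    The threshold is where false positives creep in: set it too loosely
    and unrelated faces are declared a match.
    """
    distances = np.linalg.norm(gallery - probe, axis=1)
    best = int(np.argmin(distances))
    return best if distances[best] < threshold else None

# Toy gallery of three 2-D "embeddings" (real systems use 128+ dimensions).
gallery = np.array([[0.0, 0.0], [1.0, 1.0], [5.0, 5.0]])

match_face(np.array([0.9, 1.1]), gallery)    # → 1 (close to the second entry)
match_face(np.array([10.0, 10.0]), gallery)  # → None (nothing within threshold)
```

The choice of threshold is a policy decision as much as a technical one: it trades missed identifications against false accusations, which matters when a match can lead to arrest.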

In east Delhi, supporters of the BJP instigated violence and turned a peaceful protest into a riot on 23 February, which led to the deaths of fifty-three people, with thousands more injured and four mosques burned down. Soon after, the Home Minister Amit Shah said that the police had identified over 1,000 people (with only 25 computers) using facial recognition software, whose use had never before been publicly disclosed. These people were labelled as perpetrators of violence and instigators of the riots – none of them BJP supporters. A few days later, the Indian government announced plans to use the world’s largest facial recognition system in coordination with police forces as a way of further suppressing dissent.

Speaking to the press, Shah said that the footage used to track down individuals combined CCTV with driving licence and voter ID details. He also went on to say that “the software does not recognise religion”.

Speaking to The Hindu, a police officer said that many of the people they had identified in Uttar Pradesh already had criminal records, and that the use of this facial recognition software only served as further confirmation that they were responsible for the violence there.

But there are precious few details about the facial recognition companies supplying this nationwide system, and little to no prospect of accountability. Current facial recognition capabilities are often nowhere near as accurate as claimed, which is particularly concerning when the consequences of error could include a lengthy jail sentence or worse.

In other countries, law enforcement agencies using facial recognition technology on crowds have not had a high success rate. The FBI’s facial recognition system failed to identify the Tsarnaev brothers, who were responsible for the 2013 Boston Marathon bombings, despite the fact that both already had criminal records. Within the facial recognition sector, researchers and academics have also long pointed out that machine learning algorithms tend to discriminate against people with darker skin and frequently produce false positives.

In India, while privacy is considered a ‘fundamental right’ by the Supreme Court, protections for personal data and digital privacy remain sparse. One of the concerns with the use of this facial recognition system is that protesters – and indeed anyone whom police forces capture as a result of deploying it – will have no idea how their information is being processed, and no way of appealing against a false positive.

India has also served as a testing ground for facial recognition by police forces in other ways, such as in well-publicised cases where it was used to reunite missing children with their families. These cases were covered by international media, and the Indian government made several assurances that facial recognition systems would only be used for specific purposes. But those assurances were contravened by the events of February this year.

Soon after the announcement of the rollout of India’s facial recognition system, the coronavirus pandemic began to take hold across India and the rest of the world. The scrutiny of India’s facial recognition system that might otherwise have followed did not materialise, and it remains unclear how many stakeholders are involved in the implementation of this system across the country. In mid-March, the Indian government went ahead and approved the use of AFRS, with the expectation that it will be used to support law enforcement operations across the country.

Speaking to Reuters, a protester in Delhi said that she had started to wear a mask and cover her face every time she went to a protest. As masks become more prevalent around the world, facial recognition systems may become obsolete, or end up nowhere near as useful as governments and operators thought they were. But the prospect of a society where AFRS is rolled out is one that will further marginalise the most vulnerable – in India, it will contribute to violence against Muslims and those who dare to protest in the coming months.

Sanjana Varghese is a journalist and researcher based in London.
