As Pegasus Snoop Is Revealed, India Allows Surveillance State With No Legal Safeguards

20 Jul 2021

India’s largest power company wants to deploy facial-recognition systems to monitor thousands of workers: 64 such systems have been installed by 61 government authorities over the last three years in a country with no data-protection law, potentially violating fundamental rights to privacy and free expression and a Supreme Court judgement.


Bengaluru: As India’s government uses Israeli technology to illegally monitor cellphones, other modes of surveillance, such as facial-recognition systems, are spreading without a protective law and parliamentary oversight.


A move by India's largest electricity company to buy facial-recognition systems that can identify and monitor more than 60,000 people (the company has about 18,000 employees) at its facilities nationwide is the latest attempt at electronic surveillance by a government-run organisation, in violation of fundamental rights previously affirmed by the Supreme Court.


If fulfilled, two tenders issued by the ministry of power on 15 March and 28 May 2021 on behalf of the National Thermal Power Corporation (NTPC) to buy 13 facial-recognition systems could result in an illegal breach of the fundamental right to privacy and free speech, said experts.


Over the last three years, 64 facial-recognition systems costing around Rs 1,200 crore have been installed by 61 government authorities, such as the civil aviation ministry, home affairs, railways, various state-level police  and the Telangana State Election Commission, according to a facial-recognition-system tracker maintained by the Panoptic Project, which is run by the Internet Freedom Foundation (IFF), an advocacy group. 

The Constitution of India guarantees the right to privacy as part of the right to life and personal liberty under Article 21. The right to freedom of speech and expression is protected under Article 19 of the Constitution. 


In a February 2021 representation to a parliamentary standing committee on information technology, the IFF noted how facial-recognition can lead to a “chilling effect” on the right to free speech because people could be wary of being prosecuted if they expressed anti-government sentiment.


Many global companies use technology to track not just workers but critics. Amazon, for instance, has been accused of spying on labour and environment groups and has a patent on a wristband that monitors worker productivity. Facial-recognition systems exist in a “legal grey area” globally, said this April 2020 paper.  


In the US, for instance, a privacy law exists, but it applies only to data held by federal agencies, not private companies, such as Amazon. In 2019, San Francisco was the first US city to ban the use of facial recognition, although 59% of Americans support the use of such technology. 


Only three governments, Belgium, Morocco and Luxembourg, ban the use of facial-recognition technology, while 98 countries have adopted it.  In April 2021, the European Union’s privacy watchdog, the European Data Protection Supervisor, recommended a ban on facial-recognition technology because of its “deep and non-democratic intrusion” into private lives.


Cascading Effect Of NTPC’s Tenders


With an installed capacity of over 66,000 megawatts (MW), NTPC is India’s largest power company with 72 coal, gas, hydro, solar and wind power installations nationwide. If it deploys surveillance technology successfully, it could become a model for other companies, said experts. 


“Facial recognition technology is invasive. Being able to read a person’s face is intrusive, not just of privacy but the person,” said Usha Ramanathan, an independent law researcher who specialises in technology-related issues and opposes Aadhaar, India’s national identification database. “Management wants to monitor and control labour. And now this. Does this mean that when labourers enter factories, they lose their fundamental rights?”


NTPC tenders for facial-recognition technology say it will be used to monitor attendance at company facilities in Ranchi, Jharkhand and Raigarh, Chhattisgarh. The tender for Raigarh seeks 10 such systems with a face-storage capacity of 3,000 each and the one for Ranchi seeks three systems with a face-storage capacity of 10,000 each.


“The moment you allow [facial recognition technology] in one place, it becomes a model to follow. After a big company like NTPC deploys this technology, other public sector units can follow. It has a cascading effect,” said Ramanathan. “This is how [the usage of] fingerprint scanners for biometric attendance started.”


A power-sector official concurred with Ramanathan’s assessment that NTPC could be a model for other companies.


“NTPC has offices even in the remotest of locations. The usage of this technology can quickly become widespread,” said the power-sector official, speaking on condition of anonymity since he is not authorised to speak to the media. 


Article 14 sought comment on privacy and surveillance concerns from India’s minister for power Raj Kumar Singh, his secretary Alok Kumar and the NTPC. We will update this copy if they respond.  


The introduction of biometrics to monitor attendance in government offices using Aadhaar in 2014 was the first officially sanctioned breach of privacy, according to some experts.


“What we are seeing now is a move from biometrics to facial recognition,” said Srinivas Kodali, a researcher at the Free Software Movement of India, a national coalition of regional free-software movements. Kodali referred to the Aadhaar-enabled attendance monitoring system run by the union government’s department of personnel and training.


“It is a privacy issue because it can result in a constant monitoring of work environments,” said Kodali. “Employees should be concerned about how this data can be used to retain them in a job or fire them.” 


Apart from the uncertain legal basis for facial-recognition systems, their accuracy has often been called into question.


Inaccuracies And Misidentification


“The most basic problem is that the technology is not accurate,” said Anushka Jain, associate counsel (surveillance and transparency) at the IFF.


Facial-recognition technology uses algorithms—a set of rules that helps computers make decisions—to extract data points from a person’s face to create a digital signature of sorts. The computer then compares this signature with a database for verification. 
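As a rough illustration of the process described above (this is a hypothetical sketch, not code from any system mentioned in this article), a facial-recognition pipeline typically reduces each face image to a fixed-length numeric “embedding” and then compares embeddings by similarity; the `embed()` function here is a stand-in for a real trained model:

```python
import numpy as np

def embed(face_image: np.ndarray) -> np.ndarray:
    """Placeholder for a trained model that maps a face image to a
    fixed-length feature vector (the 'digital signature')."""
    # Deterministic stand-in: seed a random vector from the image contents.
    rng = np.random.default_rng(int(face_image.sum()) % 2**32)
    v = rng.normal(size=128)
    return v / np.linalg.norm(v)  # unit-length embedding

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity of two embeddings; 1.0 means identical direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify(probe: np.ndarray, enrolled: dict, threshold: float = 0.6):
    """Compare a probe embedding against an enrolled database and return
    the best-matching identity, or None if no score clears the threshold."""
    best_id, best_score = None, -1.0
    for person_id, template in enrolled.items():
        score = cosine_similarity(probe, template)
        if score > best_score:
            best_id, best_score = person_id, score
    return (best_id, best_score) if best_score >= threshold else (None, best_score)
```

The threshold is the key tuning knob: set too low, the system matches strangers; set too high, it rejects the very people it enrolled.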


Facial-recognition systems have reported near-perfect identification, but this is only when all conditions are right.  


“...this degree of accuracy is only possible in ideal conditions where there is consistency in lighting and positioning, and where the facial features of the subjects are clear and unobscured,” said this 2020 blog at the Center for Strategic and International Studies. “In real world deployments, accuracy rates tend to be far lower.” 


Accuracy drops if subjects do not look directly at the camera or there is interference from background or shadow. A 2018 US study found error rates as high as 34.7% when identifying women with darker skin tones; the corresponding error rate for lighter-skinned men was around 0.8%.


Sukhnidh Kaur, a digital-literacy fellow at the IFF, said facial-recognition inaccuracies could lead to “false positives and negatives”—identifying a subject as someone else or failing to identify a subject—as well as racially and gender-biased errors in criminal cases, and reduced access to resources in cases where identity authentication is used.
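The trade-off between the two error types can be seen in a small sketch (the scores below are invented for illustration): a single match threshold decides whether a similarity score counts as a “match”, and moving that threshold trades false positives against false negatives.

```python
def classify(score: float, threshold: float) -> bool:
    """True means the system declares a match."""
    return score >= threshold

# Hypothetical match scores: (similarity score, is actually the same person).
observations = [
    (0.92, True), (0.55, True),   # genuine pairs, one degraded by poor lighting
    (0.70, False), (0.20, False), # impostor pairs, one a look-alike
]

def error_counts(threshold: float):
    """Count false positives (impostor accepted) and
    false negatives (genuine user rejected) at a given threshold."""
    fp = sum(1 for s, same in observations if classify(s, threshold) and not same)
    fn = sum(1 for s, same in observations if not classify(s, threshold) and same)
    return fp, fn

print(error_counts(0.8))  # strict:  (0, 1) - no impostor accepted, one genuine user rejected
print(error_counts(0.5))  # lenient: (1, 0) - everyone accepted, including the look-alike
```

Neither setting eliminates both errors, which is why deployment context matters: a false negative locks a worker out of attendance, while a false positive in a policing database implicates the wrong person.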


With police departments in India now using facial-recognition for verification and identification,  wrongful convictions are a concern, said Jain. “If Courts do accept this evidence, the FRT (facial-recognition technology) algorithm should also be made open to challenge in Court,” she said. 


In August 2018, Delhi police informed the Delhi high court that the accuracy of facial-recognition technology that they use to trace missing persons is around 2%. In August 2019, the ministry of women and child development, which uses facial recognition to trace missing children, reported an accuracy of 1%, with the algorithm even identifying boys as girls.


Even if accuracy and efficiency could be improved,  the use of facial-recognition technology still breaches fundamental rights. Indeed, if accuracy does improve, it will mean “mass surveillance by the state”, said Kaur. “Nothing good comes of this.” 


Failing The Supreme Court’s ‘Proportionality Test’


In 2019, the National Crime Records Bureau (NCRB) sought proposals to build a Rs 308-crore National Automated Facial Recognition System (AFRS), a database supposedly of criminals that could just as easily be used to identify dissenters.


The Delhi Police used an AFRS in advance of Prime Minister Narendra Modi’s 22 December 2019 rally to ensure “miscreants who could raise slogans or banners” were kept out. Police also used footage from various protest sites to filter out “law and order suspects” at the rally. 


Such profiling implies privacy breaches of citizens who may have no criminal records. 


To check whether an AFRS violates the right to privacy, the Supreme Court in 2018, in a case called Justice K S Puttaswamy (retd.) vs the Union of India, referred to a “proportionality test”.


AFRS, the Supreme Court said, does not fulfil “legitimate state aim”, since innocents can suffer because of false positives and negatives; it is disproportionate because it collects the information of everyone present at a crime scene without their consent; it is used without considering more efficient technologies; and it disproportionately impacts fundamental rights.


The Lack Of A Law


In a 2018 Supreme Court judgement that upheld Aadhaar as constitutionally valid, the judges noted that efficiency could not be the only consideration for using biometric technology.


“The decision to link Aadhaar numbers with SIM cards and to require e-KYC (know your customer) authentication of mobile subscribers has been looked upon by the Union government purely as a matter of efficiency of identification,” said the Supreme Court. “The mere existence of a legitimate state aim will not justify the means which are adopted. Ends do not justify means, at least as a matter of constitutional principle. For the means to be valid, they must be carefully tailored to achieve a legitimate state aim and should not be either disproportionate or excessive in their encroachment on individual liberties.”


In the case of NTPC, facial-recognition technology, according to the tender, is to be used to monitor attendance, but, as Ramanathan said, “the technology can be upgraded and expanded to other purposes”. 


Expanding on similar concerns, Jain said “all the facial biometric information of the people who work at NTPC will be recorded and stored but because we don’t have a data protection law in India, we don’t know how such [biometric] data could be used further”.


The Personal Data Protection Bill was introduced in parliament in 2019, but it remains unclear when it will be enacted. Currently, NTPC can share its data with any other person or organisation because there is no law to regulate what can and cannot be done with the data.


In any case, as experts have pointed out, the data-protection bill provides exceptions for governmental surveillance.


The bill also gives companies “complete control over processing employees’ data”, said Kodali, the Free Software Movement of India researcher. For example, companies monitor employee schedules by tracking work hours and internet browsing patterns. 


“The reason the exemption exists are companies like Uber, Ola and Swiggy, which monitor employee data because incentives are dependent on this data,” said Kodali. 


Unlike with fingerprint scanners, people are not always aware their identities are being captured by a facial-recognition system, nor do they always explicitly consent to it.


Uncertain, Opaque Details


The only law indirectly connected to oversight of facial-recognition technology is the Information Technology (IT) Act, 2000, which classifies biometric data as “sensitive personal data” and lays down various requirements for handling such data, but it applies only to private entities, not government agencies.


On 31 March and 19 May, Jain of the IFF filed two right-to-information (RTI) requests with NTPC, asking under what law the company was using facial recognition, its relevant policies, its accuracy and where the images would be stored. 


On 25 June, Jain received a response from NTPC in Chhattisgarh. “Not applicable” was the response to the law used. It said “recording of attendance is mandatory...due to COVID-19 pandemic biometric attendance captured through finger print is required to be discontinued and attendance capturing through face recognition machine has been introduced”. NTPC said accuracy was  "satisfactory" but provided no data. 


As for data storage, the NTPC claimed the information was “confidential” and, so, exempt under section 8(1) of the RTI Act.


“First they said ‘use a password’. Then they said your fingerprint and voice can be your password. Now they’re saying your face is your password,” said Ramanathan. “These are ways of capturing [biometric] data and storing information that can be used for any purpose that they (State agencies) decide, even against you.” 


(Rishika Pardikar is a freelance reporter from Bengaluru. Arpit Goyal is a student of law at National Law University, Jodhpur and an intern at Article 14.)