With the National Crime Records Bureau’s (NCRB) call for bids for the creation of the National Automated Facial Recognition System (AFRS) closing on Thursday, digital rights advocacy group Internet Freedom Foundation has said that, given the way the system appears to be implemented, it may turn out to be faulty or pave the way for mass surveillance.
The NCRB first released the Request for Proposals (RFP) seeking bids for the AFRS on June 28, 2019. However, the deadline was extended multiple times, reportedly for administrative reasons, before the RFP was finally recalled and cancelled. A revised version was issued on June 22 this year. The current deadline for the submission of bids is October 8, and the estimated budget of the project is Rs 308 crore.
The AFRS will be used to create a national database of photographs. According to the RFP, this database is meant to enable the swift identification of criminals by gathering existing data from various other databases, such as the passport database under the Ministry of External Affairs; the Crime and Criminal Tracking Network and Systems (CCTNS), run by the NCRB under the Ministry of Home Affairs (MHA); the Interoperable Criminal Justice System (ICJS), also run by the NCRB; the Women and Child Development Ministry's KhoyaPaya Portal; the Automated Fingerprint Identification System (AFIS), run by the NCRB; and any other image database available with the police or other entities.
To identify criminals, scene of crime (SOC) images and videos will be matched against the above-mentioned databases using facial recognition technology. According to the Internet Freedom Foundation (IFF), “Face recognition systems use computer algorithms to pick out specific, distinctive details about a person’s face. These details, such as distance between the eyes or shape of the chin, are then converted into a mathematical representation and compared to data on other faces collected in a face recognition database. The data about a particular face is often called a face template and is distinct from a photograph because it’s designed to only include certain details that can be used to distinguish one face from another.”
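The comparison step the IFF describes can be sketched in a few lines. This is purely illustrative and makes assumptions: real systems derive templates from trained neural networks, and both the tiny hand-made vectors and the threshold value below are hypothetical.

```python
import math

def euclidean_distance(a, b):
    """Distance between two face templates (numeric feature vectors)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def is_match(template_a, template_b, threshold=0.6):
    """Declare a match when two templates are closer than a threshold.

    The threshold is a hypothetical tuning parameter: lowering it cuts
    false positives but increases false negatives, and vice versa.
    """
    return euclidean_distance(template_a, template_b) < threshold

# Hypothetical templates: the same person photographed twice produces
# nearby vectors; a different person produces a distant one.
probe = [0.11, 0.52, 0.33]
same_person = [0.13, 0.50, 0.35]
other_person = [0.80, 0.10, 0.90]

print(is_match(probe, same_person))   # close templates -> True
print(is_match(probe, other_person))  # distant templates -> False
```

The single threshold is where accuracy debates concentrate: there is no setting that eliminates both kinds of error at once, which is why the error rates discussed below matter.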
According to IFF, “Claims relating to accuracy of Facial Recognition Technology (FRT) systems are routinely exaggerated and the real numbers leave much to be desired. The implementation of such faulty FRT systems would lead to high rates of false positives and false negatives in this recognition process.” The organisation further said in a statement, “The AFRS is being developed and deployed by the government without any technical standards in place which may lead to faulty systems being implemented. Once in place, it would be very difficult for it to be reconciled with future technical standards and damage like discrimination and exclusion would be impossible to undo.”
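The false positives and false negatives the IFF refers to are standard evaluation quantities. As a minimal sketch, assuming made-up trial counts (none of these numbers come from the AFRS or any real evaluation):

```python
def false_positive_rate(fp, tn):
    """Share of non-matching pairs wrongly declared a match."""
    return fp / (fp + tn)

def false_negative_rate(fn, tp):
    """Share of genuine matches the system fails to find."""
    return fn / (fn + tp)

# Hypothetical counts from an imagined face-matching trial:
# true positives, false positives, true negatives, false negatives.
tp, fp, tn, fn = 90, 30, 870, 10

print(false_positive_rate(fp, tn))  # wrongly flagged innocents
print(false_negative_rate(fn, tp))  # missed genuine matches
```

Even a seemingly small false positive rate, applied across a national database of millions of photographs, translates into a large absolute number of people wrongly flagged.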
The revised RFP makes no mention of the international standards that were included in the original RFP and had to be complied with as a technical requirement. The reason for their exclusion is unclear, raising the question of why necessary technical standards are being diluted.
Even an accurate FRT system would not be a solution, the IFF said, as it would still violate the fundamental rights of citizens by facilitating mass surveillance. The right to freedom of movement would be hampered, as mass surveillance would allow the government to track the movements of individuals in real time across the country. The right to privacy would also be violated, since the sensitive personal data collected by these FRT systems can be used by the government without the informed consent of the individuals concerned. This would also prevent individuals from exercising the liberty to share their information in some contexts and remain anonymous in others, according to their own choice, the IFF added.
Talking about the legality of the AFRS, IFF said, “There is no anchoring legislation which allows for and regulates the use of AFRS. In the absence of such a framework and safeguards, the first requirement for lawful restriction on the right to privacy is not met.”