In 2019, the Delhi government installed CCTV cameras in around 700 schools. Now it has introduced facial recognition cameras – a measure that raises serious privacy concerns for the students involved.
The trial currently underway in a dozen Delhi schools is likely to become more widespread as time goes on. So why is it problematic and what could the long-term repercussions be?
India is a country already leaning heavily on technology to police its population. The Aadhaar card database is one of the world's largest biometric identity systems. It has lodged the fingerprints and iris scans of around 1.3 billion people in its database, and it is used to identify people when they open a bank account, seek new employment, claim state benefits, apply for insurance, buy a house, and do a variety of other everyday tasks.
Since the Aadhaar database was introduced twelve years ago, it has been rolled out to around 90% of the population. Despite initial pushback and the privacy concerns surrounding the creation of a massive biometric repository, most people with an Aadhaar card now consider it benign. Most citizens believe the system is secure and designed for their benefit.
This widespread acceptance of biometric tracking is troubling because India still lacks data protection laws that safeguard personal information. Without such laws, people's data can be exploited and their movements tracked – an apprehension that only grows when children's data is involved.
Studies have shown that CCTV in schools can deliver some benefits. Schools that implement CCTV have recorded reductions in bullying of up to 70%. Teachers have been suspended for abusive behavior caught on camera, and there is evidence that CCTV can reduce physical attacks on both students and teachers.
If CCTV already achieves the necessary reduction in intolerable behaviors, however, it is important to question what facial recognition adds. As yet, the authorities have not addressed this question properly, fueling concerns that the technology could enable monitoring not just on school premises but beyond them.
This concern makes sense in the context of what is happening elsewhere in the country, where surveillance tech is becoming endemic. It also raises important questions about the political will to desensitize children to surveillance during their formative years. Consider recent proposals in Lucknow, for example, where police have outlined plans to use facial recognition to scan women's facial expressions for signs of harassment as they move around public spaces – a massive overreach that has provoked a backlash from women's rights groups and human rights advocates.
Furthermore, experts have expressed concerns about how authorities might choose to use facial recognition in classrooms. The technology could be used to monitor for facial expressions that signal a child isn't paying attention or is having comprehension difficulties, for example.
This could lead to individual students being singled out, excluded, disciplined, and potentially discriminated against. This raises valid questions about who will get to make decisions using the technology and how.
If facial scanning is used in this way, it could affect a young person's psychological state during their formative years. It also raises important questions about how the information is stored and made accessible, and how it might affect a student's future opportunities for higher education.
A political tool
Creating a surveillance nexus that starts in schools could result in meaningful social and behavioral changes.
People who know they are being watched behave differently and are more likely to self-censor. They are also less likely to associate with others whom they perceive as potentially damaging to their reputation.
As a result, citizens placed under constant surveillance are likely to experience increased social fragmentation – with exclusionary behaviors, discrimination, and prejudice all likely to escalate. This potential increases if social credit systems like those in China are also introduced.
Democracy also stands to take a hit, because self-censorship erodes people's ability to oppose entrenched political systems. This creates the perfect breeding ground for authoritarianism and a social paradigm in which technology silences those who would normally choose to protest.
Even without considering these profound social ramifications, the security concerns surrounding the tracking of children ring alarm bells. The Aadhaar database has already suffered significant breaches, revealing the government's inability to secure sensitive personal information.
This is partly due to India's lack of data protection regulations, which allows the government to leverage data too broadly – creating a larger attack surface and an elevated risk of data leaks, breaches, and mishandling or misuse. This becomes all the more troubling when the data involved could be used by criminals to identify and track children.
The important thing to remember is that facial recognition algorithms work by creating a mathematical representation of a data subject's face. That biometric template could potentially be used to spoof someone's identity to a system – a lasting risk for children whose data is stolen by criminals attempting to impersonate them electronically.
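To make that risk concrete, here is a minimal sketch of how template matching typically works in such systems. The vector values, threshold, and function names are all hypothetical illustrations, not any vendor's actual implementation – real systems derive the template from a face image using a deep-learning model, but the matching step is essentially a distance check like this one:

```python
import math

# Illustrative sketch only: a recognition system maps a face image to a
# fixed-length numeric vector (a "template"). The numbers and threshold
# below are hypothetical.

def cosine_similarity(a, b):
    """Similarity between two face-template vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

MATCH_THRESHOLD = 0.8  # hypothetical operating point

def is_same_person(enrolled_template, probe):
    """A probe 'matches' if its similarity to the stored template is high enough."""
    return cosine_similarity(enrolled_template, probe) >= MATCH_THRESHOLD

# Unlike a password, a face template cannot be changed after a breach:
# anyone holding a copy of the vector can satisfy the check.
enrolled = [0.1, 0.9, 0.3, 0.5]
stolen_copy = list(enrolled)
print(is_same_person(enrolled, stolen_copy))  # prints True
```

The point of the sketch is that the stored vector, not the living face, is what the system actually trusts – which is why a stolen template is a permanent liability for the child it describes.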
Concerns also exist because facial recognition algorithms are markedly less accurate at identifying children. Rights activists worry that this could result in children being misidentified and accused of acts they never committed – potentially leading to abusive forms of discipline.
With so many risks involved, citizens should oppose the decision to install facial recognition in schools. The Indian government has failed to provide transparency over how the data will be collected, categorized, and stored – and it has shown itself unable to guarantee the necessary level of data security.
Add to this the concerns over how children may be stigmatized, and how the technology could augment the school-to-prison pipeline, and we must conclude that the government has rushed into its decision – to satisfy its own ends – without duly considering the negative social ramifications.