
SFLC.IN
K-9, Second Floor, Birbal Road,
Jangpura Extension,
New Delhi-110014
(tel): +91-11-43587126
Email : mail@sflc.in
www.sflc.in

SFLC.in’s Comments on the NITI Aayog Discussion Paper: Responsible AI for All: A Use-Case Approach on Facial Recognition Technology

The Central Government think-tank, NITI Aayog, recently floated a consultation paper on Responsible Artificial Intelligence (RAI) titled ‘Discussion Paper: Responsible AI for All: Adopting the Framework - A Use Case Approach on Facial Recognition Technology’ in November 2022,1 [hereinafter “Paper” or “Discussion Paper”].

SFLC.in welcomes the initiative to develop discourse around Facial Recognition Technologies and their possible impacts on various stakeholders.

Defining Facial Recognition Technology:


The Paper defines Facial Recognition Technology [“FRT”] as “an AI system that allows
identification or verification of a person based on certain images or video data
interfacing with the underlying algorithm.”2 The definition is used as an umbrella term
to classify forms of technologies that are “designed to identify or trace individuals
using visual images”. Facial Recognition Technology runs on algorithms that engage
in technologies such as, facial detection, facial recognition, and feature extraction.
They are based on algorithms which are trained to recognize faces. This is carried by

1
Responsible AI for All: Adopting the Framework – A Use Case Approach on Facial Recognition Technology, NITI
Aayog, November, 2022, available at:
https://www.google.com/url?sa=t&rct=j&q=&esrc=s&source=web&cd=&cad=rja&uact=8&ved=2ahUKEwjLotXtisT7
AhUB7TgGHUe1Au4QFnoECB8QAQ&url=https%3A%2F%2Fwww.niti.gov.in%2Fsites%2Fdefault%2Ffiles%2F202
2-11%2FAi_for_All_2022_02112022_0.pdf&usg=AOvVaw2Y7O8CPtzFRVrSeEdFB24I.
2
Smriti Parsheera, ‘Adoption and regulation of facial recognition technologies in India: Why and why not?’
(November 2019) Data Governance Network, Working Paper 05.
adding very large amounts of data through images from facial datasets.
The increasing prevalence of CCTVs in public areas, the rise of social media, and the
burgeoning presence of images on the internet have resulted in the formation of large
banks of information that can be subjected to FRT software.

The Paper enumerates the ‘Design Risks’ that are associated with FRT, and also
identifies the rights-based challenges that arise from its adoption.

Regulatory framework:
In the landmark decision of Justice K.S. Puttaswamy v Union of India,3 the Supreme
Court recognized the right to informational autonomy as a facet of the Right to Privacy
under Article 21. This forms the cornerstone of the regulation of FRT, as it protects
individuals’ data from being used for purposes they have not explicitly consented to.
As FRT is built on datasets containing large amounts of images, sourced in part from CCTVs, it could process images of people who have not consented to such use. Another concern from a data privacy perspective is the possible use of data available on social media platforms for investigation purposes by law enforcement agencies. In the absence of duly established rules, users can be subjected to arbitrary and over-broad investigative procedures. The current Code of Criminal Procedure, 1973 makes no provision for investigative processes using data from online platforms.

Lack of purpose limitation is another concern. This principle entails that data collected is utilized only for the purposes communicated to the user at the time of collection. Another element of privacy that needs to be protected is anonymity. FRT systems process large amounts of sensitive personal data, and the more data they process, the greater the incentive for entities to collect such data. This shrinks the space that individuals have to maintain anonymity and, in the larger context, privacy.

3
Justice K.S. Puttaswamy v Union of India, (2017) 10 SCC 1.
In order to protect these freedoms, the Supreme Court laid down the ‘Proportionality
test’. This test has formed the baseline against which acts of private parties and
Government can be assessed as being over-invasive of privacy or within its contours.
FRT by its very nature constitutes an invasive technology. Individuals often have no real option to refuse consent, as they are rarely given an opportunity to consent to such sensitive data processing in the first place. As a result, several harms arise which must be properly assessed before FRTs can be deployed en masse.

SFLC.in’s Recommendations:

At the outset, the efforts of the NITI Aayog in carrying out a comprehensive study of
facial recognition technology solutions as a part of the Responsible Artificial
Intelligence (RAI) series must be commended. SFLC.in hopes that this is the first in
the series of use-cases which NITI Aayog will study in more detail. The outcome must
be to align the deployment of AI systems along the principles of Responsible AI, which
have been laid out in the first two reports by the central think tank.

Importantly, another study on the use-cases of FRTs deployed by law enforcement in criminal investigations will help shine light on the deployment of such technologies in real time. This would allow stakeholders to evaluate the harms and devise ways to mitigate them by complying with the RAI principles, thereby securing the rights of the accused and of users, as established under law.

After considering all the details made available about the Digi Yatra project, SFLC.in proposes the following comments:

I. Throughout the study, and the recommendations in the context of Digi Yatra, the
form of consent considered acceptable is ‘explicit consent’. However, given the
sensitive nature of the personal data involved, it is important that the Data
Principals i.e., users are made fully aware of the processes and entities which
will be utilizing their personal data. In light of this, the form of consent which
should be taken from data principals upon availing the services of Digi Yatra
must be ‘informed explicit consent’. This would require the Data Fiduciary to
explain unambiguously to the Data Principal all the uses and vulnerabilities their
data may be exposed to in a manner understandable by the data principal, and
take explicit consent after such information has been communicated.

II. The recommendation that Standard Operating Procedures (SoPs) must clearly
detail the internal operation and handling of data collected and processed by the
entity is encouraged. SoPs must be examined independently by various
stakeholders, including legal and technical experts and members of civil
society. These SoPs must be made available in the public domain, along with
the recommendations emerging from the independent examination, to ensure
actual transparency.

III. Although it has been stated that all collected data will be deleted from the local
airport’s database within twenty-four hours of the departure of the passenger’s
flight, as noted in the Paper, no rules have been put in place for the deletion
of such data. In addition to NITI Aayog’s recommendation that such rules must
be clearly framed, it is recommended that a strict purpose limitation standard
also be introduced. All details of the entities with which the data was shared
must be communicated to the Data Principals.

IV. A robust data protection framework in the context of FRT based technologies
must be separate and independent from the general Data Protection statute. Over
the past five years, the country has seen four iterations of the law, with the latest
draft having been made public in November, 2022. This iteration, the Digital
Personal Data Protection Bill, suffers from significant defects. Most importantly,
the Bill fails to recognise that personal data and sensitive personal data warrant
separate types of protection. Successive Bills have also witnessed considerable
expansion of the exemptions available to the Government, dilution of
informed consent, omission of data protection principles such as data minimization,
and ambiguity in the structure and constitution of the Data Protection Board of
India. Protecting the principles of Responsible AI as laid down by the NITI
Aayog will be difficult to achieve if legal reform for FRTs is kept contingent on
the data protection law in India. It is therefore recommended to enact separate
and independent legislation for Artificial Intelligence technologies,
including Facial Recognition Technology, in the country.

V. The importance given to developers and vendors is an encouraging development.
Furthering NITI Aayog’s recommendations, it is recommended that all standards
for the development and deployment of FRT systems be independently
examined and made available in the public domain in easily accessible language.
This is in addition to ensuring that the recommended system audits are made
available in the public domain as well.

VI. Adopting a ‘Policy first’ approach to the deployment of new technologies is a
necessary element of any successful technological endeavour. Conducting
thorough impact assessments is necessary before AI can be deployed, even for
a particular sector. We recommend the commencement of standardized tests and
impact assessments which can be modeled as per the requirements of the
particular sector, industry or engagement, rather than a one-size-fits-all approach.

Conclusion:
The efforts of the NITI Aayog in providing guidance on matters of deployment of AI
in different spheres are wholeheartedly welcomed and appreciated. It is hoped that
further research is carried out to explore how the RAI principles can be applied to
different types of AI technologies.
About SFLC.in
SFLC.in is a donor-supported legal services organisation that brings together lawyers,
policy analysts, technologists, and students to protect freedom in the digital world.
SFLC.in promotes innovation and open access to knowledge by helping developers
make great Free and Open Source Software, protecting privacy and civil liberties for
citizens in the digital world through education and free legal advice, and helping
policy makers make informed and just decisions on the use and adoption of
technology.

Software Freedom Law Center, India (SFLC.in) has been granted Special Consultative
Status with the Economic and Social Council of the United Nations (ECOSOC).
