Homo Sapiens: Even Artificial, Our Intelligence Is Gender Normative


Even artificial, our intelligence is gender normative, no?

-Kriti Khandelwal, Preeti Nataraj


Homo sapiens is, for now, the most evolved species, holding powers of decisiveness, thought and judgement; it is this judgement, this sense of power, that we are trying to transmit and programme into an alternate “reality”. At the dawn of a new technical era, this is a breakthrough for a field that holds in its hands the capability to perceive both the tangible and the intangible realities of life. Artificial intelligence is, in essence, power.

History, notoriously, is said to repeat itself. Looking back at existing gender, racial and societal biases, is this the alternate ‘reality’ that we are going to programme as well? Two steps forward, one step back is still one step forward; but is that step forward excluding a large section of people? Looking at the existing data sets from which all of this begins, we see so-called “bugs” informed by present-day discrimination that still persists across platforms. If the data set cannot sort through such disparities, can a language of binary code tackle the complexities of gender bias?

According to UNESCO, women make up only 22% of the workforce at the companies developing AI technology worldwide, and only 9% of them reach leadership positions; this imbalance skews the system. Google Translate offers an example of the resulting anomaly. Farsi and Turkish use gender-neutral pronouns, so a statement like ‘She is President; He is cooking’ translates into something closer to ‘This person is president; this person is cooking’. However, upon immediately reversing the translation back into English, the automatic output is “He is President; She is cooking”. The reversal does not read the initial prompt; it asks the AI for a fresh translation, and, as per the data it was fed, these reversed gender roles are attached.
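The round-trip behaviour described above can be sketched as a toy model. Everything here is hypothetical: the corpus counts are invented and `back_translate` is a stand-in for a real statistical translator, not Google’s actual system. It only illustrates the mechanism at issue: when the source pronoun carries no gender, a frequency-driven model falls back on whichever gendered pronoun its training data most often pairs with the occupation.

```python
# Toy sketch of frequency-driven back-translation from a gender-neutral
# language (e.g. Turkish 'o', which can mean he/she/it). All names and
# numbers are invented for illustration, not real translation-model data.

# Hypothetical corpus statistics: how often each pronoun co-occurs with a
# given activity in an (imaginary) English training corpus.
CORPUS_COUNTS = {
    "is president": {"he": 90, "she": 10},
    "is cooking":   {"he": 20, "she": 80},
}

def back_translate(neutral_phrase: str) -> str:
    """Render a gender-neutral phrase in English by picking the pronoun the
    corpus most often pairs with it; the neutral source pronoun carries no
    gender information, so corpus frequency alone decides."""
    counts = CORPUS_COUNTS[neutral_phrase]
    pronoun = max(counts, key=counts.get)
    return f"{pronoun.capitalize()} {neutral_phrase}"

print(back_translate("is president"))  # He is president
print(back_translate("is cooking"))    # She is cooking
```

Because the original “She”/“He” is discarded in the neutral intermediate step, the majority vote of a biased corpus is all the model has left to go on.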

If Google gets it wrong, how does anyone else get it right?

This gender bias reaches further, into real-world job opportunities, where AI is programmed to favour men for tasks involving finance, healthcare or banking, across pretty much the whole spectrum, while women are considered for more ‘submissive’ roles. It is reflected even in Amazon Alexa, Cortana and Siri, which are programmed with women’s voices by default. This further solidifies the objectification built into the technology and generates a psychological assumption that it is programmed to follow orders; even when the system’s software or hardware fails, a woman’s voice embodies the face of failure.

While all of this focuses on gender bias in AI, pointing the finger at the maker matters far more. Any bias in artificial intelligence stems from the existing cultural, social and racial prejudices, and the blind spots, of the minds behind it. AI is simply a means of translating the ideas and ideals of the well-paid, largely white and male computer scientists who build it, and it is abhorrently non-inclusive.

Imagine a world where all toddlers are raised by 30-year-old men; that is what AI reflects at the moment.

The world is taking a leap of faith with AI, and we must regard its potential to bring about positive change in this new reality. Identifying its biases at this early stage is essential if it is to grow towards an inclusive society. A starting point for solutions can be the representation and inclusion of women in the workforce, and the recognition of women as a consumer base. Accepting that the existing data set carries a discrepancy that runs wide and deep, and resolving it by re-analysing the data with groups that engage gender experts and women’s organisations, would be a definite step ahead.

Even with these existing disparities, AI is being widely deployed by tech giants that use the interface to promote and influence what society hears, sees and feels, so this bias will only grow deeper rooted if it is not nipped in the bud, at least in this alternate reality that we are creating. After all, AI is only as ‘woke’ as we are.
