
Abbey Gioulos

Business
Mr. Grossi & Mr. Cheung
Monday, October 26, 2020

Artificial Intelligence, Death, and Mourning:


Should we develop Grief Bots?

The world is evolving every day through Artificial Intelligence, creating new inventions that
further our use of A.I. Usually, these inventions help us; however, they aren't always
perceived as a positive change. A brand new innovation being developed is called a “Grief Bot”.
As I researched, I discovered that a “Grief Bot” is a robot programmed to help you with the loss
of a loved one. They are robots the size of you and me. Though the purpose of the Bot is clear,
the way it operates is not. Does a “Grief Bot” act as the person who has passed, or as a
grief counsellor during this challenging time? The age of Artificial Intelligence (A.I.) is well
underway, but are the right changes being made? I have to decide whether or not this invention
supports the common good and what its ethical consequences are. Before I make my decision, allow
me to share my thoughts on how it might affect the stakeholders.

The first stakeholder is the person who has passed away. One way the Bot can be
programmed is to re-create the person who has died. If the Bot is intended to re-create
the deceased, how does a manufacturer obtain consent? Without this consent, the
manufacturers cannot develop the Bots for this purpose, which means the person must consent
before they die. In my opinion, the deceased becomes the most important stakeholder for the
following reasons. First, they must consent. Second, they must decide whether leaving their
loved ones with a “Grief Bot” is a good thing. Lastly, they would have to decide whether they
are comfortable with a Bot acting as them after they pass. A problem with a Bot designed to be
someone who has passed is that the grieving person does not learn to cope with the loss
adequately. When someone has died, we can no longer be with them; this is one of the hardest
things to accept about grief. A robot designed to make you think the opposite will not benefit
your ability to move on from the loss; it is not a healthy coping mechanism.

The second stakeholder is the person who is grieving. Another way the Bot can be made
is to act as a grief counsellor, essentially replacing therapists. If you have ever lost a
loved one, you understand how hard it can be. This pain can be eased with therapy and a good
support system; we gather our friends and family and mourn with them during this time. There
is no certainty that a robot will be able to provide the same comfort another human would.
Everyone needs a human connection from time to time. That's why people choose to see a
grief counsellor or attend meetings with other people. The lack of human connection from a Bot
could only make someone feel more alone, especially now that their loved one is
gone. Grief is something we will all likely go through in our lives, and there is no easy solution.

The last stakeholder is the Bot manufacturers. Because grief is an extremely sensitive
subject for most people, I think it is dangerous for companies to put themselves in the position
of creating new ways for A.I. to help you grieve while profiting off your losses. Investing in
and promoting something like this could destroy a company's reputation, resulting in people
boycotting the business. A Bot could cost the manufacturer their clientele, because customers
wouldn't want to contribute to the making and manufacturing of something they don't support.
Making these robots would likely be very expensive, which means some people in need of help
would not be able to afford one. Human communication is free and proven to be effective.
Customers will eventually realize that the only reason “Grief Bots” are being made is so
corporations can profit off of people's losses. These Bots could also take away jobs,
which would cause outrage among people working in this field.

After thinking about all the stakeholders and the consequences related to “Grief Bots”, I
have made my decision: I do not think that “Grief Bots” support the common good or are an
ethically sound invention, for many reasons. The rights of the person who has passed can be
disrespected, and they have no ability to defend themselves. The Bots can create unhealthy
coping mechanisms for people who are grieving, and they lack the human connection that
grieving people need. Making the Bots would be expensive, so they would not be accessible to
everyone. Lastly, the Bots could take away jobs from therapists and counsellors. Overall, this
invention negatively impacts the stakeholders. When someone passes away, they are gone.
Unfortunately, as sad as it is, that's how things are, and I don't believe it's right to try to
change that. When someone dies, we surround ourselves with people for a reason. A robot would
not provide the comfort and support you need to get through such an extremely tough time. I
know the age of A.I. is well underway, but I do not think this is the right innovation.

The Jamboard I chose to comment on for the communication section of the assignment
was Saige Poliwoda's. When I came across the Jamboards for analyzing, hers caught my eye.
She wrote about “China's Social Credit Score”, and I was baffled; I had no idea something like
this existed in real life. China has a system that creates a social standing based on your
credit score, which means your credit score is available to everyone. In Canada, your credit
score is kept private and only shared when necessary. The way the system is set up can be
extremely toxic and threatening to people with low scores. Saige explains the ways this can
personally affect someone, the social effects, and the ethical/legal effects. She also includes
many important factors and perspectives on the situation. Overall she explained it very well;
the only thing I would suggest is to expand more on the personal consequences for family life.
