STS_LESSON-6
Isaac Asimov
• introduced to the world of science fiction what are known as the Three Laws of Robotics, which were published in his short story “Runaround” in 1942.
• The laws Asimov formulated are:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
These three laws indicate that robots must uphold humans as superior.
While these laws are rooted in science fiction, the current status of robotic technology requires a fresh perspective on them.
• As ideas from science fiction become reality, Asimov's laws can now be applied practically.
• Initially, these laws appear to ensure the safe advancement of this potential new life form.
o However, they assume that human life holds greater value than that of the androids being created.

South Korea
o considered one of the most high-tech countries in the world
o leading the way in the development of the Robot Ethics Charter

Robot Ethics Charter
o drawn up “to prevent human abuse of robots—and vice versa”.
o focuses on addressing social issues arising from the widespread integration of robots into society, including
▪ defining proper human-robot interactions
▪ managing data collection and distribution by robots
▪ Some see this charter as reminiscent of Asimov's Three Laws of Robotics, but others, like robot designer Mark Tilden, believe it's premature to imbue robots with morals.

Despite differing views, the advancement of technology will likely prompt other countries to develop their own codes of ethics for robots and human interactions.
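The strict hierarchy in Asimov's three laws, where a lower-numbered law always overrides the ones below it, amounts to a lexicographic rule check. The sketch below is a toy illustration of my own (the names `law1`, `law2`, `law3`, and `choose` are hypothetical, not from the source or any real robotics standard): each candidate action is scored against the laws in order, so obedience (Law 2) can never justify harming a human (Law 1).

```python
# Toy sketch of Asimov's Three Laws as a lexicographic priority check.
# Illustration only; not an implementation from the source material.

def law1(action):  # highest priority: do not harm a human
    return not action["harms_human"]

def law2(action):  # middle priority: obey human orders
    return action["obeys_order"]

def law3(action):  # lowest priority: preserve the robot itself
    return not action["destroys_self"]

def choose(actions):
    # Tuples compare left to right, so Law 1 outranks Law 2 outranks Law 3.
    return max(actions, key=lambda a: (law1(a), law2(a), law3(a)))

# A robot ordered to harm a human must refuse: Law 1 beats Law 2.
options = [
    {"name": "obey",   "harms_human": True,  "obeys_order": True,  "destroys_self": False},
    {"name": "refuse", "harms_human": False, "obeys_order": False, "destroys_self": False},
]
print(choose(options)["name"])  # → refuse
```

This ordering is exactly what places androids below humans in the notes' sense: the robot's own existence (Law 3) is always the first thing sacrificed.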
If we believe that androids should be regarded as equal to or even above humans, Asimov's laws may not apply.
o The laws place androids in a subordinate position to humans.
If androids are considered equal to humans in terms of their right to life, implementing Asimov's laws becomes problematic.
• In a scenario involving both human and android soldiers, should an android always sacrifice itself to save a human? Would humans be willing to die for androids?
• Despite some embracing materialism and envisioning robots as potential equals to humans, it's doubtful that many would be willing to sacrifice their lives for a robot.
Although current robotics technology doesn't demand ethical codes for robots, some countries are taking proactive steps towards developing a code of ethics for them.

B. Human, Morals and Machines
Technology is transforming our interactions with nature.
• We now have what's called "technological nature," where various technologies either replicate, enhance, or simulate natural experiences.
• Television channels like the Discovery Channel and Animal Planet offer digital portrayals of nature, showcasing animal behaviors and natural phenomena.
• Video games like Zoo Tycoon allow children to engage with virtual animal life. Even zoos are integrating technology, using webcams to let people observe animals remotely.
• Robot pets, like Sony's AIBO, are popular items in stores.
• Many people spend significant time in virtual worlds like Second Life.
Does it matter that we're substituting real nature with technological alternatives?
To answer this, we look at evolutionary and cross-cultural perspectives on humans' relationship with nature, as well as recent psychological research on the impact of technological nature.

Legal Moves for Safety
Scientists are starting to seriously consider the ethical challenges arising from advancements in robotics.
• In South Korea, experts are creating an ethical code to prevent mistreatment of robots by humans and vice versa.
• The European Robotics Network (EURON) is urging governments to establish laws, particularly focusing on safety as robots become more commonplace in everyday life.
As robots become more intelligent, determining who is responsible for any harm they cause becomes tricky: is it the designer, the user, or the robot itself?
• Ethical guidelines for robots can be based on utilitarian principles but may need adjustments based on specific circumstances.
• For instance, medical professionals follow ethical codes tailored to patient needs, and similar standards apply to other professions like law, religion, and military service.
• Robots will adhere to the ethical rules we provide them, making decisions based on the moral codes and values we set.
• In emergency situations, autonomous vehicles may prioritize saving more lives, even if it means sacrificing a few.

Ethical Issues
• There are special cases that will require modifications of the core rules based on the circumstances of their use. For example:
✓ Doctors do not euthanize patients to distribute their organs, even if doing so would yield a net positive with regard to survivors.
-- They have to conform to a separate code of ethics, designed around the needs and rights of patients, that restricts their actions.
✓ Lawyers, religious leaders, and military personnel establish special relationships with individuals who are protected by specific ethical codes.
It's crucial for these systems to be able to explain their moral decisions, just like humans would explain their actions.

Today, advanced technologies like Artificial Intelligence (AI), augmented reality, virtual reality, home robots, and cloud computing are fascinating many people.
• According to academics, entrepreneurs, and businesses, these technologies have the potential to significantly change society.
• While it's uncertain if these technologies will achieve their ambitious goals, they will definitely mix with influential demographic, economic, and cultural factors, potentially altering daily life significantly.

“Is Google Making Us Stupid?”
• article by Nicholas Carr (2008)
• discusses how the Internet affects our ability to focus, changes our knowledge, and increases our dependence on it
• argues that spending time online has negative effects on our minds, making it difficult to concentrate on long reading passages.
• Carr personally struggles with these negative effects, feeling that his brain now expects information to be delivered quickly, like the fast flow of the Internet.
• Carr suggests that our brains are adapting to the rapid pace of online information consumption.
• The article is divided into two main parts:
o Carr's desire for his brain to synchronize with the Internet
o Google's perspective on replacing human brains with artificial intelligence

C. Why the Future Does Not Need Us?
• As technology advances, intelligent machines capable of outperforming humans in various tasks are being developed.
o This could lead to all work being carried out by highly organized systems of machines, rendering human effort unnecessary.
• Either of two cases might occur:
o The machines might be permitted to make all of their own decisions without human oversight.
o Human control over the machines might be retained.

Machines > Humans
• If machines are granted full decision-making autonomy, the outcome is unpredictable since it's impossible to foresee their behavior.
o However, it's evident that humanity's fate would then be dictated by these machines.
• While it might be argued that humans would never willingly surrender all power to machines, the reality could be different. Society might either
o willingly relinquish control to machines or
o find itself so reliant on them that it has no practical choice but to accept their decisions.

Humans > Machines
• Human control over machines might persist
o average man: may have control over certain private machines of his own, such as his car or his personal computer
o tiny elite: control over large systems of machines will be in the hands of the tiny elite
• Ruthless elite: might opt to eliminate the majority of humanity; could employ methods such as propaganda or biological means to decrease the birth rate until humanity dwindles, leaving only the elite
• Compassionate elite: may choose to shepherd the rest of humanity, ensuring their basic needs are met, providing wholesome activities, and offering treatments for dissatisfied individuals.
o However, life in such a society would lack purpose, potentially requiring biological or psychological alterations to eliminate the desire for power or redirect it into harmless pursuits.
o While engineered individuals in this society may be content, they would lack freedom and be reduced to the status of domesticated animals.

Theodore Kaczynski
• also known as the Unabomber
• an American domestic terrorist who killed three people and injured many more in a bombing campaign targeting individuals involved with modern technology.
• While his actions were criminal and insane, his vision highlighted the unintended consequences often associated with technology.

❖ unintended consequences are often associated with technology – highlighted by the Unabomber
• Echoes Murphy's law – "Anything that can go wrong, will." For example:
o overuse of antibiotics → the biggest such problem so far: the emergence of antibiotic-resistant and much more dangerous bacteria.
o attempts to eliminate malarial mosquitoes using DDT (the insecticide dichlorodiphenyltrichloroethane) caused them to acquire DDT resistance; malarial parasites, likewise, acquired multi-drug-resistant genes.
• The root cause of such surprises lies in the complexity of the systems involved, where changes can have cascading effects (one change triggering further changes) that are hard to predict, especially when human actions are involved.
• Biological species often struggle to survive when faced with superior competitors, as seen in the displacement of South American marsupials by North American placental mammals millions of years ago.
Marsupials – mammals, comprising kangaroos and related animals, that do not develop a true placenta
• In a completely free marketplace, advanced robots would likely impact humans similarly to how North American placental mammals affected South American marsupials, or how humans have impacted numerous species.
Robot industries would compete fiercely for resources, driving prices beyond human affordability and potentially leading to the displacement of biological humans from existence.

A textbook dystopia – and Moravec
• discusses how our main job in the 21st century will be
o "ensuring continued cooperation from the robot industries" by passing laws decreeing that they be "nice," and
o describing how seriously dangerous a human can be once transformed into an unbounded superintelligent robot.
• Moravec's view is that the robots will eventually succeed us, and that humans then clearly face extinction.
• While we're accustomed to frequent scientific advancements, we haven't fully grasped the unique threats posed by 21st-century technologies like
o robotics
o genetic engineering
o nanotechnology

Threats Brought by Scientific Breakthroughs
1. Robots, engineered organisms, and nanobots share a dangerous amplifying factor: they can self-replicate.
• While a bomb explodes only once, a single robot or nanobot can reproduce rapidly and become uncontrollable.
• This self-replication can occur through computer networks, presenting risks like system failures or disruptions.
• Uncontrolled self-replication may lead to a GREATER RISK: potentially causing significant physical damage.
2. With robotics, genetic engineering, and nanotechnology, a sequence of small, individually sensible advances leads to an accumulation of great power and, simultaneously, great danger.
• Each of these technologies also offers untold promise: the vision of near immortality that Kurzweil sees in his robot dreams drives us forward;
o genetic engineering may soon provide treatments, if not outright cures, for most diseases; and
o nanotechnology and nanomedicine can address even more ills.
o Together, they could significantly extend our average life span and improve the quality of our lives.
• What was different in the 20th century? Certainly, the technologies underlying the weapons of mass destruction (WMD) – nuclear, biological, and chemical (NBC) – were powerful, and the weapons an enormous threat.
• But building nuclear weapons required, at least for a time, access to both
o rare – indeed, effectively unavailable – raw materials and
o highly protected information
• Biological and chemical weapons programs also tended to require large-scale activities.
21st-century technologies – genetics, nanotechnology, and robotics (GNR) – are:
• so powerful that they can spawn whole new classes of accidents and abuses.
• Most dangerously, for the first time, these accidents and abuses are widely within the reach of individuals or small groups.
o They do not require large facilities or rare raw materials. Knowledge alone will enable their use; thus, we have the possibility not just of weapons of mass destruction but of knowledge-enabled mass destruction (KMD), this destructiveness hugely amplified by the power of self-replication.
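The amplification that self-replication brings, stressed both in threat 1 and in the KMD passage, is plain exponential doubling; the same arithmetic underlies the Moore's-law projection elsewhere in these notes of machines a million times as powerful within about 30 years of doubling. Below is a minimal sketch of my own (the function `population` is illustrative, not from the source):

```python
# Illustration only: exponential doubling, the arithmetic behind
# self-replication risk and sustained Moore's-law-style growth.

def population(generations, start=1):
    """Copies present after `generations` rounds of doubling."""
    return start * 2 ** generations

for g in (10, 20, 30):
    print(g, population(g))
# 10 doublings give about a thousand copies, 20 give over a million,
# and 30 give over a billion: a bomb goes off once, but a replicator
# compounds.
```

Twenty doublings passing the million mark is also why roughly three decades of doubling every 18 months yields a "million times as powerful" machine.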
3. As enormous computing power is combined with the manipulative advances of the physical sciences and the new, deep understandings in genetics, enormous transformative power is being unleashed through the rapid and radical progress in molecular electronics.
• Molecular electronics – where individual atoms and molecules replace lithographically drawn transistors
• We would be able to meet or exceed the Moore's law rate of progress for another 30 years. By 2030, we are likely to be able to build machines, in quantity, a million times as powerful as the personal computers of today.
4. We now know with certainty that profound changes in the biological sciences are imminent and will challenge all our notions of what life is.
• Genetic engineering promises to
- revolutionize agriculture by increasing crop yields while reducing the use of pesticides
- create tens of thousands of novel species of bacteria, plants, viruses, and animals
- replace reproduction, or supplement it, with cloning
- create cures for many diseases, increasing our life span and our quality of life.
• [Genetic engineering] Human cloning raises profound ethical and moral issues.
- If we were to reengineer ourselves into several separate and unequal species, then we would threaten the notion of equality that is the very cornerstone of our democracy.
- The general public is aware of, and uneasy about, genetically modified foods, and seems to be rejecting the notion that such foods should be permitted to be unlabeled.
• [Genetic engineering]
- As the Lovinses note, the USDA has already approved about 50 genetically engineered crops for unlimited release; more than half of the world's soybeans and a third of its corn now contain genes spliced in from some other forms of life.
5. Nanotechnology, like nuclear technology, is more prone to misuse for destructive purposes than constructive ones. It can be exploited by military or terrorist groups to create devastating devices that could target specific areas or populations.

J. Robert Oppenheimer
• a renowned physicist
• led the development of the atomic bomb out of concern for the threat posed by Nazi Germany obtaining such weapons.
• Despite concerns about potential risks, such as Edward Teller's calculation suggesting an atomic explosion might set the atmosphere on fire, scientists proceeded with the first atomic test, known as Trinity.
• Following the successful test, an atomic bomb was dropped on Hiroshima.
• While some scientists suggested demonstrating the bomb instead of using it in combat, the urgency to end the war prevailed.
• Despite the shock and horror following the bombing, another bomb was dropped on Nagasaki.
• In November 1945, Oppenheimer emphasized the responsibility of scientists to
- use knowledge for the betterment of humanity and
- consider the consequences of their actions.

OPPENHEIMER
Oppenheimer stood firmly behind the scientific attitude, saying, “It is not possible to be a scientist unless you believe that the knowledge of the world, and the power which this gives, is a thing which is of intrinsic value to humanity, and that you are using it to help in the spread of knowledge and are willing to take the consequences.”

In our time, how much danger do we face not just from nuclear weapons but from all these technologies? How high are the extinction risks?

Human Extinction vs Technologies
• Philosopher John Leslie concluded that the risk of human extinction is at least 30 percent.
• Ray Kurzweil holds the view that there is a greater than fifty percent likelihood of successfully navigating the future, although it should be noted that he has frequently been labeled as optimistic.
Not only are these estimates not encouraging, but they do not include the probability of many horrid outcomes that lie short of extinction.

How To Be Saved from Human Extinction?
• Faced with such assessments, some serious people are already suggesting that we simply move beyond the Earth as quickly as possible.
• We would colonize the galaxy using von Neumann probes (self-replicating spacecraft), which hop from star system to star system, replicating as they go.
• This step will almost certainly be necessary a billion years from now (or sooner if our solar system is disastrously impacted by the impending collision of our galaxy with the Andromeda galaxy within the next three billion years),
o but if we take Kurzweil and Moravec at their word, it might be necessary by the middle of this century.
• Another idea involves constructing shields to defend against dangerous technologies, akin to the Strategic Defense Initiative proposed during the Reagan administration.
o However, experts argue that such shields would be ineffective and may have harmful side effects.
o Therefore, limiting the pursuit of certain types of knowledge emerges as a more realistic alternative to mitigate the risks posed by advanced technologies.
• We have been seeking knowledge since ancient times. Aristotle opened his Metaphysics with the simple statement: “All men by nature desire to know.”
• Throughout history, the pursuit of knowledge has been revered, but there are warnings, such as Nietzsche's, about the potential dangers of seeking truth at any cost.
-- Nietzsche warned, at the end of the 19th century, not only that God is dead but that “faith in science, which after all exists undeniably, cannot owe its origin to a calculus of utility; it must have originated in spite of the fact that the disutility and dangerousness of the ‘will to truth,’ of ‘truth at any price’ is proved to it constantly.”
▪ It is this further danger that we now fully face: the consequences of our truth-seeking. The truth that science seeks can certainly be considered a dangerous substitute for God if it is likely to lead to our extinction – just like atomic bombs, nanotechnology, and genetic engineering (we study them for knowledge, but they can have devastating effects).

HAPPINESS
“The exercise of vital powers along lines of excellence in a life affording them scope”
- the Greek definition -
• To be truly happy in the future, we must
- have meaningful goals and challenges.
- explore other ways to express our creativity, rather than relying solely on constant economic growth.
▪ While economic growth has benefited us for a long time, it hasn't made us completely happy.
Now, we have to decide if we want to keep pursuing endless growth through technology, knowing the risks involved.