CSC 410 Day 22

Unintentional Power
in the design of computing systems

The Power of Computing Systems.
Computing systems magnify human power:
◦ 1) Speed of operation;
◦ 2) Computational power (MIPS);
◦ 3) Automated control of complex systems (e.g.
computerized auto assembly, air-traffic control);
◦ 4) Rapid response to dynamic systems (e.g.
automated drug administration with continuous
monitoring of vital signs; shifting to different
power sources during times of high demand, e.g.
heavy A/C use in summer).

Intentional Power.
This refers to the powers a computing system
was designed to have:
◦ How much data can be processed / stored?
◦ What devices can it interface with?
◦ What is the intended use of the system?
Usually, computing systems are well-tested
within the range of their intended use.
Problems:
◦ The world is complex;
◦ Computing systems can be used in unforeseen
ways.
Unintentional Power.
The system has the power to produce
unintended results.
Real world usage of a system may involve:
◦ Unanticipated types and ranges of data;
◦ Interfacing the system with novel devices and other computing
systems;
◦ Unanticipated real-world events affecting the rate of
traffic or the amount of data;
◦ Adaptation of the system to uses it was never designed for.
How much responsibility for unintended results
do designers of a computing system bear?

Example 1:
Therac-25 Radiation Therapy Machine.
Software controlled the safety interlocks.
A safety target used to protect patients from
overdose was not placed quickly enough.
Some patients were severely harmed or died.
Problems:
◦ Inadequate testing of how the software would control
the hardware emitting the radiation;
◦ Poor error messages that could not distinguish trivial
from (medically) severe problems;
◦ Inadequate communication between IT and medical
personnel.
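To make the interlock problem concrete, here is a minimal, hypothetical sketch; this is not Therac-25 code, and every name in it is an illustrative assumption. The idea is that before enabling the high-energy beam, the software checks the physical position of the safety target through a sensor, instead of trusting an internal flag that may lag behind the hardware.

```python
# Hypothetical illustration only -- not actual Therac-25 code.
# Idea: verify the physical state of the safety target through a sensor
# before enabling the beam, rather than trusting cached software state.

class InterlockError(Exception):
    """Raised when the hardware state does not match the requested mode."""


class TurntableSensor:
    """Stand-in for a hardware position sensor."""
    def __init__(self, position: str):
        self.position = position

    def read_position(self) -> str:
        return self.position


class BeamController:
    """Stand-in for the beam hardware."""
    def enable(self, mode: str) -> None:
        print(f"Beam enabled in {mode} mode.")


def fire_beam(mode: str, sensor: TurntableSensor, beam: BeamController) -> None:
    # High-energy (x-ray) mode requires the safety target in the beam path.
    if mode == "x-ray" and sensor.read_position() != "target_in_place":
        raise InterlockError(
            f"Refusing to fire: turntable reports '{sensor.read_position()}'."
        )
    beam.enable(mode)


fire_beam("x-ray", TurntableSensor("target_in_place"), BeamController())  # ok
# fire_beam("x-ray", TurntableSensor("moving"), BeamController())  # raises InterlockError
```

The design point, rather than the specific names, is that a safety-critical condition is verified against the hardware itself instead of being inferred from internal software state.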

Need for realistic testing.
A program may seem to work in a virtual
environment where I/O is just numbers.

If it is to be embedded in a more complex
physical system with real-world variables
(radiation, medicine, airplanes, etc.), it needs to
be tested under more realistic conditions.
◦ How does it interface with other machines, sensors?
◦ How does it interact with other technology?
◦ What is realistic peak demand like, and can the system
handle it?
Not just a technical problem.
“I’m just the tech guy; what do I know?”

When an IT system controls potentially
dangerous physical systems, IT professionals
should:
◦ Consult with qualified professionals who
understand these systems;
◦ Consider wider human and social implications and
values, not just how “good” the solution is from an
IT point of view.
What is a good IT system?

1) Technically good (efficient, well-documented,
easy to upgrade, etc.).

2) Good for the people / society affected by the
IT system in its real-world use.

1) and 2) are not the same: a technically good
solution might still do harm!

Example 2:
Gender Bias in Computer Programs?

Does the typical design of computer programs for
children deter females from computer careers?

1) Are typical computer games male-oriented?

2) Do you think there are reasons fewer women
than men pursue IT as a vocation?

3) Is this something the design of computer
systems could change?

What Creates the Problem of
Unintentional Power?
1) Distance:
◦ Designers are often removed from the actual
circumstances of use of their system (this may be in
a different country).
2) Adaptability:
◦ Computer systems can be adapted and re-used for
different purposes.

3) Limitations of human foresight:
◦ We aren’t good at predicting the impact of future
developments on an IT system.
The Hard Problem.
Since we can’t foresee all future uses of a
system, it is unreasonable to demand that we
plan for all of them.
So what can we do to try to avert disaster?
◦ Design the system so that any problems are
restricted to a limited domain.
◦ Take a broad view of the effects that matter,
especially on human beings, and develop
precautions / fail-safes (“Are you really sure you
want to do this? Abort now?”).
◦ Do realistic testing and improve the system in light
of reported problems in real-world use.
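As a small, hypothetical sketch of the “limited domain” and fail-safe ideas above (the function, the cap, and the prompt wording are all illustrative assumptions, not from the slides): an irreversible operation is capped to a small batch and requires explicit confirmation.

```python
# Hypothetical sketch: confine the damage an operation can do, and ask
# for confirmation before an irreversible step.

MAX_RECORDS_PER_RUN = 100  # illustrative cap: keep problems in a limited domain


def purge_records(record_ids: list, confirm: bool = False) -> int:
    """Delete records, but only in small batches and only when confirmed."""
    if len(record_ids) > MAX_RECORDS_PER_RUN:
        raise ValueError(
            f"Refusing to purge {len(record_ids)} records; "
            f"the limit is {MAX_RECORDS_PER_RUN} per run."
        )
    if not confirm:
        # The slide's fail-safe: "Are you really sure you want to do this?"
        answer = input(f"Really delete {len(record_ids)} records? [y/N] ")
        if answer.strip().lower() != "y":
            print("Aborted; nothing was deleted.")
            return 0
    # ... actual deletion would happen here ...
    return len(record_ids)
```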
IT, vocation and duties.
Unintentional power shows that the vocation of
IT professional has wide social implications.

There is no infallible technique for preventing
all harm, but there is a duty to the neighbor to:
◦ Do the best we can, with the resources we have, to
anticipate potential problems, devise
precautionary measures, and revise the system
when it fails for unforeseen reasons.

What if?
We need to use our imagination to consider less
obvious possibilities.

2 factors:
◦ 1) How likely are these possibilities?
◦ 2) How severe are the consequences if they occur
and we haven’t planned for them?

If the consequences are bad enough, even an
unlikely scenario needs to be planned for.
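One way to make these two factors concrete is a simple screening rule; the scales and thresholds below are illustrative assumptions, not something from the slides.

```python
# Hypothetical sketch of the two-factor "what if?" screen: likelihood,
# severity, and the caveat that catastrophic outcomes get planned for
# even when they are unlikely.

def needs_mitigation(likelihood: float, severity: int,
                     risk_threshold: float = 2.0,
                     catastrophic_severity: int = 9) -> bool:
    """likelihood in [0, 1]; severity on a 1-10 scale (both assumed)."""
    # Ordinary case: plan for scenarios whose expected risk is high enough.
    if likelihood * severity >= risk_threshold:
        return True
    # The slide's caveat: bad-enough consequences demand a plan regardless
    # of how unlikely the scenario seems.
    return severity >= catastrophic_severity


print(needs_mitigation(0.5, 6))    # True: likely and fairly severe
print(needs_mitigation(0.01, 10))  # True: unlikely but catastrophic
print(needs_mitigation(0.1, 3))    # False: unlikely and mild
```

The particular numbers are arbitrary; the point is that severity alone, not just expected risk, can force a precaution.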

The Precautionary Principle.
In environmental ethics and drug testing, some
appeal to the “Precautionary Principle”:
◦ If it is reasonable to think an action A could be
seriously harmful, then we should “err on the side
of caution,” and not do A until we have good
evidence that A is safe.

This applies to computerization of “risky”
activities, such as healthcare, chemical plants,
nuclear power stations, etc.