
The Philosophy and the Approach

1.2 Understanding Complex Information-Processing Systems

Representation and Hardware


more efficient than another, or another may be slightly less efficient but more robust (that is, less sensitive to slight inaccuracies in the data on which it must run). Or again, one algorithm may be parallel, and another, serial. The choice, then, may depend on the type of hardware or machinery in which the algorithm is to be embodied physically.

This brings us to the third level, that of the device in which the process is to be realized physically. The important point here is that, once again, the same algorithm may be implemented in quite different technologies. The child who methodically adds two numbers from right to left, carrying a digit when necessary, may be using the same algorithm that is implemented by the wires and transistors of the cash register in the neighborhood supermarket, but the physical realization of the algorithm is quite different in these two cases. Another example: Many people have written computer programs to play tic-tac-toe, and there is a more or less standard algorithm that cannot lose. This algorithm has in fact been implemented by W. D. Hillis and B. Silverman in a quite different technology, in a computer made out of Tinkertoys, a children's wooden building set. The whole monstrously ungainly engine, which actually works, currently resides in a museum at the University of Missouri in St. Louis.

Some styles of algorithm will suit some physical substrates better than others. For example, in conventional digital computers, the number of connections is comparable to the number of gates, while in a brain, the number of connections is much larger (x10^4) than the number of nerve cells. The underlying reason is that wires are rather cheap in biological architecture, because they can grow individually and in three dimensions. In conventional technology, wire laying is more or less restricted to two dimensions, which quite severely restricts the scope for using parallel techniques and algorithms; the same operations are often better carried out serially.

Figure 1-4. The three levels at which any machine carrying out an information-processing task must be understood.
  Computational theory: What is the goal of the computation, why is it appropriate, and what is the logic of the strategy by which it can be carried out?
  Representation and algorithm: How can this computational theory be implemented? In particular, what is the representation for the input and output, and what is the algorithm for the transformation?
  Hardware implementation: How can the representation and algorithm be realized physically?

The Three Levels

We can summarize our discussion in something like the manner shown in Figure 1-4, which illustrates the different levels at which an information-processing device must be understood before one can be said to have understood it completely. At one extreme, the top level, is the abstract computational theory of the device, in which the performance of the device is characterized as a mapping from one kind of information to another, the abstract properties of this mapping are defined precisely, and its appropriateness and adequacy for the task at hand are demonstrated. In the center is the choice of representation for the input and output and the algorithm to be used to transform one into the other. And at the other extreme are the details of how the algorithm and representation are realized physically—the detailed computer architecture, so to speak. These three levels are coupled, but only loosely. The choice of an algorithm is influenced, for example, by what it has to do and by the hardware in which it must run. But there is a wide choice available at each level, and the explication of each level involves issues that are rather independent of the other two.

Each of the three levels of description will have its place in the eventual understanding of perceptual information processing, and of course they are logically and causally related. But an important point to note is that since the three levels are only rather loosely related, some phenomena may be explained at only one or two of them. This means, for example, that a correct explanation of some psychophysical observation must be formulated at the appropriate level. In attempts to relate psychophysical problems to physiology, too often there is confusion about the level at which problems should be addressed. For instance, some are related mainly to the physical mechanisms of vision—such as afterimages (for example, the one you see after staring at a light bulb) or such as the fact that any color can be matched by a suitable mixture of the three primaries (a consequence principally of the fact that we humans have three types of cones). On the other hand, the ambiguity of the Necker cube (Figure 1-5) seems to demand a different kind of explanation. To be sure, part of the explanation of its perceptual reversal must have to do with a bistable neural network (that is, one with two distinct stable states) somewhere inside the

brain, but few would feel satisfied by an account that failed to mention the existence of two different but perfectly plausible three-dimensional interpretations of this two-dimensional image.

Figure 1-5. The so-called Necker illusion, named after L. A. Necker, the Swiss naturalist who developed it in 1832. The essence of the matter is that the two-dimensional representation (a) has collapsed the depth out of a cube and that a certain aspect of human vision is to recover this missing third dimension. The depth of the cube can indeed be perceived, but two interpretations are possible, (b) and (c). A person's perception characteristically flips from one to the other.

For some phenomena, the type of explanation required is fairly obvious. Neuroanatomy, for example, is clearly tied principally to the third level, the physical realization of the computation. The same holds for synaptic mechanisms, action potentials, inhibitory interactions, and so forth. Neurophysiology, too, is related mostly to this level, but it can also help us to understand the type of representations being used, particularly if one accepts something along the lines of Barlow's views that I quoted earlier. But one has to exercise extreme caution in making inferences from neurophysiological findings about the algorithms and representations being used, particularly until one has a clear idea about what information needs to be represented and what processes need to be implemented.

Psychophysics, on the other hand, is related more directly to the level of algorithm and representation. Different algorithms tend to fail in radically different ways as they are pushed to the limits of their performance or are deprived of critical information. As we shall see, primarily psychophysical evidence proved to Poggio and myself that our first stereo-matching algorithm (Marr and Poggio, 1976) was not the one that is used by the brain, and the best evidence that our second algorithm (Marr and Poggio, 1979) is roughly the one that is used also comes from psychophysics. Of course, the underlying computational theory remained the same in both cases; only the algorithms were different.

Psychophysics can also help to determine the nature of a representation. The work of Roger Shepard (1975), Eleanor Rosch (1978), or Elizabeth Warrington (1975) provides some interesting hints in this direction. More specifically, Stevens (1979) argued from psychophysical experiments that surface orientation is represented by the coordinates of slant and tilt, rather than (for example) the more traditional (p, q) of gradient space (see Chapter 3). He also deduced from the uniformity of the size of errors made by subjects judging surface orientation over a wide range of orientations that the representational quantities used for slant and tilt are pure angles and not, for example, their cosines, sines, or tangents.

More generally, if the idea that different phenomena need to be explained at different levels is kept clearly in mind, it often helps in the assessment of the validity of the different kinds of objections that are raised from time to time. For example, one favorite is that the brain is quite different from a computer because one is parallel and the other serial. The answer to this, of course, is that the distinction between serial and parallel is a distinction at the level of algorithm; it is not fundamental at all—anything programmed in parallel can be rewritten serially (though not necessarily vice versa). The distinction, therefore, provides no grounds for arguing that the brain operates so differently from a computer that a computer could not be programmed to perform the same tasks.

Importance of Computational Theory

Although algorithms and mechanisms are empirically more accessible, it is the top level, the level of computational theory, which is critically important from an information-processing point of view. The reason for this is that the nature of the computations that underlie perception depends more upon the computational problems that have to be solved than upon the particular hardware in which their solutions are implemented. To phrase the matter another way, an algorithm is likely to be understood more readily by understanding the nature of the problem being solved than by examining the mechanism (and the hardware) in which it is embodied.

In a similar vein, trying to understand perception by studying only neurons is like trying to understand bird flight by studying only feathers: It just cannot be done. In order to understand bird flight, we have to understand aerodynamics; only then do the structure of feathers and the different shapes of birds' wings make sense. More to the point, as we shall see, we cannot understand why retinal ganglion cells and lateral geniculate neurons have the receptive fields they do just by studying their anatomy and physiology. We can understand how these cells and neurons behave
as they do by studying their wiring and interactions, but in order to understand why the receptive fields are as they are—why they are circularly symmetrical and why their excitatory and inhibitory regions have characteristic shapes and distributions—we have to know a little of the theory of differential operators, band-pass channels, and the mathematics of the uncertainty principle (see Chapter 2).

Perhaps it is not surprising that the very specialized empirical disciplines of the neurosciences failed to appreciate fully the absence of computational theory; but it is surprising that this level of approach did not play a more forceful role in the early development of artificial intelligence. For far too long, a heuristic program for carrying out some task was held to be a theory of that task, and the distinction between what a program did and how it did it was not taken seriously. As a result, (1) a style of explanation evolved that invoked the use of special mechanisms to solve particular problems, (2) particular data structures, such as the lists of attribute-value pairs called property lists in the LISP programming language, were held to amount to theories of the representation of knowledge, and (3) there was frequently no way to determine whether a program would deal with a particular case other than by running the program.

Failure to recognize this theoretical distinction between what and how also greatly hampered communication between the fields of artificial intelligence and linguistics. Chomsky's (1965) theory of transformational grammar is a true computational theory in the sense defined earlier. It is concerned solely with specifying what the syntactic decomposition of an English sentence should be, and not at all with how that decomposition should be achieved. Chomsky himself was very clear about this—it is roughly his distinction between competence and performance, though his idea of performance did include other factors, like stopping in midutterance—but the fact that his theory was defined by transformations, which look like computations, seems to have confused many people. Winograd (1972), for example, felt able to criticize Chomsky's theory on the grounds that it cannot be inverted and so cannot be made to run on a computer; I had heard reflections of the same argument made by Chomsky's colleagues in linguistics as they turn their attention to how grammatical structure might actually be computed from a real English sentence.

The explanation is simply that finding algorithms by which Chomsky's theory may be implemented is a completely different endeavor from formulating the theory itself. In our terms, it is a study at a different level, and both tasks have to be done. This point was appreciated by Marcus (1980), who was concerned precisely with how Chomsky's theory can be realized and with the kinds of constraints on the power of the human grammatical processor that might give rise to the structural constraints in syntax that Chomsky found. It even appears that the emerging "trace" theory of grammar (Chomsky and Lasnik, 1977) may provide a way of synthesizing the two approaches—showing that, for example, some of the rather ad hoc restrictions that form part of the computational theory may be consequences of weaknesses in the computational power that is available for implementing syntactical decoding.

The Approach of J. J. Gibson

In perception, perhaps the nearest anyone came to the level of computational theory was Gibson (1966). However, although some aspects of his thinking were on the right lines, he did not understand properly what information processing was, which led him to seriously underestimate the complexity of the information-processing problems involved in vision and the consequent subtlety that is necessary in approaching them.

Gibson's important contribution was to take the debate away from the philosophical considerations of sense-data and the affective qualities of sensation and to note instead that the important thing about the senses is that they are channels for perception of the real world outside or, in the case of vision, of the visible surfaces. He therefore asked the critically important question, How does one obtain constant perceptions in everyday life on the basis of continually changing sensations? This is exactly the right question, showing that Gibson correctly regarded the problem of perception as that of recovering from sensory information "valid" properties of the external world. His problem was that he had a much oversimplified view of how this should be done. His approach led him to consider higher-order variables—stimulus energy ratios, proportions, and so on—as "invariants" of the movement of an observer and of changes in stimulation intensity.

"These invariants," he wrote, "correspond to permanent properties of the environment. They constitute, therefore, information about the permanent environment." This led him to a view in which the function of the brain was to "detect invariants" despite changes in "sensations" of light, pressure, or loudness of sound. Thus, he says that the "function of the brain, when looped with its perceptual organs, is not to decode signals, nor to interpret messages, nor to accept images, nor to organize the sensory input or to process the data, in modern terminology. It is to seek and extract information about the environment from the flowing array of ambient energy," and he thought of the nervous system as in some way "resonating" to these invariants. He then embarked on a broad study of animals in their environments, looking for invariants to which they might
