p(i) and p(j) being the probabilities of xi and yj respectively, p(j|i) being the
conditional probability of yj given some xi.
In other words, the transmitted information is equal to the information
content of y, minus the ambiguity due to the noise acting on the channel.
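This relation, T = H(y) - H(y|x), can be checked numerically. The sketch below (Python, with made-up joint distributions) computes the transmitted information for a noiseless and for a completely noisy binary channel:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def transmitted_information(joint):
    """T = H(y) - H(y|x), for a joint distribution joint[i][j] = p(x_i, y_j)."""
    p_x = [sum(row) for row in joint]
    p_y = [sum(col) for col in zip(*joint)]
    h_y = entropy(p_y)
    # Ambiguity H(y|x): average over x of the entropy of p(y|x).
    h_y_given_x = sum(
        p_x[i] * entropy([pij / p_x[i] for pij in row])
        for i, row in enumerate(joint) if p_x[i] > 0
    )
    return h_y - h_y_given_x

# Noiseless binary channel: x determines y, so H(y|x) = 0 and T = H(y).
noiseless = [[0.5, 0.0], [0.0, 0.5]]
print(transmitted_information(noiseless))  # 1.0

# Completely noisy channel: y is independent of x, so T = 0.
noisy = [[0.25, 0.25], [0.25, 0.25]]
print(transmitted_information(noisy))  # 0.0
```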
Now, if we look at the information content of the whole system S con-
taining the two structures x and y and the channel between them [Fig. l(b)],
we are dealing with a quantity obviously different from the information
transmitted from x to y. In fact, it is information transmitted in an implicit
channel from the whole system to the observer; the structure of the whole
system is an output of this channel, the input or source of which needs no
physical specification (see Dancoff & Quastler, 1953; Gatlin, 1966). It was
ON A FORMAL DEFINITION OF ORGANIZATION 297
shown (Atlan, 1968a) that the information content of S is equal to that of x,
i.e.
H(x) = - Σ_i p(i) log2 p(i),
plus the ambiguity H(y|x):
H(S) = H(x) + H(y|x).
This is due to the meaning of this expression H(y|x), which is called the
ambiguity: it is the uncertainty left on y when x is specified, or, in other
words, the information content of y when y is viewed independently from x.
This is why, in the first case, where we consider information transmitted
from x to y, this expression is a loss of information and it is normal that it
should be counted negatively.

FIG. 1. Opposite signs for the ambiguity H(y|x) when computed (a) in the information
transmitted from x to y; (b) in the information content of a whole system S containing
x and y.
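The opposite sign of the ambiguity in the two computations can be verified directly: H(x) + H(y|x) is just the joint entropy of the pair (x, y), i.e. the information content of the whole system S. A minimal sketch, using a made-up joint distribution for a noisy binary channel:

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x_i, y_j) for a noisy binary channel.
joint = [[0.4, 0.1],
         [0.1, 0.4]]

p_x = [sum(row) for row in joint]
h_x = entropy(p_x)

# Ambiguity H(y|x): uncertainty left on y once x is specified.
h_y_given_x = sum(
    p_x[i] * entropy([pij / p_x[i] for pij in row])
    for i, row in enumerate(joint)
)

# Information content of the whole system: H(S) = H(x) + H(y|x),
# which coincides with the joint entropy H(x, y).
h_S = h_x + h_y_given_x
h_joint = entropy([pij for row in joint for pij in row])
print(round(h_S, 6), round(h_joint, 6))  # the two values are equal
```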
equations of the form

    dp(j|i)/dt = -J p(j|i) + J/N_j,    (1)

(where N_j is the number of different letters yj and J an appropriate function),
differential equations can be written to express the ambiguity as a function
of a dose of errors accumulated in time (Atlan, 1968a,b, 1970). Thus,
according to the above-mentioned observation, two possible effects of noise
acting in two opposite directions can be considered, and correspond to two
kinds of ambiguities, one counted negatively and the other positively. The
first one is related to a “destructive ambiguity” and has the classical meaning
of a disorganizing effect. The second one is related to what we called an
“autonomy producing ambiguity”, because it acts by increasing the relative
autonomy of one part versus the other, or, in other words, by decreasing
the overall redundancy of the system.
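For a constant J, an equation of the form (1), dp(j|i)/dt = -J p(j|i) + J/N_j, has the closed-form solution p(j|i)(t) = 1/N_j + (p0 - 1/N_j) exp(-Jt): every conditional probability relaxes toward the uniform value 1/N_j, so the ambiguity grows from 0 toward log2 N_j as errors accumulate. A small sketch (the alphabet size and rate constant are arbitrary illustrative choices):

```python
import math

def p_conditional(p0, N, J, t):
    """Closed-form solution of dp(j|i)/dt = -J p(j|i) + J/N for constant J:
    p relaxes exponentially from p0 toward the uniform value 1/N."""
    return 1.0 / N + (p0 - 1.0 / N) * math.exp(-J * t)

def ambiguity(ps):
    """H(y|x) in bits for one input symbol, given its conditional probs ps."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

N, J = 4, 0.5  # 4-letter alphabet, arbitrary rate constant
# Start noise-free: y is letter 0 with certainty.
for t in (0.0, 2.0, 20.0):
    ps = [p_conditional(1.0 if j == 0 else 0.0, N, J, t) for j in range(N)]
    print(t, round(ambiguity(ps), 4))
# The ambiguity grows from 0 toward log2(4) = 2 bits as errors accumulate.
```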
This observation was used (Atlan, 1968a) to explain the apparently paradoxical
stimulating effect of low doses of noise-producing factors like ionizing
radiations, thermal noise, and time, on a system containing a channel
between two structures y1 and y2. The equation relating the information
content of the system to the dose of noise was written as

    H(S) = H(y1) - H(y1|X) + H(y2|y1),    (2)

where X is the unspecified source in the channel to the observer, which can
be identified with the state of the system at the instant of time t - dt
immediately preceding the observation. By introducing equation (1) into (2) it
was possible to express H as a function of time.
As a result it was shown that a necessary condition for the existence of
an initial period of increase of H in time was a change in alphabet with
increase in the number of letters when going from y1 to y2. Thus the change
in alphabet with increase in number of letters which appears when one goes
from the four bases in nucleic acids to the twenty amino acids in proteins
was interpreted as a necessary condition for the positive effect of noise-producing
factors to be able to overcompensate, at least up to a certain
point, their negative effect.
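The role of the change in alphabet can be illustrated numerically. Assuming, purely for illustration, that both ambiguities in equation (2) grow with the relaxation kinetics of equation (1) at the same rate J (a modeling shortcut, not the paper's exact treatment), H(S) shows an initial increase only when the second alphabet is larger:

```python
import math

def amb(N, J, t):
    """Ambiguity (bits) of an N-letter channel whose conditional probabilities
    relax per eq. (1) from certainty toward the uniform value 1/N."""
    e = math.exp(-J * t)
    p_hit = 1.0 / N + (1 - 1.0 / N) * e    # prob. of the original letter
    p_err = (1 - p_hit) / (N - 1)          # prob. of each wrong letter
    h = 0.0
    for p in [p_hit] + [p_err] * (N - 1):
        if p > 0:
            h -= p * math.log2(p)
    return h

def H_S(t, N1=4, N2=20, J=0.5):
    """Equation (2): H(S) = H(y1) - H(y1|X) + H(y2|y1), with both ambiguities
    modeled by the same relaxation kinetics (illustrative assumption)."""
    return math.log2(N1) - amb(N1, J, t) + amb(N2, J, t)

# With an alphabet change 4 -> 20 letters (nucleotides -> amino acids),
# H(S) rises initially:
print(H_S(0.0) < H_S(0.5))  # True: initial increase under noise
# Without the change (N2 = N1 = 4) the two ambiguities cancel and H is flat:
print(abs(H_S(0.5, N2=4) - H_S(0.0, N2=4)) < 1e-9)  # True: no initial rise
```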
As was noted already this analysis was valid only for a system containing
not only one channel like y1 → y2 but a very large number of structures
and channels between them, because otherwise there would be no sense in
viewing the information content of S differently from the information
transmitted in the channel y1 → y2. Therefore this analysis was somehow
artificial since it dealt with the effects of noise not on the real information
content of the system, but on the contribution of some particular channel
artificially isolated in the system. In particular the question of what happens
when, due to an infinite accumulation of errors, y2 eventually becomes
completely independent of y1, was left open until a more general treatment
can be proposed; and we were forced to use two different equations (Fig. 2
in Atlan, 1968a), namely equation (2) for the period of possible self-organization,
i.e. increase of H, and a more classical equation previously used in
a theory of ageing (Atlan, 1968b), where the term corresponding to the
autonomy-producing ambiguity was deleted, for the subsequent period of
disorganization.
In the present note an expression of these two different effects in one
single equation is proposed to serve as the basis of a formal quantitative
definition of organization.
We start from Shannon's definition of redundancy R, by H = H_max(1 - R),
where H is the information content of a message or system with internal
constraints between the parts, and H_max is the maximum information content
computed by not taking into account the constraints, i.e. by assuming
complete independence of the parts. Differentiating H versus time (again
with the assumption that time means accumulated effects of noise-producing
factors) we get

    dH/dt = (1 - R) dH_max/dt - H_max dR/dt.    (3)
The two terms on the right-hand side can be identified with the two effects
of noise previously described. The first term has the meaning of a destructive
ambiguity which destroys H_max, i.e. the total information transmitted to the
observer, counted without taking into account the constraints. It is the
classical disorganizing effect of noise. But, due to decreases in the constraints,
dR/dt is negative, and the second term has the meaning of an
autonomy producing ambiguity which, as explained above, produces a
decrease in the redundancy of the system. In other words, as the accumulation
of errors acts by decreasing both H_max and R, the first term is negative and
the second positive.
Now, dR/dt and dH_max/dt are themselves two different functions of time,
f1 and f2, which express the kinetics of the effects of noise on the system.
It is proposed that these two functions, together with equation (3), will
express the overall organization of the system, both structural and functional.
Different situations can occur. In some cases these functions can be such that
the variation of H in time, given by integration of this equation, presents
an increasing phase due to the second term, i.e. the decrease in redundancy;
then H reaches a maximum and decreases when there is no redundancy left
to decrease from, and the predominant effect is due to the decrease of
H_max. This case corresponds to self-organization, since it shows an increase
in the information content of the system under the effects of noise (Fig. 2).
300 H. ATLAN
FIG. 2. Variation of information content versus time with an initial increase figuring
“self-organization”.
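The curve of Fig. 2 can be reproduced from equation (3) once the kinetics f1 and f2 are specified. Assuming, purely for illustration, exponential forms H_max(t) = H_max0 exp(-k_H t) and R(t) = R0 exp(-k_R t) (all rate constants and initial values below are hypothetical), the integrated H(t) first rises and then falls:

```python
import math

def H(t, H_max0=100.0, R0=0.5, k_R=1.0, k_H=0.05):
    """Equation (3) integrated under illustrative exponential kinetics:
    H(t) = H_max(t) * (1 - R(t)), with
    H_max(t) = H_max0 * exp(-k_H t)  (destructive effect of noise)
    R(t)     = R0 * exp(-k_R t)      (autonomy-producing decrease of R)."""
    return H_max0 * math.exp(-k_H * t) * (1 - R0 * math.exp(-k_R * t))

ts = [0.1 * i for i in range(200)]
hs = [H(t) for t in ts]
peak = max(hs)
print(H(0.0))          # 50.0: the initial information content H0
print(peak > H(0.0))   # True: an initial phase of increase ("self-organization")
print(hs[-1] < peak)   # True: H later decreases, once redundancy is exhausted
```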
More generally, the different features of our definition of organization
can be summarized as follows: (i) the process of organization of any system
is described by the variation of information content with time, H(t), which
itself is the combination of two related functions: the decrease of redundancy,
and the decrease of maximum information content. (ii) The kind or state
of organization of a system is defined by three quantities, namely H_max0, the
initial maximum information content when redundancy is ignored, R0, the
initial redundancy, and tM, which has the meaning of a factor of reliability.
These quantities are characteristic parameters of the function H(t). Thus,
relations of order between different organizations (which would enable us
to ascribe a higher or lower "degree" of organization to a system) can be
defined only in a vectorial way, since organization cannot be represented
by a single number, or a point on an oriented axis. Its mathematical image
must be a vector of at least three components, or a point in a three-dimensional
space. Among these three parameters, or components, H_max0 is a measure
of the structural aspect of organization, while R0 and tM measure its functional
features, with a kind of intermediate role for R0, which has also a
structural character.
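The vectorial order can be made explicit: one organization is "higher" than another only when all three components compare the same way; otherwise the two are incomparable, and no single degree of organization exists. A minimal sketch (the triples below are invented, for illustration only):

```python
def compare(org_a, org_b):
    """Vectorial (partial) order between two organizations, each described
    by the triple (H_max0, R0, tM). Returns 'equal', 'higher', 'lower', or
    'incomparable' when the componentwise comparison goes both ways."""
    ge = all(a >= b for a, b in zip(org_a, org_b))
    le = all(a <= b for a, b in zip(org_a, org_b))
    if ge and le:
        return "equal"
    if ge:
        return "higher"
    if le:
        return "lower"
    return "incomparable"

# Hypothetical component values, purely for illustration:
system_a = (1e10, 0.6, 50.0)  # large H_max0, moderate R0, high reliability
system_b = (1e8, 0.9, 1.0)    # smaller, highly redundant, unreliable
print(compare(system_a, system_b))  # "incomparable": no single "degree"
```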
According to the different possible values for these last two parameters,
R0 and tM, we can distinguish different kinds of organization,
which can be shaped by different curves, representing the variation of
information content versus time, as in Fig. 3. (i) Whether or not the
organization can exhibit self-organizing properties depends on the value
of the initial redundancy R0. Since these properties are a consequence
of a decrease in R, a minimum value of R0 from which this decrease can
start is necessary. When R0 is below the minimum and there is no self-organization,
the H(t) function is monotonically decreasing and the rate of
decrease depends upon a reliability factor similar to tM. When the system is
self-organizing, as represented in all the curves in Fig. 3, the value of R0 above
this minimum will determine the relative height of the maximum HM: if R0 is
high, H0 is low and its total possible increase up to HM is large, and the
contrary if R0 is low (compare Fig. 3(a) and (b), which represent these two
possibilities). Therefore, the value of R0 above its necessary minimum for
self-organization can be used to measure a potential for self-organization. (ii)
However, this parameter, R0, is not enough. Not all the systems having a high
initial redundancy, i.e. a high potential for self-organization, will have in fact
the same kind of self-organizing properties. The value of the reliability, i.e. the
inertia opposed to random perturbations, will determine differences between
different kinds of functional organizations. A very high reliability will
correspond to a very long phase of growth or increase in H(t); on the
contrary, a small reliability will determine a very short duration for this
phase, so that H(t) will reach its maximum very fast, and then will decrease
as in any organized non-self-organizing system. Different combinations are
possible: high or low R0 with high or low reliability, that is high or low
potential and long or short duration for self-organization [Fig. 3(a), (b),
(c), (d)].
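Both the threshold on R0 and the role of reliability can be made concrete under illustrative exponential kinetics, H_max(t) = H_max0 exp(-k_H t) and R(t) = R0 exp(-k_R t), where a small k_H plays the role of a high reliability tM (these forms and rate constants are assumptions for the sketch, not the paper's specific f1 and f2). Setting dH/dt = 0 in equation (3) under these assumptions gives both the condition for an initial rise and the time of the maximum:

```python
import math

def rises_initially(R0, k_R, k_H):
    """Under the assumed kinetics, dH/dt > 0 at t = 0 exactly when
    k_R * R0 > k_H * (1 - R0), i.e. when R0 exceeds the threshold
    k_H / (k_R + k_H) needed for self-organization."""
    return k_R * R0 > k_H * (1 - R0)

def t_peak(R0, k_R, k_H):
    """Time at which H(t) reaches its maximum HM (valid when
    rises_initially holds), from solving dH/dt = 0."""
    return math.log(R0 * (k_R + k_H) / k_H) / k_R

print(rises_initially(0.6, 1.0, 0.05))   # True: enough redundancy to self-organize
print(rises_initially(0.02, 1.0, 0.05))  # False: H(t) only decreases
# High reliability (small k_H) stretches the growth phase; low reliability,
# as in Fig. 3(c), makes HM arrive almost immediately despite a high R0:
print(t_peak(0.9, 1.0, 0.05) > t_peak(0.9, 1.0, 1.0))  # True
```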
The curve of Fig. 3(c) represents a particular case where R0 is very high
but the reliability is very small, so that HM is reached very quickly and the
self-organizing properties disappear very fast. As a consequence, the system
does not seem to be self-organizing, even though its initial redundancy,
namely its potential for self-organization, was high. Thus, there are two
possible reasons for organized systems not to be self-organizing: either their
redundancy is too small and random noise-producing factors can only
destroy them (faster or slower according to their reliability) or, in spite
of an initial redundancy large enough, their reliability is not high enough
to allow for self-organizing processes to take place during an appreciable
period of time. In nature, crystals can be viewed as organized systems of
this type: because of their repetitive structure, their redundancy is very
large, but their reliability is low. This is manifested in their need for a
temperature low enough to maintain their structure. The potential barrier
of their bonds must be sufficient to protect the crystal structure from the
effects of thermal noise. In other words, the crystalline structure can be
maintained only if the "dose of errors" is relatively low. If this dose, i.e.
the temperature, increases, it produces a very fast change in structure
(the resistance to random changes is very low) and a phase transition occurs,
at the end of which there is no more crystalline structure. On the contrary,
other redundant structures, more flexible but less repetitive, like macromolecular
systems, can maintain themselves at a relatively high temperature,
that is in spite of a high frequency of random collisions strong enough to
modify their tertiary and quaternary structures. Thus a relatively high
“error” rate is present, but these “errors” do not destroy the whole system
at once. On the contrary, they decrease the redundancy and thus increase
the variety, i.e. the information content of the system; in contradistinction
to the situation in crystals, the phase transition here is not an all-or-none
process. Thus, we can see how production of information under the effects
of noise-producing factors is not as paradoxical and mysterious as it seems.
It is a consequence of “error” accumulation in a repetitive system, which in
addition is functionally reliable enough not to be destroyed at once by a
relatively low number of errors.
In other words, according to this view, the initial phase of increase in
complexity, which is a characteristic feature of growth and maturation of
living organisms, could exist in principle in any structure with a minimum
of redundancy, but would not be observed since it would be much too
short in simple systems which are only redundant. A certain degree of
functional complexity-reliability-is required to make this phase easily
observable. What is characteristic of living organisms is the existence of
different parameters allowing this phase to be more extended and distin-
guishable. But the principles of organization are always the same: provided
that its initial redundancy and functional reliability are large enough to
allow for an observable period of self-organization, a system can react to
random environmental stresses by an increase in complexity and variety,
so that it appears to adapt itself to its environment, even a new one. Later on,
when the redundancy has been exhausted, the same random environmental
stresses are responsible for ageing and death.
All this can be used to give a more precise and quantitative formulation
to what Von Foerster (1960) once called an "order from noise" principle,
underlying such phenomena as adaptation, non-directed learning and
assimilation which, according to Piaget (1968), are all various facets of what
is called organization. As discussed elsewhere (Atlan, 1972) it can be applied
also to the understanding of the logic of evolution, in which this principle
is obviously working.
REFERENCES
ASHBY, W. R. (1958). Cybernetica 1, 83.
ASHBY, W. R. (1962). In Principles of self organization (H. von Foerster & G. W. Zopf,
eds) pp. 255-278. New York and London: Pergamon Press.
ASHBY, W. R. (1967). In Currents in modern biology, pp. 95-104. Amsterdam: North-Holland
Publ. Co.
ATLAN, H. (1968a). J. theor. Biol. 21, 45.
ATLAN, H. (1970). Ann. phys. Biol. Med. 1, 15.
ATLAN, H. (1968b). J. Geront. 23, 196.
ATLAN, H. (1972). L'Organisation biologique et la théorie de l'information. Paris: Hermann.
DANCOFF, S. M. & QUASTLER, H. (1953). In Information theory in Biology (H. Quastler, ed).
pp. 263-273. Urbana: University of Illinois Press.
EIGEN, M. (1971). Naturwissenschaften 58, 465.
GATLIN, L. L. (1966). J. theor. Biol. 10, 281.
LINSCHITZ, H. (1953). In Information theory in Biology (H. Quastler, ed.) pp. 251-262.
Urbana: University of Illinois Press.
PIAGET, J. (1968). La naissance de l'intelligence chez l'enfant, 6th edn. Neuchâtel:
Delachaux & Niestlé.
ROTHSTEIN, J. (1962). Philosophy Sci. 29, 406.
SHANNON, C. E. & WEAVER, W. (1949). The mathematical theory of communication. Urbana:
University of Illinois Press.
THEODORIDIS, G. C. & STARK, L. (1969). Nature, Lond. 224, 860.
THEODORIDIS, G. C. & STARK, L. (1971). J. theor. Biol. 31, 377.
VON FOERSTER, H. (1960). In Self-organizing systems (M. C. Yovits & S. Cameron, eds).
pp. 31-50. New York and London: Pergamon Press.
VON NEUMANN, J. (1966). Theory of Self Reproducing Automata (A. W. Burks, ed.).
Urbana: University of Illinois Press.
YOCKEY, H. P. (1958). In Symposium on information theory in biology (H. P. Yockey, R. L.
Platzmann & H. Quastler, eds) p. 59 and 297. New York: Pergamon Press.