
J. theor. Biol. (1974) 45, 295-304

On a Formal Definition of Organization


H. ATLAN

Polymer Department, Weizmann Institute of Science,


Rehovot, Israel†

(Received 14 July 1972, and in revised form 17 September 1973)

A mathematical definition is proposed to account for the intuitive features


of what is usually meant by organization. To account for both functional
and structural aspects of organization, the rate at which the information
content of a system changes in time is examined. It can be shown that
Shannon’s expression for ambiguity in a channel has two different mean-
ings according to whether one is interested in the information transmitted
in the channel or in the information transmitted to the observer from a
whole system in which the channel is a part of a redundant communication
network. This was applied in a previous work to show that the effects of
noise on the information content H of a system result in two kinds of
ambiguities, “autonomy-producing” and “destructive”, leading to increase
and decrease in H, respectively. By making use of this observation and
Shannon’s definition of redundancy R, a single equation for dH/dt is
proposed to define organization on the basis of a kinetics of change of
information content of a system under the effects of environmental noise-
producing factors accumulated in time. It is shown how these factors,
obviously responsible for a decrease in H, i.e. a “disorganizing” effect,
can also be responsible, under certain conditions and up to a certain
time or “dose” of noise, for an initial increase in H interpreted as a
process of “self”-organization. The autonomy-producing ambiguity is
expressed by a term of decrease in redundancy, while a second term in
the equation, of decrease in maximum (non-redundant) information
content, expresses the destructive ambiguity. A given organization is
defined at least by three parameters, which determine the main features of
its characteristic function H(t). One of them is the initial information
content H_0 and has a structural meaning. A second parameter, with a
dimension of time, has the meaning of a functional reliability, related to
the overall resistance of the system to noise-producing factors. The third
parameter, namely the initial redundancy R_0, is both structural and
functional in character, since structural redundancy is known to help
insure reliability. Various conditions on these parameters lead to various
kinds of organizations, with and without self-organizing properties.
† Present address: Université de Paris VI, Faculté de Médecine Broussais Hôtel-Dieu,
Service de Biophysique, 45, rue des Saints-Pères, 75270 Paris, Cedex 06, France.

Although the concept of organization is central in biology and widely used,


its meaning is far from clear. “Organization” is used to mean either a state
or a process, or both. What we have in mind generally is both structural
and functional in character. If, in the literature (Dancoff & Quastler, 1953;
Linschitz, 1953; Von Foerster, 1960; Rothstein, 1962; Von Neumann, 1966;
Ashby, 1967; Theodoridis & Stark, 1969, 1971; Eigen, 1971) we look at the
various proposed definitions of organization we find two major trends,
contradicting each other.
On the one hand, organization is meant as constraints between parts,
or regularity and order, where order is viewed essentially as repetitive order,
i.e. redundancy. According to this view a model for the best organized system
is a perfect crystal. On the other hand, under the influence of information
theory, organization is meant as non-repetitive order, which is measured
by an information content in Shannon’s (1949) sense, i.e. a degree of unex-
pectedness directly related to variety and heterogeneity. Therefore, the ideas
leading to a formal quantitative definition of organization should involve a
kind of optimization process so that any optimum organization would
correspond to a compromise between maximum information content (i.e.
maximum variety) and maximum redundancy, and this is in fact the idea
that I propose to use as a basis for the definition of organization.
In a previous article (Atlan, 1968a) the following observation was made.
In a channel transmitting information between two structures x and y,
made of sets of elements or “letters”, x_i and y_j respectively [Fig. 1(a)], the
transmitted information from x to y is equal to H(x; y) = H(y) - H(y|x)
according to Shannon’s theory (Shannon & Weaver, 1949), with

$$H(y) = -\sum_j p(j)\log_2 p(j)$$

and

$$H(y|x) = -\sum_{i,j} p(i)\,p(j|i)\log_2 p(j|i),$$

p(i) and p(j) being the probabilities of x_i and y_j respectively, and p(j|i)
being the conditional probability of y_j given some x_i.
In other words, the transmitted information is equal to the information
content of y, minus the ambiguity due to the noise acting on the channel.
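To make these definitions concrete, here is a minimal Python sketch computing H(y), the ambiguity H(y|x), and the transmitted information for a small joint distribution; the numbers are hypothetical, invented purely for illustration:

```python
import numpy as np

# Hypothetical joint distribution p(x_i, y_j) for a noisy two-letter
# channel; the numbers are chosen only for illustration.
p_xy = np.array([[0.40, 0.10],   # row i: letters x_i
                 [0.05, 0.45]])  # col j: letters y_j

p_x = p_xy.sum(axis=1)           # marginal p(i)
p_y = p_xy.sum(axis=0)           # marginal p(j)

def H(p):
    """Shannon entropy in bits, skipping zero-probability terms."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Ambiguity H(y|x) via the identity H(y|x) = H(x, y) - H(x).
H_y_given_x = H(p_xy.ravel()) - H(p_x)
transmitted = H(p_y) - H_y_given_x        # H(x; y) = H(y) - H(y|x)
print(f"H(y)    = {H(p_y):.3f} bits")
print(f"H(y|x)  = {H_y_given_x:.3f} bits")
print(f"H(x; y) = {transmitted:.3f} bits")
```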
Now, if we look at the information content of the whole system S con-
taining the two structures x and y and the channel between them [Fig. 1(b)],
we are dealing with a quantity obviously different from the information
transmitted from x to y. In fact, it is information transmitted in an implicit
channel from the whole system to the observer; the structure of the whole
system is an output of this channel, the input or source of which needs no
physical specification (see Dancoff & Quastler, 1953; Gatlin, 1966). It was
shown (Atlan, 1968a) that the information content of S is equal to that of x,
i.e.
$$H(x) = -\sum_i p(i)\log_2 p(i),$$

plus the ambiguity H(y|x):

$$H(S) = H(x) + H(y|x).$$
This is due to the meaning of the expression H(y|x), which is called the
ambiguity: it is the uncertainty left on y when x is specified, or, in other
words, the information content of y when y is viewed independently from x.
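This identity is simply the chain rule for Shannon entropies, written out here from the definitions above (a standard step, added for completeness):

$$H(x,y) = -\sum_{i,j} p(i)\,p(j|i)\log_2\bigl[p(i)\,p(j|i)\bigr] = H(x) + H(y|x),$$

so the information content of the whole system S, as read by the observer, is the joint entropy of the pair (x, y).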
This is why, in the first case, where we consider information transmitted
from x to y, this expression is a loss of information and it is normal that it
bears a minus sign.

FIG. 1. Opposite signs for the ambiguity H(y|x) when computed (a) in the information
transmitted from x to y; (b) in the information content of a whole system S containing
x and y.

But when we consider the information of both x and y, as
in the second case, we must add to the information content of x the information
content of y in as much as y is not a mere repetition of x, and this is why
the ambiguity then bears a plus sign.
Thus the ambiguity H(y|x) can bear opposite signs according to whether
we are interested in the information transmitted in a channel [Fig. 1(a)] or
in the information content of a system containing the channel [Fig. 1(b)].
Now, the ambiguity can be written as a function of noise, or even as a
function of time if we consider the effects of time as those of accumulated
random noise-producing factors acting from the environment (Atlan,
1968b): the larger the number of errors produced by these factors, the
larger is the ambiguity. By making use of a generalized form of Yockey’s (1958)
equations

$$\frac{\mathrm{d}p(j|i)}{\mathrm{d}t} = -J\,p(j|i) + \frac{J}{N_j} \tag{1}$$

(where N_j is the number of different letters y_j and J an appropriate function),
differential equations can be written to express the ambiguity as a function
of a dose of errors accumulated in time (Atlan, 1968a,b, 1970). Thus,
according to the above-mentioned observation, two possible effects of noise
acting in two opposite directions can be considered, and correspond to two
kinds of ambiguities, one counted negatively and the other positively. The
first one is related to a “destructive ambiguity” and has the classical meaning
of a disorganizing effect. The second one is related to what we called an
“autonomy-producing ambiguity”, because it acts by increasing the relative
autonomy of one part versus the other, or, in other words, by decreasing
the overall redundancy of the system.
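To make the kinetics concrete, here is a minimal sketch, assuming a constant rate J and following a single input letter x_i (both simplifications of the general treatment): for constant J, equation (1) has the closed-form solution p(j|i)(t) = 1/N_j + [p(j|i)(0) - 1/N_j] e^{-Jt}, so the ambiguity grows with the accumulated dose of errors.

```python
import numpy as np

# Sketch of the kinetics of eq. (1) under two simplifying assumptions:
# J is a constant rate, and a single input letter x_i is followed.
J, N_j = 0.5, 4                              # hypothetical rate, alphabet size
p0 = np.array([0.85, 0.05, 0.05, 0.05])      # p(j|i) at t = 0

def p_t(t):
    # Closed-form solution of dp/dt = -J p + J/N_j for constant J:
    # exponential relaxation toward the uniform distribution 1/N_j.
    return 1.0 / N_j + (p0 - 1.0 / N_j) * np.exp(-J * t)

def H(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

for t in (0.0, 1.0, 5.0, 20.0):
    print(f"t = {t:5.1f}   H(y|x_i) = {H(p_t(t)):.3f} bits")
# The ambiguity rises from about 0.85 bits toward log2(N_j) = 2 bits
# as the dose of accumulated errors grows.
```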
This observation was used (Atlan, 1968a) to explain the apparently para-
doxical stimulating effect of low doses of noise-producing factors like ionizing
radiations, thermal noise, and time, on a system containing a channel
between two structures y_1 and y_2. The equation relating the information
content of the system to the dose of noise was written as
$$H(S) = H(y_1) - H(y_1|x) + H(y_2|y_1), \tag{2}$$
where x is the unspecified source in the channel to the observer, which can
be identified with the state of the system at the instant of time t - dt
immediately preceding the observation. By introducing equation (1) into (2) it
was possible to express H as a function of time.
As a result it was shown that a necessary condition for the existence of
an initial period of increase of H in time was a change in alphabet, with an
increase in the number of letters, when going from y_1 to y_2. Thus the change
in alphabet with increase in number of letters which appears when one goes
from the four bases in nucleic acids to the twenty amino acids in proteins
was interpreted as a necessary condition for the positive effect of noise-
producing factors to be able to overcompensate, at least up to a certain
point, their negative effect.
As was already noted, this analysis was valid only for a system containing
not only one channel like y_1 → y_2 but a very large number of structures
and channels between them, because otherwise there would be no sense in
viewing the information content of S differently from the information
transmitted in the channel y_1 → y_2. Therefore this analysis was somewhat
artificial since it dealt with the effects of noise not on the real information
content of the system, but on the contribution of some particular channel
artificially isolated in the system. In particular the question of what happens
when, due to an infinite accumulation of errors, y_2 eventually becomes
completely independent of y_1, was left open until a more general treatment
could be proposed; and we were forced to use two different equations (Fig. 2
in Atlan, 1968a), namely equation (2) for the period of possible self-organ-
ization, i.e. increase of H, and a more classical equation previously used
a theory of ageing (Atlan, 1968b), where the term corresponding to the
autonomy-producing ambiguity was deleted, for the subsequent period of
disorganization.
In the present note an expression of these two different effects in one
single equation is proposed to serve as the basis of a formal quantitative
definition of organization.
We start from Shannon’s definition of redundancy R, by H = H_max(1 - R),
where H is the information content of a message or system with internal
constraints between the parts, and H_max is the maximum information content
computed by not taking into account the constraints, i.e. by assuming
complete independence of the parts. Differentiating H with respect to time,
again with the assumption that time means accumulated effects of noise-
producing factors, we get

$$\frac{\mathrm{d}H}{\mathrm{d}t} = (1-R)\,\frac{\mathrm{d}H_{\max}}{\mathrm{d}t} - H_{\max}\,\frac{\mathrm{d}R}{\mathrm{d}t}. \tag{3}$$

The two terms on the right-hand side can be identified with the two effects
of noise previously described. The first term has the meaning of a destructive
ambiguity which destroys H_max, i.e. the total information transmitted to the
observer, counted without taking into account the constraints. It is the
classical disorganizing effect of noise. But, due to decreases in the con-
straints, dR/dt is negative, and the second term has the meaning of an
autonomy-producing ambiguity which, as explained above, produces a
decrease in the redundancy of the system. In other words, as the accumulation
of errors acts by decreasing both H_max and R, the first term is negative and
the second positive.
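Equation (3) follows from the product rule applied to Shannon’s definition H = H_max(1 - R). As a quick symbolic sanity check (a sketch using the sympy library, not part of the original paper):

```python
import sympy as sp

t = sp.symbols('t')
Hmax = sp.Function('H_max')(t)   # maximum information content H_max(t)
R = sp.Function('R')(t)          # redundancy R(t)

H = Hmax * (1 - R)               # Shannon: H = H_max (1 - R)
eq3 = (1 - R) * sp.diff(Hmax, t) - Hmax * sp.diff(R, t)
print(sp.simplify(sp.diff(H, t) - eq3))   # prints 0: eq. (3) holds
```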
Now, dR/dt and dH_max/dt are themselves two different functions of time,
f_1 and f_2, which express the kinetics of the effects of noise on the system.
It is proposed that these two functions, together with equation (3), will
express the overall organization of the system, both structural and functional.
Different situations can occur. In some cases these functions can be such that
the variation of H in time, given by integration of this equation, presents
an increasing phase due to the second term, i.e. the decrease in redundancy;
then H reaches a maximum and decreases when there is no redundancy left
to decrease from, the predominant effect then being the decrease of
H_max. This case corresponds to self-organization since it shows an increase

in complexity, apparently spontaneous, while it is due in fact to the effects


of random factors on the system. Since Ashby (1962) has shown that self-
organization in a strict sense (i.e. in a closed system) cannot exist, we propose
to call self-organization a process where the change in organization with
increased efficiency, although it is induced by the environment, is not
directed by a programme but occurs under the effects of random environ-
mental factors. According to this view, a self-organizing system is a system
redundant enough and functioning in such a way that it can sustain a decrease
in redundancy under the effects of error-producing factors without ceasing
to function. This decrease in redundancy leads to an increase in information
content or variety which allows for more possibilities in regulatory per-
formances as shown by Ashby (1958). In other words, self-organization
appears as a continuous disorganization constantly followed by reorganiza-
tion with more complexity and less redundancy.
As a particular example, one can see what the situation is when f_1 and f_2
are given the simple forms f_1 = -B_1 R and f_2 = -B_2 H_max, so that R and H_max
are simple exponential functions R = R_0 e^{-B_1 t} and H_max = H_max0 e^{-B_2 t}.
Then dH/dt is given by

$$\frac{\mathrm{d}H}{\mathrm{d}t} = H_{\max 0}\left[R_0(B_1+B_2)\,e^{-(B_1+B_2)t} - B_2\,e^{-B_2 t}\right]$$

and becomes equal to zero for

$$t = t_M = \frac{1}{B_1}\ln\left[R_0\left(1+\frac{B_1}{B_2}\right)\right].$$

t_M corresponds to a maximum following an increasing phase for H, as in
Fig. 2, provided its expression is positive, which means R_0 > B_2/(B_1 + B_2);
this exemplifies the condition that the initial redundancy must be above
a minimum for the system to be self-organizing.

FIG. 2. Variation of information content versus time, with an initial increase representing
“self-organization”.
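A short numerical sketch of this special case (with hypothetical parameter values satisfying R_0 > B_2/(B_1 + B_2)) confirms both the initial rise of H and the analytic position t_M of the maximum:

```python
import numpy as np

# Exponential special case: R = R0 e^{-B1 t}, H_max = Hmax0 e^{-B2 t},
# so H(t) = Hmax0 e^{-B2 t} (1 - R0 e^{-B1 t}).  Values are hypothetical.
Hmax0, R0, B1, B2 = 100.0, 0.6, 0.30, 0.05

def H(t):
    return Hmax0 * np.exp(-B2 * t) * (1.0 - R0 * np.exp(-B1 * t))

t_M = np.log(R0 * (1.0 + B1 / B2)) / B1      # analytic maximum, t_M
t = np.linspace(0.0, 50.0, 200001)
t_num = t[np.argmax(H(t))]
print(f"t_M analytic = {t_M:.3f}, numeric = {t_num:.3f}")
print(f"H(0) = {H(0.0):.1f} < H(t_M) = {H(t_M):.1f}")  # initial rise in H
```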
More generally, the different features of our definition of organization
can be summarized as follows: (i) the process of organization of any system
is described by the variation of information content with time, H(t), which
itself is the combination of two related functions: the decrease of redundancy,
and the decrease of maximum information content. (ii) The kind or state
of organization of a system is defined by three quantities, namely H_max0, the
initial maximum information content when redundancy is ignored, R_0, the
initial redundancy, and t_M, which has the meaning of a factor of reliability.
These quantities are characteristic parameters of the function H(t). Thus,
relations of order between different organizations, which would enable us
to ascribe a higher or lower “degree” of organization to a system, can be
defined only in a vectorial way, since organization cannot be represented
by a single number, or a point on an oriented axis. Its mathematical image
must be a vector of at least three components, or a point in a three-dimensional
space. Among these three parameters, or components, H_max0 is a measure
of the structural aspect of organization, while R_0 and t_M measure its func-
tional features, with a kind of intermediate role for R_0, which also has a
structural character.
According to the different possible values for these last two para-
meters, R_0 and t_M, we can distinguish different kinds of organization,
which can be represented by different curves showing the variation of
information content versus time, as in Fig. 3. (i) Whether or not the
organization can exhibit self-organizing properties depends on the value
of the initial redundancy R_0. Since these properties are a consequence
of a decrease in R, a minimum value of R_0 from which this decrease can
start is necessary. When R_0 is below the minimum and there is no self-
organization, the H(t) function is monotonically decreasing and the rate of
decrease depends upon a reliability factor similar to t_M. When the system is
self-organizing, as represented in all the curves in Fig. 3, the value of R_0 above
this minimum will determine the relative height of the maximum H_M: if R_0 is
high, H_0 is low and its total possible increase up to H_M is large, and the con-
trary if R_0 is low [compare Fig. 3(a) and (b), which represent these two possi-
bilities]. Therefore, the value of R_0 above its necessary minimum for self-
organization can be used to measure a potential for self-organization. (ii)
However, this parameter, R_0, is not enough. Not all the systems having a high
initial redundancy, i.e. a high potential for self-organization, will have in fact
the same kind of self-organizing properties. The value of the reliability, i.e. the
inertia opposed to random perturbations, will determine differences between
different kinds of functional organizations. A very high reliability will
correspond to a very long phase of growth or increase in H(t); on the
contrary, a small reliability will determine a very short duration for this

FIG. 3. Different kinds of self-organization.

phase, so that H(t) will reach its maximum very fast, and then will decrease
as in any organized non-self-organizing system. Different combinations are
possible: high or low R_0 with high or low reliability, that is, high or low
potential and long or short duration for self-organization [Fig. 3(a), (b),
(c), (d)].
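These four combinations can be illustrated with the exponential special case introduced above; the parameter values below are hypothetical stand-ins, with slow decay rates B_1 and B_2 playing the role of high reliability:

```python
import numpy as np

# Hypothetical parameter sets for the four combinations discussed above:
# high/low potential (R0) crossed with high/low reliability (slow/fast
# decay rates B1, B2), using the exponential special case.
cases = {
    "(a) high R0, high reliability": (0.8, 0.10, 0.02),
    "(b) low  R0, high reliability": (0.3, 0.10, 0.02),
    "(c) high R0, low  reliability": (0.8, 1.50, 0.60),
    "(d) low  R0, low  reliability": (0.3, 1.50, 0.60),
}
Hmax0 = 100.0
for label, (R0, B1, B2) in cases.items():
    H = lambda t: Hmax0 * np.exp(-B2 * t) * (1.0 - R0 * np.exp(-B1 * t))
    arg = R0 * (1.0 + B1 / B2)
    t_M = np.log(arg) / B1 if arg > 1.0 else 0.0  # no rise if R0 too low
    print(f"{label}: t_M = {t_M:6.2f}, H0 = {H(0.0):5.1f}, H_M = {H(t_M):5.1f}")
```

Cases (a) and (c) share the same potential R_0 but differ sharply in t_M, i.e. in the duration of the self-organizing phase.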
The curve of Fig. 3(c) represents a particular case where R_0 is very high
but the reliability is very small, so that H_M is reached very quickly and the
self-organizing properties disappear very fast. As a consequence, the system
does not seem to be self-organizing, even though its initial redundancy,
namely its potential for self-organization, was high. Thus, there are two
possible reasons for organized systems not to be self-organizing: either their
redundancy is too small and random noise-producing factors can only
destroy them, faster or slower according to their reliability, or, in spite
of an initial redundancy large enough, their reliability is not high enough
to allow for self-organizing processes to take place during an appreciable
period of time. In nature, crystals can be viewed as organized systems of
this type: because of their repetitive structure, their redundancy is very
large, but their reliability is low. This is manifested in their need for a
temperature low enough to maintain their structure. The potential barrier
of their bonds must be sufficient to protect the crystal structure from the
effects of thermal noise. In other words, the crystalline structure can be
maintained only if the “dose of errors” is relatively low. If this dose, i.e.
the temperature, increases, it produces a very fast change in structure (the
resistance to random changes is very low) and a phase transition occurs,
at the end of which there is no more crystalline structure. On the contrary,
other repetitive structures, more flexible but less repetitive, like macro-
molecular systems, can maintain themselves at a relatively high temperature,
that is, in spite of a high frequency of random collisions strong enough to
modify their tertiary and quaternary structures. Thus a relatively high
“error” rate is present, but these “errors” do not destroy the whole system
at once. On the contrary, they decrease the redundancy and thus increase
the variety, i.e. the information content of the system; in contradistinction
to the situation in crystals, the phase transition here is not an all-or-none
process. Thus, we can see how production of information under the effects
of noise-producing factors is not as paradoxical and mysterious as it seems.
It is a consequence of “error” accumulation in a repetitive system, which in
addition is functionally reliable enough not to be destroyed at once by a
relatively low number of errors.
In other words, according to this view, the initial phase of increase in
complexity, which is a characteristic feature of growth and maturation of
living organisms, could exist in principle in any structure with a minimum
of redundancy, but would not be observed since it would be much too
short in simple systems which are only redundant. A certain degree of
functional complexity, i.e. reliability, is required to make this phase easily
observable. What is characteristic of living organisms is the existence of
different parameters allowing this phase to be more extended and distin-
guishable. But the principles of organization are always the same: provided
that its initial redundancy and functional reliability are large enough to
allow for an observable period of self-organization, a system can react to
random environmental stresses by an increase in complexity and variety,
so that it appears to adapt itself to its environment, even a new one. Later on,
when the redundancy has been exhausted, the same random environmental
stresses are responsible for ageing and death.
All this can be used to give a more precise and quantitative formulation
to what Von Foerster (1960) once called an order-from-noise principle,
underlying such phenomena as adaptation, non-directed learning and
assimilation which, according to Piaget (1968), are all various facets of what
is called organization. As discussed elsewhere (Atlan, 1972), it can also be
applied to the understanding of the logic of evolution, in which this principle
is obviously at work.

REFERENCES
ASHBY, W. R. (1958). Cybernetica 1, 83.
ASHBY, W. R. (1962). In Principles of Self-Organization (H. von Foerster & G. W. Zopf,
eds), pp. 255-278. New York and London: Pergamon Press.
ASHBY, W. R. (1967). In Currents in Modern Biology, pp. 95-104. Amsterdam: North-
Holland Publ. Co.
ATLAN, H. (1968a). J. theor. Biol. 21, 45.
ATLAN, H. (1968b). J. Geront. 23, 196.
ATLAN, H. (1970). Ann. phys. Biol. Med. 1, 15.
ATLAN, H. (1972). L'Organisation biologique et la théorie de l'information. Paris: Hermann.
DANCOFF, S. M. & QUASTLER, H. (1953). In Information Theory in Biology (H. Quastler, ed.),
pp. 263-273. Urbana: University of Illinois Press.
EIGEN, M. (1971). Naturwissenschaften 58, 465.
GATLIN, L. L. (1966). J. theor. Biol. 10, 281.
LINSCHITZ, H. (1953). In Information Theory in Biology (H. Quastler, ed.), pp. 251-262.
Urbana: University of Illinois Press.
PIAGET, J. (1968). La naissance de l'intelligence chez l'enfant, 6th edn. Neuchâtel:
Delachaux & Niestlé.
ROTHSTEIN, J. (1962). Phil. Sci. 29, 406.
SHANNON, C. E. & WEAVER, W. (1949). The Mathematical Theory of Communication. Urbana:
University of Illinois Press.
THEODORIDIS, G. C. & STARK, L. (1969). Nature, Lond. 224, 860.
THEODORIDIS, G. C. & STARK, L. (1971). J. theor. Biol. 31, 377.
VON FOERSTER, H. (1960). In Self-Organizing Systems (M. C. Yovits & S. Cameron, eds),
pp. 31-50. New York and London: Pergamon Press.
VON NEUMANN, J. (1966). Theory of Self-Reproducing Automata (A. W. Burks, ed.).
Urbana: University of Illinois Press.
YOCKEY, H. P. (1958). In Symposium on Information Theory in Biology (H. P. Yockey, R. L.
Platzman & H. Quastler, eds), pp. 59 and 297. New York: Pergamon Press.
