
Zero News Datapool, MANUEL DE LANDA

MANUEL DE LANDA

INTERVIEW

with Manuel de Landa,


Miss M and Konrad Becker

ECONOMICS,
COMPUTERS AND
THE WAR MACHINE

MARKETS AND ANTIMARKETS


IN THE WORLD ECONOMY




THE GEOLOGY OF MORALS


A Neo-Materialist Interpretation

UNIFORMITY and VARIABILITY


An Essay in the Philosophy of Matter

MESHWORKS, HIERARCHIES
AND INTERFACES

VIRTUAL ENVIRONMENTS
AND THE EMERGENCE
OF SYNTHETIC REASON




An Interview with Manuel de Landa

with Konrad Becker and Miss M.


at Virtual Futures, Warwick 96

Miss M.: What are your projects and general interests?

Manuel de Landa: Generally, I consider myself a philosopher, although I don't have any
credentials; I'm more of a street philosopher. I am very interested in the Internet because
it is an international network of computers which organized itself. It is a very good
example of a process of self-organization: a very complex structure that emerged pretty
much spontaneously out of the activities of many different people, each one trying to do
something, but whose overall effect wasn't intended or planned by anyone. So as a
philosopher, I am interested in all kinds of phenomena of self-organization, from the wind
patterns that have regulated human life for a long time, like the monsoon or the trade
winds, which are self-organized winds, to the self-organizing patterns inside our bodies,
to the self-organizing processes in the economy, to the self-organizing process that
created the Internet. So that is my general philosophy, and I am interested in the Internet
because it is a very concrete example of what I am talking about.

Miss M.: You are presenting a concept which is not quite standard, that of markets and
anti-markets. Could you explain that a little bit?

Manuel de Landa: The reason why the concept of self-organization is not very well
known is that it is only about thirty or so years old. It caused a great revolution in
science, in very different disciplines like physics and chemistry, and the dust is just
starting to settle, so we are only now starting to see what the consequences of this
revolution will be for human societies. One of the areas that will be influenced, in fact
that is already being influenced, is economics, because what we are talking about here
is order that has come about not because someone planned it or commanded it into
existence. We tend to think of everything about human society that has a certain amount
of order as being the result of someone planning it. For instance, the city of Versailles
was perfectly planned down to the last little detail by Louis XIV and his ministers, and
that is our image of what human society is: that everything is on purpose. But there are
collective actions whose consequences are unintended, and whatever order there is in
those collective consequences that no one planned is self-organizing. The clearest
example of that is markets. Let's understand them through a very concrete image:
peasant or small-town markets, a place in town where everybody goes and brings their
stuff to sell or goes to buy something; it meets every week in a certain part of town,
comes apart, and then meets again the following week. In those very specific places,
everybody shows up, and everybody shows up with their intentions: I go there with the
intention to buy, or I go there with the intention to sell. So a lot of what happens is
planned, is intentional, but the overall effect, for instance the price that every particular
commodity happens to go for, is unintended. In a real market, no one actually sets the
price. There is no one buyer or seller who says, "I want this to be the price of this." No
one commands the price; prices set themselves. That's what's interesting about markets:
they provide you with a mechanism for coordinating demand and supply that does not
need a central decider, does not need a centralized agency that does the decision-making.
Out of this decentralized decision-making, order comes out.
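A toy sketch can make this mechanism concrete. The following simulation is not from the
interview, and its numbers and variable names are purely illustrative assumptions; it only
shows how a going price can settle itself when many buyers and sellers each act on their
own private valuations and the price is nudged up by excess demand and down by excess
supply, with no central price-setter.

    import random

    random.seed(0)

    # 100 buyers and 100 sellers, each with a private reservation price
    buyers = [random.uniform(8, 12) for _ in range(100)]   # the most a buyer will pay
    sellers = [random.uniform(8, 12) for _ in range(100)]  # the least a seller will accept

    price = 5.0  # deliberately start far from any sensible value
    for week in range(50):  # the market meets once a week
        demand = sum(1 for b in buyers if b >= price)   # buyers willing to buy at this price
        supply = sum(1 for s in sellers if s <= price)  # sellers willing to sell at this price
        # excess demand nudges the going price up, excess supply nudges it down;
        # no participant ever decrees the final figure
        price += 0.01 * (demand - supply)

    print(f"price after 50 weeks: {price:.2f}")  # settles near 10, though no one set it

Nothing in the loop knows that the balancing price is roughly ten; it emerges from the
interactions alone.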

This is not a new idea, of course. Adam Smith, at the end of the eighteenth century,
came up with the idea of "The Invisible Hand," which was supposed to explain how
markets are organized. My point of view is that those theories are obsolete, that only
with the new conceptual technology, the concepts of self-organization developed in the
last thirty years, can we understand how markets actually work. So that is one way in
which these new theories will affect our lives, allowing us to understand better how
economies work. On the other hand, another problem with the original Adam Smith idea
was not so much that it was too simple, but that it applied the term "market" to things
that were not self-organized. All the way back to Venice in the fourteenth century,
Florence in the fifteenth, Amsterdam in the eighteenth, London in the nineteenth, in other
words throughout European history, beside these spontaneously coordinated markets
there have been large wholesalers, large banks, foreign trade companies and stock
markets that are not self-regulated; these are organizations in which, instead of prices
regulating things, commands do. Everything is planned from the top and more or less
executed according to plan; everything is more or less intended. There is very little
self-organization going on at all. And indeed, these large wholesalers, these large
merchants, large bankers and so on, made the gigantic profits they made and became
capitalists thanks to the fact that they were not obeying demand and supply, they were
manipulating demand and supply. For example, instead of the peasant who shows up at
the market to sell a certain amount of corn, here you have a wholesaler with a huge
warehouse where he stores all the corn he can. If the prices are too low, he can always
withdraw certain amounts from the market, put them in the warehouse, and artificially
make the prices go up. When the prices go up, he then sells the rest of the corn at these
high prices and makes a lot of money. But, of course, he is manipulating demand and
supply. He is not being governed by these anonymous forces. He is not subject to
self-organization; he is organizing everything in a planned, cunning way. And so, because
economists use the word "market" to describe both, that is one of the main confusions I
see in contemporary thought.

We need another word to describe these organizations that are large enough to
manipulate markets. A word has been suggested by the historian Fernand Braudel, and it
is a very simple one: "anti-market." Why? Because they manipulate markets. And so
today, in the United States, there is a very strong political movement, mostly on the right
wing, and Newt Gingrich is perhaps its best-known politician, which is trying, as they say,
to shrink the size of government and let market forces have more room to operate. But,
of course, translated into the terms we've just introduced, what they really want to do is
let anti-market forces run wild. They don't really want small producers and small
manufacturers and bakers and printers and mom-and-pop shops to have more room to
manoeuver and make money. They want national and international corporations to have
more room to manoeuver. They want to shrink government so that there are fewer
regulations to keep international and national corporations from doing what they want.
But if you go and study one of these corporations, rather than looking like markets, they
are like mini Soviet Unions. I mean, everything is planned in these corporations. The
managerial hierarchies are exactly like the hierarchies in the Soviet Union: they plan
everything, prices play a very small role, and most of the organization is done via
command.

Now, we used to call the Soviet Union a "command economy," and we still call China a
command economy. Well, international corporations and national corporations are indeed
command economies. They have very little to do with prices. In the past they did have
something to do with prices, because either their suppliers or their distributors were little
guys; they had to deal with prices one way or another. But since the nineteenth century,
at least in the United States and I'm sure in Europe, a lot of organizations have been
internalizing: buying their suppliers and their distributors and making them part of
themselves, kind of eating them and digesting them and incorporating them into their
own tissue. The more they do that, the more they internalize these little markets, the
less prices play a role in their coordination and the more commands do. I guess a good
image for this is that the United States is far from being a free-enterprise economy; it is
an economy run by multiplicities of little Soviet Unions.

Miss M.: Coming back to the Internet, how would you apply that concept to the Internet?

Manuel de Landa: Well, the Internet, precisely because it is a self-organizing structure,
benefits in the first place small producers, in this case small producers of text, since that
is pretty much the only thing you can sell now on the Internet. I mean, you can sell a few
services, searching and Federal Express-style package tracking and so on, but really the
main commodity on the Internet right now is text and images. So the Internet, by its very
nature, benefits small producers of content. However, right now we are at a stage in its
development where anti-market forces are making an entry into the Internet in a big way.
Not only Microsoft, not only America Online, not only AT&T and so on, but all kinds of
corporations have their own web pages and are utilizing the web. So now you have
advertising as a way of reaching consumers more directly. So the Net could be changed
in a few years from a self-organized entity into a planned entity. I don't have any
romantic views about the Internet being able to resist all the forces of planning.
Self-organizing forces can be overrun by these planned forces because they are so
gigantic. The Internet serves a lot of functions; let's just talk about its economic
functions. The moment that electronic cash and cryptographically secure methods of
sending credit card numbers become established, become standard software, the
Internet is going to be a place to do business. So the question is how we have to guide
its evolution so that it ends up benefiting small producers of content instead of the people
who already own the infrastructure: the telephone companies, the cable companies and
so on, which can also own the content-producing companies, since they can always
internalize them. The content-producing companies, let's say Yahoo, which is an
index-producing company, even if they've emerged in a market way, as spontaneous,
entrepreneurial and small, can be bought by Microsoft any time they want. They haven't
bought them, but they could buy them. And so the point is that the Internet is not
protected by its self-organization; it could be swallowed up at any time. So the question
is what can we do, not just you and me, but everybody who is involved in the Internet.
The thing is that in the next few years the transactional structures will be defined, that is,
what kinds of transactions will happen over the Internet.

To give you a very concrete example: if the standards that are settled on are such that
the minimum transaction is five dollars, much like there is a minimum transaction for
credit cards now, that will benefit large producers of content. The only way it will benefit
small producers of content, independent freelance writers and so on, is by making the
minimum transaction very, very small. We need to make the infrastructure of the Internet,
and the software that will be running these transactions, capable of tracking one-cent
transactions, so that small producers of content can sell one page of their essays. So
you can have your home page and people don't have to buy your whole essay in
advance. They can read the first page, you can charge them one cent, and if they want
to buy the whole essay because they are interested, then they buy the rest. Again, this is
a very simple example, but the minimum transaction allowed by the software (the
standards will be decided in two or three years and they will be written in stone) is
something that matters. To benefit small producers you need to allow very small
transactions. If the minimum transaction is large, you will automatically be benefiting
large producers of content.
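A minimal sketch of the arithmetic behind this point may help. It is not a real payment
protocol; the per-page price, the single sampled page and the function name are
illustrative assumptions. It only shows how the minimum-transaction size by itself decides
whether page-at-a-time selling is viable.

    def charge(pages_read, price_per_page, minimum_transaction):
        """Amount billed to a reader who samples an essay page by page."""
        raw = pages_read * price_per_page
        # if the network's smallest allowed transaction exceeds the raw amount,
        # the reader must still be billed at least the minimum
        return max(raw, minimum_transaction)

    price_per_page = 0.01  # one cent per page, as in the example above

    # reading a single first page before deciding whether to buy the rest
    print(charge(1, price_per_page, minimum_transaction=0.01))  # 0.01: sampling is viable
    print(charge(1, price_per_page, minimum_transaction=5.00))  # 5.0: one page costs five dollars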

So issues like this will be decided in the next few years. Another issue that will affect the
economics of the Internet, and whether market or anti-market forces end up winning, is
the main scarce commodity on the Internet at the moment: bandwidth, the amount of
information which can run through the channels. Right now, if you have a regular modem,
like a 14.4 modem, you know how long it takes for every web page to come into your
system. That is precisely because bandwidth is scarce. It is a trickle of bits that is going
through this channel, so by the time it gets to your computer, it takes forever to refresh
the screen. For the Internet to become a place where content can be transacted
commercially, bandwidth needs to be plentiful. The more bandwidth there is, the more
complicated your documents can be and the more complicated the kinds of services you
can offer over the Internet. Right now, the amount and the kinds of services you can offer
are very limited by the bandwidth. So bandwidth is a key issue.

Now there are several ways of getting cheap bandwidth. One is this: telephone
companies own a very large portion of the fiber-optic cable that runs underneath much
of the United States, and large parts of Europe too. Fiber optics, as opposed to copper
cable, can transmit an enormous number of bits, on all kinds of parallel channels at the
same time. So, yes, fiber optics is one solution to the bandwidth problem. But, now, how
do we use it? One solution, proposed by a right-wing economist, is to let the gigantic
telephone companies that own the fiber-optic cables merge together and with the cable
companies that own the copper which goes from the end of the fiber to the home. Right
now, the telephone companies own one part of the thing and the cable companies own
another part, and if we allow them to come together into a huge anti-market institution,
they would give us cheap bandwidth. But at what price? At the price of one anti-market
institution owning the entire infrastructure of the Internet. That would not be good.
Because at any time in the future, whenever they wanted to, they could make the
Internet asymmetric: allowing more bits into your computer than you can send out. Right
now the Internet is pretty much symmetric: you can consume bits, but you can also
produce bits, and that is what makes it unique, that it is a two-way system, as opposed
to television, which is a one-way channel. The telephone companies and the cable
companies could pretty much own the infrastructure of the Internet and, at any point they
wanted to, change it into an asymmetric design in which there are more bits coming in
for you to consume, transforming it again into a consumer medium, and very few bits
going out, so that if you wanted to produce, you could produce, but only painfully.

So, obviously, that's not the right way of getting cheap bandwidth. The best way to get
cheap bandwidth is for the government to force the telephone companies to rent
fiber-optic space to independent little companies. This is something the government is
going to have to understand: a big monopoly made up of a fusion of telephone
companies and cable companies would simply become a gigantic entity more powerful
than the government itself. The only way to dissipate this danger is precisely by letting
small producers of content rent their own space on this fiber, by forcing the telephone
companies to rent it out. So they would still own it, but they would not control it. We
would then be able to get cheap bandwidth without the danger of one huge monolithic
company owning the guts of the Internet.

Miss M.: But looking at the situation now, we are already confronted with gigantic
monopolies on the Internet. One example is Microsoft, who are now developing their own
web browser. And Netscape, who were for some time looked at as the "good ones"
because they were not Microsoft, are now becoming the "bad ones" because they are
starting to monopolize as well. We already have the monopolies.

Manuel de Landa: Yes, absolutely. I said in the beginning that we are at a threshold now
where anti-market forces are about to enter the Internet big-time. The question is
whether all the grass-roots parts of the Internet, the mostly grass-roots network of
bulletin boards, the different European parts of the Internet, are robust enough, strong
enough, to resist the attack. I do not believe that the fact that Microsoft is extremely
powerful necessarily means that it will win the war. It certainly has more chances of
winning it than any one of us has, but the question is whether those areas of the Internet
which were of grass-roots origin will be able to stay there and sustain the spontaneity
and originality of the Net, and whether the Microsoft Network will simply become a kind
of fancy neighborhood like America Online, with doors to the Internet but not really
owning the Internet.

On the other hand, what could happen is that what we know as the Internet will become
a sort of ghetto: it still survives, but it is surrounded by the fiber-optic infrastructure of
what they call the Information Highway, which is owned by the Microsofts and the MCIs.
All the business and secure transactions would be conducted around there, and one
would be offered entry into this ghetto where all the artists and bohemians are. The
Internet becomes this Greenwich Village where you go for a coffee, where you go to
hang out with the hip people, and then you go back to where the newer fiber-optic
networks are to do transactions, to do business.

The game is not won by either side. If we could manage to force into the standards of
the Internet that small transactions be allowed, it would be a victory for all of us because
it would benefit small producers of content. But that would not be winning the war. This
is going to be one little battle at a time.

Miss M.: I would like to bring in Konrad Becker. You are proposing this theory of
propaganda, that the Internet is a tool of propaganda. How can we merge the two things,
the monopolies on one side and the propagandistic possibilities on the other, with the
monopolies controlling the Internet?

Konrad Becker: I do see the Net as a propaganda tool, or as a controller of intellectual
stimuli, in different ways, and it has its origins in the military apparatus. Hypermedia as
such is an invention of the military apparatus for dealing with complex information
structures. So we learn today that information as such is more or less a myth;
"consensual hallucination" is the generic term for it. The possibilities of reaching people
through networks are quite tremendous; there is a lot of talk that the Third World War will
be an information war. We are getting to the point where there is non-lethal warfare, that
is, propaganda warfare waged through networks.

Miss M.: Manuel de Landa, you are an expert on warfare.

Manuel de Landa: I can see his point very well. Indeed, just the enormous increase in
advertising you see on the Web points in that direction. Again, to me the key thing here
is whether we keep the network symmetric or asymmetric, whether the amount of bits
that can come into your computer is the same as the amount that can leave. If it
becomes asymmetric, if more bits can come in than can go out, then its transformation
into a propaganda tool is already a reality. If it becomes a consumer tool and big
producers of content simply send you information to consume, that's it: that is
propaganda, that is a means of control. It is much better than television, it is much more
pointed. It is like personalized television: they know who you are, they know your tastes,
they may even be able to track your Web surfing and make consumer profiles out of it,
and therefore put you on a mailing list so that very specific people can send you
information.

On the other hand, if the cables and the connections are kept symmetric, so that criticism
can leave the terminal as propaganda comes in, then we have a fighting chance. That
doesn't mean we will win the war or have utopia or anything. What we can hope for is
having a fighting chance. That's what would make the game still interesting. It would
make it interesting for all of us critics, for social critics, to be engaged, if we feel we still
have a fighting chance. I believe it all boils down to hardware. If the hardware remains
symmetric, and again that is not a necessity, not something that will happen on its own
but something that could change, then we have a fighting chance, because we can fight
propaganda with criticism.

Miss M.: Konrad, do you think that being able to criticize is a remedy against
propaganda?

Konrad Becker: Well, actually, that could be a trap as well. As Manuel de Landa
explained in his lecture, criticism is part of the scheme. So it is easy to fall into that
hypnotic situation where you have an oscillating paradox: on one side you have right
wingers and on the other side you have left wingers, and you are in the middle, totally
snake-charmed. I sometimes feel that propaganda is a secretion of a higher order. It is
natural for polar theoretical positions to battle each other, which doesn't change the
course of history. To find means of escape from this hypnotic lock, this propaganda lock
of the informational organism or info-body, would be the way to go. This heterogenic
activist situation that you find in the early Internet is one of the very interesting points
where there could be a window of opportunity to break the gridlock.

Manuel de Landa: I absolutely agree. Right now, everybody and their mother calls
themselves "critical." You go to New York City bookstores and there are huge sections of
bookshelves called "Critical Theory." Of course, those theories are the most uncritical in
the whole world. They call themselves critical but in the end are so uncritical because
they take for granted all kinds of assumptions, they don't really criticize themselves, and
so forth. What I meant when I said "criticism as an antidote to propaganda" was a new
type of criticism that is much more theoretically grounded and goes beyond the fake kind
of criticism, the dogmatic criticism, that we have become used to. We cannot criticize the
Internet by simply calling it a "capitalist tool" or a "bourgeois tool." That was the standard
way in which Marxists criticized things in the past, attaching "bourgeois" to everything
they wanted to criticize and, bingo, you had instant criticism like you had instant coffee.
Obviously, that kind of criticism is not going to do anything, and indeed has become a
kind of propaganda itself; I mean, you criticize to propagandize your own idea. The
question in front of us now as intellectuals is whether we've inherited so much bullshit
that our criticism is bound, is condemned, to be ineffective, or whether we can find ways
out, escape routes out of this, and create a new brand of criticism that recovers its teeth,
its ability to bite, its ability to intervene in reality in a more effective way. Again, it would
imply a collective effort by a lot of intellectuals who are fed up with what has been
labeled criticism in the past and is nothing but dogma and repetition, and who come up
with a new brand of criticism that is capable of fighting propaganda.

Konrad Becker: The right-wing idea of the "Invisible Hand" you mentioned is a good
point to make. And you mentioned the "commodifiers" on the other hand. You were
focusing more on the faults of the "Invisible Hand" than on the faults of the
"commodifiers." One mistake, as I did mention, is the trap that one could fall into. Could
you please specify?

Manuel de Landa: We need to distinguish true criticism from false criticism, or at least
criticism that more or less gets it right from criticism that is pure dogma. Belonging to the
left, as I suppose we do, we are more affected by the kind of dogmatic criticism labeled
"leftist" than by the "rightist" kind. Most people who are radicals or artists or rebels in
some way would hardly ever use the term "free enterprise" to talk about the United
States. We are not affected by that dogma, but we are affected by another dogma, which
is Marxist dogma. Going back to the distinction between markets and anti-markets, the
problem with Marxist dogma is that it fails to distinguish between self-regulating
economies, where there is no economic power, and the anti-market institutions, where
economic power is the center and the reason for being of those institutions. Failing to
distinguish that means that, for a Marxist, the very fact that an object acquires a price,
that an object goes into a market to be sold as a commodity, is a bad thing. Remember
that Marxists proposed that the solution to our problems is Socialism, and that Socialism
is a society where commands have replaced prices completely. But if I am right that
these national and international institutions are mini Soviet Unions, mini socialisms minus
the idealism, and that they too have replaced prices with commands, only on a smaller
scale, then obviously Socialism is not the solution. Socialism is indeed an intensification
of the trend that is enslaving us.

When the Soviet Union dissolved a few years ago, people in Poland and Czechoslovakia
began talking about the transition to a market economy. Everybody was talking about
how hard it was to make a transition to a market economy. But look at what they were
trying to do: they weren't trying to make a transition from a planned economy to one of
many small, dispersed producers. No, they were trying to make a transition to a few
large enterprises which were not owned by the government anymore but were still large,
run by a hierarchy of managers, with everything commanded and everything planned. In
other words, they were trying to imitate the United States; that is, they were trying to
imitate an anti-market economy. However, because of the dogma we have inherited, we
accept immediately that their intention was to move to a market economy.

What I am trying to say is that one of the obstacles to thinking straight in this regard is
the idea that the very entry of an object into a price system is a bad thing. When you
have an object, you give it a price and sell it in a market, it becomes a commodity. The
process by which an object acquires a price and enters a market is called
"commodification." It really should be "commoditization," but "commodification" is the
word that stuck in the Seventies. Today it has become a cliché, a thing you repeat without
thinking. You believe you are criticizing the system when you say something has become
commodified but, if I am right, you are not really saying anything. If we are supposed to
distinguish markets from anti-markets, then to say that something has become
commodified doesn't even begin to say anything. We still don't know if it entered the
market as a free commodity, so to speak, which will be affected by supply and demand,
or if it entered the market as a manipulated commodity. When we think of the ideological
effects of commodities, we tend to think of planned obsolescence: creating consumer
products that are planned to break down so you have to buy more consumer products.
Or we tend to think of tailfin designs like those of the 1950s, when they weren't making
innovations on cars, they were just putting on larger and larger tailfins. But these kinds of
innovations, these cosmetic innovations, aren't market innovations at all; they are
anti-market commoditizations. If we were to use that term in any way that would be
meaningful, it would be to refer to certain products of anti-markets which are specifically
planned and designed to manipulate consumer needs. Think of McDonald's burgers. The
moment you have a burger war on TV between McDonald's and Burger King, or the
famous cola wars between Pepsi and Coke, those are commodities in the bad sense, in
the Marxist sense: objects that have zero use value, that are nothing as technological
innovations; they are pure cosmetics, pure simulacrum, pure consumerism. But, again, if
you trace those products to their source, you will find that in the majority of cases they
come from anti-markets. And if you trace technological innovations, from the steam
engine to all the little machines and procedures that were needed for the Industrial
Revolution, electricity and so on, they will almost always have come from small
producers. The majority, again.

Konrad Becker: The subtitle of one of your essays, "The Geology of Morals," which I
very much appreciated, is "A Neo-Materialist Interpretation." How do you interpret
"Neo-Materialism"?

Manuel de Landa: Obviously, I put the word "neo" there to distinguish it from Marxist
materialism. The only good thing Marxism ever gave us was its materialism, the idea that
we need to explain the things that happen right here without appeals to God, without
appeals to Platonic essences, without appeals to anything transcendental. Of course,
Marxists, to the extent that they bought Hegelian dialectics, were never really true
materialists, since the dialectic is not at all something of this world. But that was the good
thing about Marxism. Lenin, as much as I hate many of the things that he did, was not so
bad as a philosopher. He resisted idealistic philosophies in which everything is just our
perception of it, by insisting that there is a material world out there and we need to take
it into account: not everything is information, not everything is ideas, not everything is
cerebral stuff. There is also concrete, fleshy stuff that is very important to consider.

I call it "neo-materialism" because it is a form of the old philosophy called materialism,
but it is new in that, by incorporating theories of self-organization, it recognizes that
matter and energy themselves, without humans and even without life, are capable of
generating order spontaneously. This can be seen in lava, or in winds, or in many other
phenomena on our planet. For instance, a hurricane is considered by meteorologists to
be a self-assembled steam engine, running air through a cycle of hot and cold until it
runs out of energy, and it keeps its shape long enough for us to name it. We give
hurricanes names, and we ran out of women's names so now we're giving them men's
names. We give them names because they are creatures which inhabit the atmosphere.
However, these are creatures that create themselves. They don't have genes, they don't
have anything that tells them what to do. They are completely spontaneous creatures.

Now, neo-materialism means acknowledging that we have been neglecting matter for a
long time, all the way back to Aristotle. Aristotle already separated formal causes from
material causes in his classification of causes, a classification that has stayed with
Western philosophy all the way up to the twentieth century. All the way back to the
Greeks, matter was seen as an inert receptacle for forms, and humans came up with
these forms that they imposed on this inert receptacle. This has certain class or caste
origins, because all the way back to the Greeks, the blacksmith, the guy who worked on
matter or metals, lived outside the city, spent all day in front of the fire dealing with
metals, and most importantly, didn't come to the agora to talk. So the citizens of Greece
didn't trust the blacksmith: "He doesn't talk. He doesn't come here and blabber."
Blacksmiths were always slaves or ex-slaves, and manual labor, even if it is crafty
manual labor, be it women cooking in the kitchen or blacksmiths working with metal or
folk artists creating objects, was considered a secondary activity, a lower activity.
Intellectual activity, thinking with concepts, was the real activity. Neo-materialism means
recovering the sense that we have been neglecting these people for a long time, that
there is a very interesting form of knowledge that relates to this skill of dealing with
matter, in the kitchen, in the blacksmith's shop, in the carpenter's shop. To deal with
representations and concepts and mental stuff is interesting too, but this direct, sensual
knowledge of matter is just as interesting.

This has a lot to do with the Internet, because the people who created the personal
computer, and without personal computers we would not have the Internet, were Steve
Jobs and Steve Wozniak from Apple. Steve Jobs was the mental guy, and Steve
Wozniak, someone who thought and spoke very little, was the blacksmith of our time,
who simply had a sensual love of chips and a neurotic approach to them. He collected
chips, he kept chips under his bed, and he created Apple even though he didn't have an
engineering degree. He didn't have any formal training, yet he created the first Apple II
out of this sensual knowledge of electricity, of electronics and of chips, and began the
revolution, if it does indeed turn out to be a revolution. So sensual knowledge is not
something we left in the past, in the age of craft activity. It is something that is still very
much with us. If you study the history of the transistor, even though it uses some
quantum theory, it uses it after the fact. If you look at photographs of the first transistor,
it is just a chunk of silicon with things stuck in it; it looks so funky, so home-made. It was
home-made. These guys were tinkering with matter. They were trying to understand
matter by tinkering with it instead of imposing a pre-conceived idea or form on it.

Konrad Becker: I agree that Aristotle has the position of a creep in history. But isn't the
blacksmith traditionally associated with the position of the shaman, the one tinkering with
matter but also tinkering with concepts in a way? He was also a psychedelic expert, as
far as I understand. So, going back to the military complex: we had a discussion the
other day where we noted that the huge American military complex is rather helpless
against people who believe that if they strap a bomb onto their body and walk into a
building, they will go straight to heaven. There is something of immaterial value in there
on a very practical basis. On the other side, we see an immaterialization of the economy
as such, starting from the basic fact that the value of money is pulsed through magnetic
highways around the globe. What about the other side of materialism? Don't you think it
is becoming more and more important?

Manuel de Landa: Absolutely. The moment in Amsterdam in the eighteenth century when
paper money for the first time became as important as metallic money, there was a
de-materialization right there. All of a sudden, money became purely information. Dollar
bills came to have value because of an effect of confidence: as long as everybody
believed you could go to the bank and get your gold's worth, everything worked just
perfectly. So I agree with you: on top of the material layer, we have been building these
virtual layers, one at a time. Virtual money, and we don't have to wait for e-cash, paper
money is already virtual money. Knowledge has become an input to production, probably
as important as the material inputs, the energy, the labor, the raw materials that go in.
So I completely agree with you. My point is not that matter and energy are the whole
thing, but that they remain the basis of society. The United States and Europe consume
a much greater amount of energy, in the form of electricity, than they have ever
consumed in the past. If you go back all the way to the Industrial Revolution, the
increasing consumption of energy is a steep curve and there is no end in sight. My thesis
is that all these virtual layers we've built since the Industrial Revolution rest on an
energetic basis that has become more and more important. We became aware of this
during the 1973 oil crisis, when we realized how much our entire society depended on
cheap energy, on being able to buy cheap oil from the Arabs. The moment the Arabs
said, "Well, we are not going to sell oil that cheap anymore," there was a big crisis. If
everything had become virtual already, we would not have had that crisis; it would have
been very easy to switch to another energy source, or to use our knowledge to do
something else. The point of the matter is that we are much more primitive than we think
we are, much more material. This translates immediately to the fact that, for all the virtual
stuff that happens on the Internet, all the web pages and links and so on, there is still all
the hardware that forms the basis of it, all of which runs on electricity. In other words,
there is this material basis that is so humble that we take it for granted, because it is not
the exciting part; the exciting part is the virtual level. But we shouldn't take it for granted,
because if our philosophy is to get it right tomorrow, we need to always acknowledge this
material basis on which we operate.

Just to backtrack a bit to the de-materializing tendency of current textual theory: the
Sixties in France were the great period of virtualization. Everything became text. Kristeva
and Derrida and so on were just talking about intertextuality. Even the weather doesn't
exist; it is what we make of it, what we interpret of it. Everything became virtual in a way.
Baudrillard says that everything is just simulacra, just layers of neon signs on top of
layers of television images on top of layers of film images, and more and more virtual
stuff, computer games and simulations. We need an antidote to that. We need to
acknowledge that we've built these layers of virtuality and that they are real, really virtual;
they might not be actual, but they are still real. But all of them are running on top of a
material basis that is ultimately the source of power and the basis of society.

Konrad Becker: What you were proposing in "The Geology of Morals" is that if you
extend that metaphor further, you could go into dangerous waters. There have been
proposals along that line from various corners, from the Gaia hypothesis to the
bio-organism theory. Is there a certain point where you would say you wouldn't want it to
be interpreted in this way?

Manuel de Landa: Absolutely. Don't call me Gaia. The Gaia hypothesis is a very
interesting point. I am anti-Gaia in that it is such a romanticization of our planet. But the
romantic aspect isn't necessarily that bad, because New Age people are at least
motivated by something, trying to save Gaia, and perhaps that will have an impact on the
ecological movement by recruiting more people. We have very serious ecological
problems, all of which are material, none of which are virtual, all of which have to do with
the fact that we never learned how to deal with our material fluxes. We know how to deal
with the fluxes that come in, the water that comes in, the electricity that comes in, but all
the residues and all the garbage and all the material fluxes that go out simply just go out.
We send all this carbon dioxide into the atmosphere, and we pollute our seas and our
rivers, because we are ignoring certain material fluxes that are still operating in our
material world. So the Gaia hypothesis, romantic as it might be, to the extent that it
inspires a few New Age people to become more actively involved in environmental
things, might be a good thing.

Philosophically, though, it is a terrible mistake. It is a terrible mistake precisely in the
neo-materialist sense, because it takes the metaphor of the organism, it sees life, living
flesh, as the most magical thing that happened on this planet. This is of course a
chauvinism, a kind of organic chauvinism on our part. It takes the metaphor of the
organism and applies it to the whole planet: now the whole planet is alive, that is what
Gaia is. Not only do you call it an organism, you also give it a goddess name, just to
make sure you are ridiculous enough. The way out of this is to think that the planet is
indeed something special, but that it is what Deleuze and Guattari called a body without
organs, which is the exact opposite of an organism. It is a cauldron or receptacle of
non-organic life, a body without organs, because it can be alive in the sense of being
creative and generating order without having genes or organs, without being an
organism. In my view, the very fact that the atmosphere connected with the hydrosphere
can generate things like hurricanes and cyclones and all kinds of self-organizing entities
means that the planet, even before living creatures appeared, was already a body
without organs, a cauldron of creativity, a receptacle of spontaneously emerging order.

Compared to the atmosphere, compared to the freedom of a hurricane, an organism,
even though it is more complex and lasts much longer, has more command components
than self-organizing components. The command components come from the genes. Your
genes are the ones that tell your flesh, "You are human, you are not a dog, you're not a
monkey, you're human and your shape is like this. You don't have six fingers or seven
fingers, you have five fingers. If you have six fingers, you're a freak, you belong in a
circus, you're not a real human."

Genes are what keep this flesh, which is self-organizing and full of non-organic life from
within, in order. And the genes are, of course, completely hierarchical: the genes within
the cell obey the cells within the tissue, which obey the organs, which obey the
organism. We are like perfect little hierarchical creatures in which the command
component is much higher than the self-organizing one, whereas the atmosphere is
much freer; it is really pure self-organization, without any command component at all,
because there are not even genes there to tell it what to do. And so to call the planet
Gaia, to make it an organism, besides being romantic and kind of cheesy, is a
philosophical mistake from the strictly materialist point of view.

Konrad Becker: Yes, but don't you think that the boundary between animate and
inanimate matter is getting fuzzy, and that the famous virus is a borderline product where
we are not sure if it is life?

Manuel de Landa: Absolutely. The lines are fuzzy, but only when you compare viruses
with, let's say, crystals. Crystals grow by replication, and viruses also grow by replication;
in a way, a virus is just a fancy crystal. But when you compare creatures farther away
from the border, say what we call higher animals like ourselves, you can see the
difference from hurricanes. The command component in the mixture increases with
evolution; the higher you get in the evolutionary tree, the larger the command
component. We call it "higher," but we are not any higher than bacteria, we just happen
to believe we are special. So to use the organism as a metaphor is wrong.

Konrad Becker: But would you consider the possibility of non-organic life?

Manuel de Landa: Yes, I believe a hurricane represents a form of non-organic life. It
lasts long enough for us to give it a name. It assembles itself. It's not living in the sense
that it doesn't breathe, but to ask it to breathe would be to impose an organic constraint
on it. The thing doesn't have to breathe, it doesn't have to have a pulse. Even then,
certain winds do breathe, say the monsoon, the wind that is most prevalent on the
southern coast of Asia. It is a perfectly rhythmic creature: it blows in one direction for six
months of the year and in the other direction for the other six months, and every
sea-faring people in Asia that made a living from the sea had to live with the rhythm of
the monsoon. The monsoon gave those cultures their rhythm. If you want to go that way,
well, you have to go that way in the summer, then you get there and you have to wait for
the winter to come back. You have to plan your life to that rhythm. So the monsoon is a
self-organizing entity, there is no command component at all, it is non-organic life, and it
is a pulsing non-organic life. It even has the beat that we tend to associate with hearts.

As it turns out, when the theorists of self-organization studied cardiac tissue, they
discovered that you can take the cells of the heart, put them in a little flat plate, and
they'll keep beating. In other words, what is making the heart beat is a process similar to
what is making the monsoon beat. It has very little to do with genes: the flesh itself can
beat, and non-organic life has this breathing rhythm all to itself.




Economics, Computers and the War Machine.

by Manuel De Landa.

When we "civilians" think about military questions we tend to view the subject as
encompassing a rather specialized subject matter, dealing exclusively with war and its
terrible consequences. It seems fair to say that, in the absence of war (or at least the
threat of war, as in the case of government defense budget debates) civilians hardly
ever think about military matters. The problem is that, from a more objective historical
perspective, the most important effects of the military establishment on the civilian world
in the last four hundred years have been during peacetime, and have had very little to
do with specifically military subjects, such as tactics or strategy. I would like to suggest
that, starting in the 1500s, Western history has witnessed the slow militarisation of
civilian society, a process in which schools, hospitals and prisons slowly came to adopt
a form first pioneered in military camps and barracks, and factories came to share a
common destiny with arsenals and armories. I should immediately add, however, that
the influence was hardly unidirectional, and that what needs to be considered in detail
are the dynamics of complex "institutional ecologies", in which a variety of organizations
exert mutual influences on one another. Nevertheless, much of the momentum of this
process was maintained by military institutions and so we may be justified in using the
term "militarisation".

On one hand, there is nothing too surprising about this. Ever since Napoleon changed
warfare from the dynastic duels of the eighteenth century to the total warfare with which
we are familiar in this century, war itself has come to rely on the complete mobilization of
a society's industrial and human resources. While the armies of Frederick the Great
were composed mostly of expensive mercenaries, who had to be carefully used in the
battlefield, the Napoleonic armies benefited from the invention of new institutional
means of converting the entire population of a country into a vast reservoir of human
resources. Although technically speaking the French revolution did not invent
compulsory military service, its institutional innovations did allow its leaders to perform
the first modern mass conscription, involving the conversion of all men into soldiers and
of all women into cheap laborers. As the famous proclamation of 1793 reads:

"...all Frenchmen are permanently requisitioned for service into the armies. Young men
will go forth to battle; married men will forge weapons and transport munitions; women
will make tents and clothing and serve in hospitals; children will make lint from old linen;
and old men will be brought to the public squares to arouse the courage of the soldiers,
while preaching the unity of the Republic and hatred against Kings." {1}

This proclamation, and the vast bureaucratic machinery needed to enforce it, effectively
transformed the civilian population of France into a resource (for war, production,
motivation) to be tapped into at will by the military high command. A similar point applies
to the industrial, mineral and agricultural resources of France and many other nation
states. Given the complete mobilization of society's resources involved in total war it is
therefore not surprising that there has been a deepening of military involvement in
civilian society in the last two centuries. However, I would want to argue that, in addition
to the links between economic, political and military institutions brought about by war
time mobilizations, there are other links, which are older, subtler but for the same reason
more insidious, which represent a true militarisation of society during peace time. To
return to the French example, some of the weapons that the Napoleonic armies used
were the product of a revolution in manufacturing techniques which took place in French
armories in the late eighteenth century. In these armories, the core concepts and
techniques of what would later become assembly-line mass production were first
developed. The ideal of creating weapons with perfectly interchangeable parts, an ideal
which could not be fulfilled without standardization and routinization of production, was
taken even further in American arsenals in the early 19th
century. And it was there that military engineers first realized that in practice,
standardization went hand in hand with replacement of flexible individual skills with rigid
collective routines, enforced through constant discipline and monitoring.

Even before that, in the Dutch armies of the sixteenth century, this process had already
begun. Civilians tend to think of Frederick Taylor, the late nineteenth century creator of
so-called "scientific management" techniques, as the pioneer of labor process analysis,
that is, the breaking down of a given factory practice into micro-movements and the
streamlining of these movements for greater efficiency and centralized management
control. But Dutch commander Maurice of Nassau had already applied these methods to
the training of his soldiers beginning in the 1590s. Maurice analyzed the motion needed
to load, aim and fire a weapon into its micro-movements, redesigned them for maximum
efficiency and then imposed them on his soldiers via continuous drill and discipline. {2}
Yet, while the soldiers increased their efficiency tremendously as a collective whole,
each individual soldier completely lost control of his actions in the battlefield. And a
similar point applies to the application of this idea to factory workers, before and after
Taylorism. Collectively they became more productive, generating the economies of scale
so characteristic of twentieth-century big business, while simultaneously completely
losing control of their individual actions.

This is but one example of the idea of militarisation of society. Recent historians have
rediscovered several other cases of the military origins of what were once thought to be
civilian innovations. In recent times it has been Michel Foucault who has most forcefully
articulated this view. For him this intertwining of military and civilian institutions is
constitutive of the modern European nation-state. On one hand, the project of
nation-building was an integrative movement, forging bonds that went beyond the
primordial ties of family and locality, linking urban and rural populations under a new
social contract. On the other, and complementing this process of unification, there was
the less conscious project of uniformation, of submitting the new population of free
citizens to intense and continuous training, testing and exercise to yield a more or less
uniform mass of obedient individuals. In Foucault's own words:

"Historians of ideas usually attribute the dream of a perfect society to the philosophers
and jurists of the eighteenth century; but there was also a military dream of society; its
fundamental reference was not to the state of nature, but to the meticulously
subordinated cogs of a machine, not to the primal social contract, but to permanent
coercions, not to fundamental rights, but to indefinitely progressive forms of training, not
to the general will but to automatic docility... The Napoleonic regime was not far off and
with it the form of state that was to survive it and, we must not forget, the foundations of
which were laid not only by jurists, but also by soldiers, not only counselors of state, but
also junior officers, not only the men of the courts, but also the men of the camps. The
Roman reference that accompanied this formation certainly bears with it this double
index: citizens and legionnaires, law and maneuvers. While jurists or philosophers were
seeking in the pact a primal model for the construction or reconstruction of the social
body, the soldiers and with them the technicians of discipline were elaborating
procedures for the individual and collective coercion of bodies." {3}

Given that modern technology has evolved in such a world of interacting economic,
political and military institutions, it should not come as a surprise that the history of
computers, computer networks, Artificial Intelligence and other components of
contemporary technology, is so thoroughly intertwined with military history. Here, as
before, we must carefully distinguish those influences which occurred during war-time
from those that took place in peace-time, since the former can be easily dismissed as
involving the military simply as a catalyst or stimulant, that is, an accelerator of a
process that would have occurred more slowly without its direct influence. The computer
itself may be an example of indirect influence. The basic concept, as everyone knows,
originated in a most esoteric area of the civilian world. In the 1930s, British mathematician Alan Turing created the basic concept of the computer in an attempt to solve some highly abstract questions in metamathematics. But, for that very reason, the Turing Machine, as his conceptual machine was called, was a long way from an actual,
working prototype. It was during World War II, when Turing was mobilized as part of the war effort to crack the Nazis' Enigma code, that, in the course of his intense participation in that operation, he was exposed to some of the practical obstacles blocking the way towards the creation of a real Turing Machine. On the other side of the Atlantic, John von Neumann also developed his own practical insights as to how to bring the Turing Machine to life, in the course of his participation in the Manhattan Project and other war-related operations.

In this case we may easily dismiss the role that the military played, arguing that without
the intensification and concentration of effort brought about by the war, the computer
would have developed on its own, perhaps at a slower pace. And I agree that this is
correct. On the other hand, many of the uses to which computers were put after the war
illustrate the other side of the story: a direct participation of military institutions in the
development of technology, a participation which actually shaped this technology in the
direction of uniformization, routinization and concentration of control. Perhaps the best
example of this other relation between the military and technology is the systems of
machine-part production known as Numerical Control methods. While the methods
developed in nineteenth-century arsenals, and later transferred to civilian enterprises, had
already increased uniformity and centralized control in the production of large quantities
of the same object (that is, mass production), this had left untouched those areas of
production which create relatively small batches of complex machine parts. Here the
skills of the machinist were still indispensable as late as World War II. During the 1950's,
the Air Force underwrote not only the research and development of a new system to get
rid of the machinist's skills, but also the development of software, the actual purchase of
machinery by contractors, and the training of operators and programmers. In a
contemporary Numerical Control system, after the engineer draws the parts that need to
be produced, the drawings themselves are converted into data and stored in cards or
electronically. From then on, all the operations that need to be performed (drilling, milling, lathing, boring, and so on) are carried out automatically by computer-controlled
machines. Unlike mass-production techniques, where this automatism was achieved at
the expense of flexibility, in Numerical Control systems a relatively simple change in
software (not hardware) is all that is needed to adapt the system for the production of a
new set of parts. Yet, the effects on the population of workers were very similar in both
cases: the replacement of flexible skills by rigid commands embodied in hardware or
software, and over time, the loss of those skills leading to a general process of worker
de-skilling, and consequently, to the loss of individual control of the production process.

The question in both cases is not the influence that the objects produced in militarized
factories may have on the civilian world. One could, for instance, argue that the support
of the canned food industry by Napoleon had a beneficial effect on society, and a similar
argument may be made for many objects developed under military influence. The
question, however, is not the transfer of objects, but the transfer of the production processes behind those objects, since these processes bring with them the entire control and command structure of the military. To quote historian David
Noble:

"The command imperative entailed direct control of production operations not just with a
single machine or within a single plant, but worldwide, via data links. The vision of the
architects of the [Numerical Control] revolution entailed much more than the automatic
machining of complex parts; it meant the elimination of human intervention -a shortening
of the chain of command - and the reduction of remaining people to unskilled, routine,
and closely regulated tasks." And he adds that Numerical Control is a "giant step in the
same direction [as the 19th. century drive for uniformity]; here management has the
capacity to bypass the worker and communicate directly to the machine via tapes or
direct computer link. The machine itself can thereafter pace and discipline the
worker." {4}

Let's pause for a moment and consider a possible objection to this analysis. One may
argue that the goal of withdrawing control from workers and transferring it to machines is
the essence of the capitalist system and that, if military institutions happened to be
involved, they did so by playing the role assigned to them by the capitalist system. The
problem with this reply is that, although it may satisfy a convinced Marxist, it is at odds
with much historical data gathered by this century's best economic historians. This data
shows that European societies, far from having evolved through a unilinear progression
of "modes of production" (feudalism, capitalism, socialism), actually exhibited a much
more complex, more heterogeneous coexistence of processes. In other words, as
historian Fernand Braudel has shown, as far back as the fourteenth and fifteenth centuries, institutions with the capability of exercising economic power (large banks, wholesalers, long-distance trade companies) were already in operation, and fully coexisted with feudal institutions as well as with economic institutions that did not have economic power, such as retailers and producers of humble goods. Indeed, Braudel shows that these complex coexistences of institutions of different types existed before
and after the Industrial Revolution, and suggests that the concept of a "capitalist
system" (where every aspect of society is connected into a functional whole) gives a
misleading picture of the real processes. What I am suggesting here is that we take
Braudel seriously, forget about our picture of history as divided into neat, internally
homogeneous eras or ages, and tackle the complex combinations of institutions
involved in real historical processes.

The models we create of these complex "institutional ecologies" should include military
organizations playing a large, relatively independent role, to reflect the historical data we
now have on several important cases, like fifteenth-century Venice, whose famous Arsenal was at the time the largest industrial complex in Europe, or eighteenth-century France and the nineteenth-century United States, with their military standardization of weapons production. Another important example involves the development of the modern corporation, particularly as it happened in the United States in the last century.

The first American big business was the railroad industry, which developed the
management techniques which many other large enterprises would adopt later on. This
much is well known. What is not so well known is that military engineers were deeply
involved in the creation of the first railroads and that they developed many of the
features of management which later on came to characterize just about every large
commercial enterprise in the United States, Europe and elsewhere. In the words of
historian Charles O'Connell:

"As the railroads evolved and expanded, they began to exhibit structural and procedural
characteristics that bore a remarkable resemblance to those of the Army. Both
organizations erected complicated management hierarchies to coordinate and control a
variety of functionally diverse, geographically separated corporate activities. Both
created specialized staff bureaus to provide a range of technical and logistical support
services. Both divided corporate authority and responsibility between line and staff
agencies and officers and then adopted elaborate written regulations that codified the
relationship between them. Both established formal guidelines to govern routine
activities and instituted standardized reporting and accounting procedures and forms to
provide corporate headquarters with detailed financial and operational information which
flowed along carefully defined lines of communication. As the railroads assumed these
characteristics, they became America's first 'big business'." {5}

Thus, the transfer of military practices to the civilian world influenced the lives not only of
workers, but of the managers themselves. And the influence did not stop with the
development of railroads. The "management science" which is today taught in business
schools is a development of military "operations research", a discipline created during
World War II to tackle a variety of tactical, strategic and logistic problems. And it was
the combination of this "science of centralization" and the availability of large computers
that, in turn, allowed the proliferation of transnational corporations and the consequent
internationalization of the standardization and routinization of production processes.
Much as skills were replaced by commands on the shop floor, so were prices replaced by commands at the management level. (This is one reason not to use the term "markets" when theorizing big business. Not only do they rely on commands instead of prices, they also manipulate demand and supply rather than being governed by them. Hence, Braudel
has suggested calling big business "anti-markets"). {6}

Keeping in mind the actual complexity of historical processes, as opposed to explaining everything by the "laws of capitalist development", is crucial not only to understand the
past, but also to intervene in the present and speculate about the future. This is
particularly clear when analyzing the role which computers and computer networks may
play in the shaping of the economic world in the coming century. It is easy to attribute
many of the problems we have today, particularly those related to centralized
surveillance and control, to computer technology. But to do this would not only artificially homogenize the history of computers (there are large differences between the
development of mainframes and minicomputers, on one hand, and the personal
computer, on the other), but it would obscure the fact that, if computers have come to play the "disciplinarian" roles they play today, it is as part of a historical process which is several centuries old, a process which computers have only intensified.

Another advantage of confronting the actual heterogeneity of historical processes, and of throwing the concept of "the capitalist system" into the garbage, is that we free
ourselves to look around for combinations of economic institutions which coexist with
disciplinarian anti-markets but do not play by the same rules. Historically, as Braudel
has shown, economic power since the fourteenth century has always been associated with large enterprises and their associated "economies of scale". Although technically this term applies only to mass-produced objects (economies of scale meaning the spreading of production costs among many identical products), we may use it in an extended way to denote any economic benefits to managers, merchants and financiers stemming from the scale of any economic resource. Coexisting with economies of scale
there are what are called "economies of agglomeration". These are economic benefits which small businesses enjoy from the concentration of many of them in a large city. These economies stem from the benefits of shop-talk, from unplanned connections and mutual enhancements, as well as from the services which grow around these concentrations, services which small businesses could not afford on their own.

To conclude this talk I would like to give one example, from the world of computers, of
two American industrial hinterlands which illustrate the difference between economies of
scale and of agglomeration: Silicon Valley in Northern California, and Route 128 near
Boston:

"Silicon Valley has a decentralized industrial system that is organized around regional
networks. Like firms in Japan, and parts of Germany and Italy, Silicon Valley companies
tend to draw on local knowledge and relationships to create new markets, products, and
applications. These specialist firms compete intensely while at the same time learning
from one another about changing markets and technologies. The region's dense social
networks and open labor markets encourage experimentation and entrepreneurship.
The boundaries within firms are porous, as are those between firms themselves and
between firms and local institutions such as trade associations and universities." {7}

The growth of this region owed very little to large financial flows from governmental and
military institutions. Silicon Valley did not develop so much by economies of scale, as by
the benefits derived from an agglomeration of visionary engineers, specialist consultants
and financial entrepreneurs. Engineers moved often from one firm to another,
developing loyalties to the craft and region's networks, not to the corporation. This
constant migration, plus an unusual practice of information sharing among the local producers, ensured that new formal and informal knowledge diffused rapidly through the
entire region. Business associations fostered collaboration between small and medium-
sized companies. Risk-taking and innovation were preferred to stability and routinization.
This, of course, does not mean that there were not large, routinized firms in Silicon
Valley, only that they did not dominate the mix. Not so in Route 128:

"While Silicon Valley producers of the 1970's were embedded in, and inseparable from,
intricate social and technical networks, the Route 128 region came to be dominated by a
small number of highly self-sufficient corporations. Consonant with New England's two
century old manufacturing tradition, Route 128 firms sought to preserve their
independence by internalizing a wide range of activities. As a result, secrecy and
corporate loyalty govern relations between firms and their customers, suppliers, and
competitors, reinforcing a regional culture of stability and self-reliance. Corporate
hierarchies insured that authority remains centralized and information flows vertically.
The boundaries between and within firms and between firms and local institutions thus
remain far more distinct." {8}

While before the recession of the 1980's both regions had been continuously expanding,
one on economies of scale and the other on economies of agglomeration (or rather,
mixtures dominated by one or the other), they both felt the full impact of the downturn. At
that point some large Silicon Valley firms, unaware of the dynamics behind the region's
success, began to switch to economies of scale, sending parts of their production to
other areas, and internalizing activities previously performed by smaller firms. Yet, unlike
Route 128, the intensification of routinization and internalization in Silicon Valley was not
a constitutive part of the region, which meant that the old meshwork system could be
revived. And this is, in fact, what happened. Silicon Valley's regional networks were re-
energized, through the birth of new firms in the old pattern, and the region has now
returned to its former dynamic state, unlike the command-heavy Route 128 which
continues to stagnate. What this shows is that, while both scale and agglomeration
economies, as forms of positive feedback, promote growth, only the latter endows firms
with the flexibility needed to cope with adverse economic conditions.

In conclusion I would like to repeat my call for more realistic models of economic history,
models involving the full complexity of the institutional ecologies involved, including
markets, anti-markets, military and bureaucratic institutions, and if we are to believe
Michel Foucault, schools, hospitals, prisons and many others. It is only through an
honest philosophical confrontation with our complex past that we can expect to
understand it and derive the lessons we may use when intervening in the present and
speculating about the future.

References:

{1} Excerpt from the text of the levee en mass of 1793, quoted in William H. McNeill. The
Pursuit of Power. Technology, Armed Force and Society since A.D. 1000. (University of
Chicago Press, 1982). p. 192

{2} ibid. p. 129

{3} Michel Foucault. Discipline and Punish. The Birth of Prison. (Vintage Books, New
York, 1979) p. 169

{4} David Noble. Command Performance: A Perspective on Military Enterprise and Technological Change. In Merrit Roe Smith ed. Military Enterprise. (MIT Press, 1987). p. 341 and 342.

{5} Charles F. O'Connell, Jr. The Corps of Engineers and the Rise of Modern
Management. In ibid. p. 88

{6} Fernand Braudel. The Wheels of Commerce. (Harper and Row, New York, 1986).
p.379

{7} Annalee Saxenian. Lessons from Silicon Valley. In Technology Review, Vol. 97, no. 5, p. 44

{8} ibid. p. 47


MARKETS AND ANTIMARKETS IN THE WORLD ECONOMY

by Manuel De Landa.

One of the most significant epistemological events in recent years is the growing importance of historical questions in the ongoing
reconceptualization of the hard sciences. I believe it is not an exaggeration
to say that in the last two or three decades, history has almost completely
infiltrated physics, chemistry and biology. It is true that nineteenth century
thermodynamics had already introduced an arrow of time into physics, and
hence the idea of irreversible historical processes. It is also true that the
theory of evolution had already shown that animals and plants were not
embodiments of eternal essences but piecemeal historical constructions,
slow accumulations of adaptive traits cemented together via reproductive
isolation. However, the classical versions of these two theories incorporated
a rather weak notion of history into their conceptual machinery: both
thermodynamics and Darwinism admitted only one possible historical
outcome, the reaching of thermal equilibrium or of the fittest design. In both
cases, once this point was reached, historical processes ceased to count.
For these theories, optimal design or optimal distribution of energy
represented, in a sense, an end of history.

Hence, it should come as no surprise that the current penetration of science by history has been the result of advances in these two disciplines. Ilya
Prigogine revolutionized thermodynamics in the 1960's by showing that the
classical results were only valid for closed systems where the overall
amounts of energy are always conserved. If one allows energy to flow in and
out of a system, the number and type of possible historical outcomes greatly increases. Instead of a unique and simple equilibrium, we now have multiple ones of varying complexity (static, periodic and chaotic attractors); and
moreover, when a system switches from one to another form of stability (at a
so-called bifurcation), minor fluctuations can be crucial in deciding the actual
form of the outcome. Hence, when we study a given physical system, we
need to know the specific nature of the fluctuations that have been present
at each of its bifurcations, in other words, we need to know its exact history
to understand its current dynamical form. {1}
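
To make these notions concrete, here is a minimal numerical sketch in Python (an illustrative aside, not one of the models discussed in this essay): the well-known logistic map, iterated for a few arbitrary values of its control parameter, settles into a static, a periodic, or a chaotic attractor, crossing bifurcations along the way.

    # Illustrative sketch only: the logistic map x -> r*x*(1-x), a textbook
    # nonlinear system, settles into different attractors as the control
    # parameter r is varied.  The parameter values below are arbitrary choices.

    def long_run_states(r, x0=0.3, transient=500, keep=8):
        # iterate past an initial transient, then record the states the map settles into
        x = x0
        for _ in range(transient):
            x = r * x * (1.0 - x)
        states = []
        for _ in range(keep):
            x = r * x * (1.0 - x)
            states.append(round(x, 4))
        return states

    for r in (2.8, 3.2, 3.5, 3.9):
        print(r, sorted(set(long_run_states(r))))
    # r = 2.8 -> a static attractor (a single fixed point)
    # r = 3.2 -> a periodic attractor (two alternating values)
    # r = 3.5 -> period four; r = 3.9 -> a chaotic attractor (many values)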

And what is true of physical systems is all the more so for biological ones.
Attractors and bifurcations are features of any system in which the dynamics
are nonlinear, that is, in which there are strong interactions between
variables. As biology begins to include these nonlinear dynamical
phenomena in its models, for example, in the case of evolutionary arms-
races between predators and prey, the notion of a "fittest design" loses its
meaning. In an arms-race there is no optimal solution fixed once and for all,
since the criterion of fitness itself changes with the dynamics. This is also
true for any adaptive trait whose value depends on how frequently it occurs in
a given population, as well as in cases like migration, where animal behavior
interacts nonlinearly with selection pressures. As the belief in a fixed
criterion of optimality disappears from biology, real historical processes
come to reassert themselves once more. {2}

Computers have played a crucial role in this process of infiltration. The nonlinear equations that go into these new historical models cannot be
solved by analytical methods alone, and so scientists need computers to
perform numerical simulations and discover the behavior of the solutions.
But perhaps the most crucial role of digital technology has been to allow a
switch from a purely analytic, top-down style of modeling, to a more
synthetic, bottom-up approach. In the growing discipline of Artificial Life, for
instance, an ecosystem is not modeled starting from the whole and
dissecting it into its component parts, but the other way around: one begins
at the bottom, with a population of virtual animals and plants and their local
interactions, and the ecosystem needs to emerge spontaneously from these
local dynamics. The basic idea is that the systematic properties of an
ecosystem arise from the interactions between its animal and plant
components, so that when one dissects the whole into parts the first thing we lose is any property due to these interactions. Analytical techniques, by their very nature, tend to kill emergent properties, that is, properties of the
whole that are more than the sum of its parts. Hence the need for a more
synthetic approach, in which everything systematic about a given whole is
modeled as a historically emergent result of local interactions. {3}
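
A toy example may help fix the idea. The Python sketch below is not an actual Artificial Life system, and its birth, death and predation rates are invented; it simply lets individual prey and predators act through local, stochastic events, and the coupled rise and fall of the two populations appears as a property of the whole that is written into none of the individual rules.

    # Bottom-up toy model: only individual birth, death and predation events.
    # All rates and initial numbers are illustrative assumptions.
    import random

    rabbits, foxes = 300, 50
    history = []
    for step in range(2000):
        # each rabbit may reproduce (crowding lowers the chance) or be eaten
        births = sum(1 for _ in range(rabbits)
                     if random.random() < 0.05 * max(0.0, 1 - rabbits / 800))
        eaten = sum(1 for _ in range(rabbits) if random.random() < 0.0005 * foxes)
        # each fox may reproduce (more prey, better chances) or die
        fox_births = sum(1 for _ in range(foxes) if random.random() < 0.0002 * rabbits)
        fox_deaths = sum(1 for _ in range(foxes) if random.random() < 0.02)
        rabbits = max(rabbits + births - eaten, 0)
        foxes = max(foxes + fox_births - fox_deaths, 0)
        history.append((rabbits, foxes))

    # the coupled rise and fall of the two populations is a whole-system property
    for t in range(0, 2000, 250):
        print(t, history[t])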

These new ideas are all the more important when we move on to the social
sciences, particularly economics. In this discipline, we tend to uncritically
assume systematicity, as when one talks of the "capitalist system", instead
of showing exactly how such systematic properties of the whole emerge
from concrete historical processes. Worse yet, we then tend to reify such
unaccounted-for systematicity, ascribing all kinds of causal powers to
capitalism, to the extent that a clever writer can make it seem as if anything
at all (from nonlinear dynamics itself to postmodernism or cyberculture) is
the product of late capitalism. This basic mistake, which is, I believe, a major
obstacle to a correct understanding of the nature of economic power, is
partly the result of the purely top-down, analytical style that has dominated
economic modeling from the eighteenth century. Both macroeconomics,
which begins at the top with concepts like gross national product, and
microeconomics, in which a system of preferences guides individual choice,
are purely analytical in approach. Neither the properties of a national
economy nor the ranked preferences of consumers are shown to emerge
from historical dynamics. Marxism, it is true, added to these models intermediate-scale phenomena, like class struggle, and with it conflictive dynamics. But the specific way in which it introduced conflict, via the labor theory of value, has now been shown by Sraffa to be redundant, added
from the top, so to speak, and not emerging from the bottom, from real
struggles over wages, or the length of the working day, or for control over
the production process. {4}

Besides a switch to a synthetic approach, as is happening, for instance, in the evolutionary economics of Nelson and Winter in which the emphasis is
on populations of organizations interacting nonlinearly, what we need here is
a return to the actual details of economic history. Much has been learned in
recent decades about these details, thanks to the work of materialist
historians like Fernand Braudel, and it is to this historical data that we must
turn to know what we need to model synthetically. Nowhere is this need for real history more evident than in the subject of the dynamics of economic
power, defined as the capability to manipulate the prices of inputs and
outputs of the production process as well as their supply and demand. In a
peasant market, or even in a small town local market, everybody involved is
a price taker: one shows up with merchandise, and sells it at the going
prices which reflect demand and supply. But monopolies and oligopolies are
price setters: the prices of their products need not reflect demand/supply
dynamics, but rather their own power to control a given market share. {5}

When approaching the subject of economic power, one can safely ignore
the entire field of linear mathematical economics (so-called competitive
equilibrium economics), since there monopolies and oligopolies are basically
ignored. Yet, even those thinkers who make economic power the center of
their models, introduce it in a way that ignores historical facts. Authors
writing in the Marxist tradition, place real history in a straight-jacket by
subordinating it to a model of a progressive succession of modes of
production. Capitalism itself is seen as maturing through a series of stages,
the latest one of which is the monopolistic stage in this century. Even non-
Marxists economists like Galbraith, agree that capitalism began as a
competitive pursuit and stayed that way till the end of the nineteenth
century, and only then it reached the monopolistic stage, at which point a
planning system replaced market dynamics.

However, Fernand Braudel has recently shown, with a wealth of historical data, that this picture is inherently wrong. Capitalism was, from its
beginnings in the Italy of the thirteenth century, always monopolistic and
oligopolistic. That is to say, the power of capitalism has always been
associated with large enterprises, large that is, relative to the size of the
markets where they operate. {6} Also, it has always been associated with
the ability to plan economic strategies and to control market dynamics, and
therefore, with a certain degree of centralization and hierarchy. Within the
limits of this presentation, I will not be able to review the historical evidence
that supports this extremely important hypothesis, but allow me at least to
extract some of the consequences that would follow if it turns out to be true.

First of all, if capitalism has always relied on non-competitive practices, if the prices for its commodities have never been objectively set by demand/supply dynamics, but imposed from above by powerful economic decision-makers, then capitalism and the market have always been different entities.
To use a term introduced by Braudel, capitalism has always been an
"antimarket". This, of course, would seem to go against the very meaning of
the word "capitalism", regardless of whether the word is used by Karl Marx
or Ronald Reagan. For both nineteenth century radicals and twentieth
century conservatives, capitalism is identified with an economy driven by
market forces, whether one finds this desirable or not. Today, for example,
one speaks of the former Soviet Union's "transition to a market economy",
even though what was really supposed to happen was a transition to an
antimarket: to large-scale enterprises, with several layers of managerial strata, in which prices are set, not taken. This conceptual confusion is so
entrenched that I believe the only solution is to abandon the term
"capitalism" completely, and to begin speaking of markets and antimarkets
and their dynamics.

This would have the added advantage that it would allow us to get rid of
historical theories framed in terms of stages of progress, and to recognize
the fact that antimarkets could have arisen anywhere, not just Europe, the
moment the flows of goods through markets reach a certain critical level of
intensity, so that organizations bent on manipulating these flows can
emerge. Hence, the birth of antimarkets in Europe has absolutely nothing to
do with a peculiarly European trait, such as rationality or a religious ethic of
thrift. As is well known today, Europe borrowed most of its economic and
accounting techniques, those techniques that are supposed to distinguish
her as uniquely rational, from Islam. {8}

Finally, and before we take a look at what a synthetic, bottom-up approach to the study of economic dynamics would be like, let me meet a possible
objection to these remarks: the idea that "real" capitalism did not emerge till
the nineteenth century industrial revolution, and hence that it could not have
arisen anywhere else where these specific conditions did not exist. To
criticize this position, Fernand Braudel has also shown that the idea that
capitalism goes through stages, first commercial, then industrial and finally
financial, is not supported by the available historical evidence. Venice in the
fourteenth century and Amsterdam in the seventeenth, to cite only two
examples, already show the coexistence of the three modes of capital in interaction. Moreover, other historians have recently shown that that specific
form of industrial production which we tend to identify as "truly capitalist",
that is, assembly-line mass production, was not born in economic
organizations, but in military ones, beginning in France in the eighteenth
century, and then in the United States in the nineteenth. It was military
arsenals and armories that gave birth to these particularly oppressive control
techniques of the production process, at least a hundred years before Henry
Ford and his Model-T cars. {10} Hence, the large firms that make up the antimarket can be seen as replicators, much as animals and plants are.
And in populations of such replicators we should be able to observe the
emergence of the different commercial forms, from the family firm, to the
limited liability partnership to the joint stock company. These three forms,
which had already emerged by the fifteenth century, must be seen as
arising, like those of animals and plants, from slow accumulations of traits
which later become consolidated into more or less permanent structures,
and not, of course, as the manifestation of some pre-existing essence. In
short, both animal and plant species as well as "institutional species" are
historical constructions, the emergence of which bottom-up models can help
us study.

It is important to emphasize that we are not dealing with biological metaphors here. Any kind of replicating system which produces variable
copies of itself, coupled with any kind of sorting device, is capable of
evolving new forms. This basic insight is now exploited technologically in the
so-called "genetic algorithm", which allows programmers to breed computer
software instead of painstakingly coding it by hand. A population of
computer programs is allowed to reproduce with some variation, and the
programmer plays the role of sorting device, steering the population towards
the desired form. The same idea is what makes Artificial Life projects work.
Hence, when we say that the forms the antimarket has taken are evolved
historical constructions we do not mean to say that they are metaphorically
like organic forms, but that they are produced by a process which embodies
the same engineering diagram as the one which generates organic forms.
Another example may help clarify this. When one says, as leftists used to
say, that "class-struggle is the motor of history", one is using the word
"motor" in a metaphorical way. On the other hand, to say that a hurricane is
a steam motor is not to use the term metaphorically, but literally: one is saying that the hurricane embodies the same engineering diagram as a steam motor: it uses a reservoir of heat and operates via differences of
temperature circulated through a Carnot cycle. The same is true of the
genetic algorithm. Anything that replicates, such as patterns of behavior
transmitted by imitation, or rules and norms transmitted by enforced
repetition can give rise to novel forms, when populations of them are
subjected to selection pressures. And the traits that are thus accumulated
can become consolidated into a permanent structure by codification, as
when informal routines become written rules. {11}
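
Since the genetic algorithm carries much of the argument here, a minimal sketch may be useful. The Python fragment below breeds simple bit strings rather than real programs, and its target pattern, population size and mutation rate are purely illustrative choices: the population reproduces with crossover and mutation, and a sorting step steers it toward the desired form.

    # Minimal genetic-algorithm sketch; all parameters are illustrative.
    import random

    TARGET = [1] * 20                 # the "desired form" chosen by the breeder
    POP_SIZE, MUTATION = 30, 0.02

    def fitness(individual):
        return sum(1 for a, b in zip(individual, TARGET) if a == b)

    def mutate(individual):
        return [bit ^ 1 if random.random() < MUTATION else bit for bit in individual]

    population = [[random.randint(0, 1) for _ in range(20)] for _ in range(POP_SIZE)]
    for generation in range(300):
        population.sort(key=fitness, reverse=True)   # the sorting device
        parents = population[:POP_SIZE // 2]         # the fitter half survives
        children = []
        while len(parents) + len(children) < POP_SIZE:
            a, b = random.sample(parents, 2)
            cut = random.randint(1, 19)              # reproduction with variation
            children.append(mutate(a[:cut] + b[cut:]))
        population = parents + children
        if fitness(population[0]) == len(TARGET):
            print("desired form reached at generation", generation)
            break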

In this case, we have the diagram of a process which generates hierarchical structures, whether large institutions rigidly controlled by their rules or
organic structures rigidly controlled by their genes. There are, however,
other structure-generating processes which result in decentralized
assemblages of heterogeneous components. Unlike a species, an
ecosystem is not controlled by a genetic program: it integrates a variety of
animals and plants in a food web, interlocking them together into what has
been called a "meshwork structure". The dynamics of such meshworks are
currently under intense investigation and something like their abstract
diagram is beginning to emerge. {12} From this research, it is becoming
increasingly clear that small markets, that is, local markets without too many
middlemen, embody this diagram: they allow the assemblage of human
beings by interlocking complementary demands. These markets are, indeed, self-organized, decentralized structures: they arise spontaneously without
the need for central planning. As dynamic entities they have absolutely
nothing to do with an "invisible hand", since models based on Adam Smith's
concept operate in a frictionless environment in which agents have perfect
rationality and all information flows freely. Yet, by eliminating nonlinearities,
these models preclude the spontaneous emergence of order, which
depends crucially on friction: delays, bottlenecks, imperfect decision-making
and so on.
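
The role of friction can be made concrete with a deliberately crude sketch (invented adjustment rates and starting prices, not a model taken from the literature cited here): buyers and sellers each carry a private price, meet at random, and adjust only on the basis of their own imperfect local experience, yet a rough going price emerges without any central auctioneer.

    # Decentralized price formation with friction; all numbers are illustrative.
    import random

    sellers = [random.uniform(5, 15) for _ in range(50)]   # private asking prices
    buyers = [random.uniform(5, 15) for _ in range(50)]    # private offer prices

    for week in range(200):
        random.shuffle(buyers)                              # random encounters
        for i in range(50):
            ask, bid = sellers[i], buyers[i]
            if bid >= ask:
                sellers[i] = ask * 1.02   # a sale: try asking a little more next time
                buyers[i] = bid * 0.98    # a purchase: try offering a little less
            else:
                sellers[i] = ask * 0.98   # no sale: lower the ask
                buyers[i] = bid * 1.02    # no purchase: raise the offer

    print(round(sum(sellers) / 50, 2), round(sum(buyers) / 50, 2))
    # the two averages end up close together: a "going price" no one decreed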

The concept of a meshwork can be applied not only to the area of exchange, but also to that of industrial production. Jane Jacobs has created
a theory of the dynamics of networks of small producers meshed together by
their interdependent functions, and has collected some historical evidence to
support her claims. The basic idea is that certain relatively backward cities in the past, Venice when it was still subordinated to Byzantium, or the network
New York-Boston-Philadelphia when still a supply zone for the British
empire, engage in what she calls import-substitution dynamics. Because of
their subordinated position, they must import most manufactured products,
and export raw materials. Yet, meshworks of small producers within the city,
by interlocking their skills can begin to replace those imports with local
production, which can then be exchanged with other backward cities. In the
process, new skills and new knowledge are generated, new products begin to be imported, which, in turn, become the raw materials for a new round of
import-substitution. Nonlinear computer simulations have been created of
this process, and they confirm Jacobs' intuition: a growing meshwork of
skills is a necessary condition for urban morphodynamics. The meshwork as
a whole is decentralized, and it does not grow by planning, but by a kind of
creative drift. {13}
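
The logic of import substitution can be caricatured in a few lines of Python. This is only a sketch of the idea, not one of the simulations just mentioned, and its probabilities are invented: the chance that some import is replaced by local production grows with the number of skills already present in the city, and every replacement adds a new skill, so a denser meshwork compounds faster.

    # Toy import-substitution dynamic; rates are illustrative assumptions.
    import random

    def grow_city(initial_skills, years=60, chance_per_pair=0.0005):
        skills, substitutions = initial_skills, 0
        for year in range(years):
            pairs = skills * (skills - 1) / 2          # possible skill combinations
            if random.random() < min(1.0, chance_per_pair * pairs):
                skills += 1                            # a new local trade appears
                substitutions += 1                     # one more import replaced
        return skills, substitutions

    random.seed(1)
    print(grow_city(initial_skills=5))    # a sparse meshwork barely takes off
    print(grow_city(initial_skills=30))   # a dense meshwork keeps compounding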

Of course, this dichotomy between command hierarchies and meshworks should not be taken too rigidly: in reality, once a market grows beyond a
certain size, it spontaneously generates a hierarchy of exchange, with
prestige goods at the top and elementary goods, like food, at the bottom.
Command structures, in turn, generate meshworks, as when hierarchical
organizations created the automobile and then a meshwork of services
(repair shops, gas stations, motels and so on) grew around it. {14} More
importantly, one should not romantically identify meshworks with that which
is "desirable" or "revolutionary", since there are situations when they
increase the power of hierarchies. For instance, oligopolistic competition
between large firms is sometimes kept away from price wars by the system
of interlocking directorates, in which representatives of large banks or
insurance companies sit on the boards of directors of these oligopolies. In
this case, a meshwork of hierarchies is almost equivalent to a monopoly.
{15} And yet, however complex the interaction between hierarchies and
meshworks, the distinction is real: the former create structures out of
elements sorted out into homogeneous ranks, the latter articulate
heterogeneous elements as such, without homogenization. A bottom-up
approach to economic modeling should represent institutions as varying
mixtures of command and market components, perhaps in the form of
combinations of negative feedback loops, which are homogenizing, and
positive feedback, which generates heterogeneity.
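
That last suggestion can be given a minimal numerical form (a sketch with invented coefficients): if each institution is reduced to a single number, a negative feedback loop pulls every one toward the current average, while a positive loop amplifies whatever already deviates from it; varying the weight of the two loops switches the population from homogenization to growing heterogeneity.

    # Mixture of homogenizing (negative) and amplifying (positive) feedback.
    # Coefficients are illustrative; each "institution" is reduced to one number.
    import random

    def run(mix, steps=200):
        # mix = weight of positive feedback; (1 - mix) = weight of negative feedback
        sizes = [random.uniform(0.9, 1.1) for _ in range(100)]
        for _ in range(steps):
            mean = sum(sizes) / len(sizes)
            sizes = [s + 0.05 * ((1 - mix) * (mean - s) + mix * (s - mean))
                     for s in sizes]
        return round(max(sizes) - min(sizes), 3)       # spread of the population

    random.seed(0)
    print(run(mix=0.1))   # mostly negative feedback: differences shrink away
    print(run(mix=0.9))   # mostly positive feedback: initial differences amplified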

What would one expect to emerge from such populations of more or less
centralized organizations and more or less decentralized markets? The
answer is, a world-economy, or a large zone of economic coherence. The
term, which should not be confused with that of a global economy, was
coined by Immanuel Wallerstein, and later adapted by Braudel so as not to
depend on a conception of history in terms of a unilineal progression of
modes of production. From Wallerstein Braudel takes the spatial definition of
a world-economy: an economically autonomous portion of the planet,
perhaps coexisting with other such regions, with a definite geographical
structure: a core of cities which dominate it, surrounded by yet other
economically active cities subordinated to the core and forming a middle
zone, and finally a periphery of completely exploited supply zones. The role
of core of the European world-economy has been historically played by
several cities: first Venice in the fourteenth century, followed by Antwerp and
Genoa in the fifteenth and sixteenth. Amsterdam then dominated it for the
next two centuries, followed by London and then New York. Today, we may
be witnessing the end of American supremacy and the role of core seems to
be moving to Tokyo. {16}

Interestingly, those cities which play the role of core seem to generate, in their populations of firms, very few large ones. For instance, when Venice
played this role, no large organizations emerged in it, even though they
already existed in nearby Florence. Does this contradict the thesis that
capitalism has always been monopolistic? I think not. What happens is that,
in this case, Venice as a whole played the role of a monopoly: it completely
controlled access to the spice and luxury markets in the Levant. Within
Venice, everything seemed like "free competition", and yet its rich merchants
enjoyed tremendous advantages over any foreign rival, whatever its size.
Perhaps this can help explain the impression classical economists had of a
competitive stage of capitalism: the times when the Dutch or the British advocated "free competition" internally were precisely those when their cities as a whole held a virtual monopoly on world trade.

World-economies, then, present a pattern of concentric circles around a center, defined by relations of subordination. Besides this spatial structure,
Wallerstein and Braudel add a temporal one: a world-economy expands and
contracts in a variety of rhythms of different lengths: from short term business cycles to longer term Kondratiev cycles which last approximately fifty years. While the domination by core cities gives a world-economy its
spatial unity, these cycles give it a temporal coherence: prices and wages
move in unison over the entire area. Prices are, of course, much higher at
the center than at the periphery, and this makes everything flow towards the
core: Venice, Amsterdam, London and New York, as they took their turn as
dominant centers, became "universal warehouses" where one could find any
product from anywhere in the world. And yet, while respecting these
differences, all prices moved up and down following these nonlinear
rhythms, affecting even those firms belonging to the antimarket, which
needed to consider those fluctuations when setting their own prices.

These self-organized patterns in time and space which define world-economies were first discovered in analytical studies of historical data. The
next step is to use synthetic techniques and create the conditions under
which they can emerge in our models. In fact, bottom-up computer
simulations of urban economics where spatial and temporal patterns
spontaneously emerge already exist. For example, Peter Allen has created
simulations of nonlinear urban dynamics as meshworks of interdependent
economic functions. Unlike earlier mathematical models of the distribution of
urban centers, which assumed perfect rationality on the part of economic
agents, and where spatial patterns resulted from the optimal use of some
resource such as transportation, here patterns emerge from a dynamic of
conflict and cooperation. As the flows of goods, services and people in and
out of these cities change, some urban centers grow while others decay.
Stable patterns of coexisting centers arise as bifurcations occur in the
growing city networks taking them from attractor to attractor. {17}

Something like Allen's approach would be useful to model one of the two
things that stitch world-economies together, according to Braudel: trade
circuits. However, to generate the actual spatial patterns that we observe in
the history of Europe, we need to include the creation of chains of
subordination among these cities, of hierarchies of dependencies besides
the meshworks of interdependencies. This would need the inclusion of
monopolies and oligopolies, growing out of each city's meshworks of small
producers and traders. We would also need to model the extensive
networks of merchants and bankers with which dominant cities invaded their surrounding urban centers, converting them into a middle zone at the service of the core. A dynamical system of trade circuits, animated by import-
substitution dynamics within each city, and networks of merchants extending
the reach of large firms of each city, may be able to give us some insight
into the real historical dynamics of the European economy. {18}

Bottom-up economic models which generate temporal patterns have also been created. One of the most complex simulations in this area is the
Systems Dynamics National Model at MIT. Unlike econometric simulations,
where one begins at the macroeconomic level, this one is built up from the
operating structure within corporations. Production processes within each
industrial sector are modeled in detail. The decision-making behind price
setting, for instance, is modeled using the know-how from real managers.
The model includes many nonlinearities normally dismissed in classical
economic models, like delays, bottlenecks and the inevitable friction due to
bounded rationality. The simulation was not created with the purpose of
confirming the existence of the Kondratiev wave, the fifty-two year cycle that
can be observed in the history of wholesale prices for at least two centuries.
In fact, the designers of the model were unaware of the literature on the
subject. Yet, when the simulation began to unfold, it reached a bifurcation
and a periodic attractor emerged in the system, which began pulsing to a
fifty year beat. The crucial element in this dynamics seems to be the capital
goods sector, the part of the industry that creates the machines that the rest
of the economy uses. Whenever an intense rise in global demand occurs,
firms need to expand and so need to order new machines. But when the
capital goods sector in turn expands to meet this demand it needs to order
from itself. This creates a positive feedback loop that pushes the system
towards a bifurcation. {19}
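
A drastically simplified version of that loop can be written down directly. The toy below is not the MIT model, and its delay, gain and demand figures are invented: the capital goods sector adjusts its capacity toward total demand, machines ordered now only come on line years later, and part of total demand consists of the orders the sector places with itself; the delayed, self-amplified adjustment then overshoots and undershoots in long waves instead of settling down.

    # Toy self-ordering loop with an acquisition delay; all parameters invented.
    DELAY = 8            # years between ordering machinery and having it on line
    GAIN = 0.12          # fraction of the perceived gap corrected each year
    EXTERNAL = 100.0     # yearly machine orders from the rest of the economy

    capacity = [80.0] * (DELAY + 1)             # the sector starts under-built
    for year in range(300):
        lagged = capacity[-1 - DELAY]           # only old decisions are producing now
        own_orders = 0.5 * (EXTERNAL - lagged)  # self-ordering amplifies the gap
        gap = (EXTERNAL + own_orders) - lagged
        capacity.append(capacity[-1] + GAIN * gap)

    print([round(c) for c in capacity[::15]])   # capacity pulses in slow, long waves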

Insights coming from running simulations like these can, in turn, be used to
build other simulations and to suggest directions for historical research to
follow. We can imagine parallel computers in the near future running
simulations combining all the insights from the ones we just discussed:
spatial networks of cities, breathing at different rhythms, and housing
evolving populations of organizations and meshworks of interdependent
skills. If power relations are included, monopolies and oligopolies will
emerge and we will be able to explore the genesis and evolution of the antimarket. If we include the interactions between different forms of organizations, then the relationships between economic and military
institutions may be studied. As Galbraith has pointed out, in today's
economy nothing goes more against the market, nothing is a better representative
of the planning system, as he calls it, than the military-industrial complex.
{20} But we would be wrong in thinking that this is a modern phenomenon,
something caused by "late capitalism".

In the first core of the European world-economy, thirteenth century Venice, the alliance between monopoly power and military might was already in
evidence. The Venetian arsenal, where all the merchant ships were built,
was the largest industrial complex of its time. We can think of these ships as
the fixed capital, the productive machinery of Venice, since they were used
to do all the trade that kept her powerful; but at the same time, they were
military machines used to enforce her monopolistic practices. {21} When the
turn of Amsterdam and London came to be the core, the famous Companies of the Indies with which they conquered the Asian world-economy, transforming
it into a periphery of Europe, were also hybrid military-economic institutions.
We have already mentioned the role that French armories and arsenals in
the eighteenth century, and American ones in the nineteenth, played in the
birth of mass production techniques. Frederick Taylor, the creator of the
modern system for the control of the labor process, learned his craft in military arsenals. That nineteenth-century radical economists did not understand this hybrid nature of the antimarket can be seen from the fact that Lenin himself welcomed Taylorism into revolutionary Russia as a progressive force, instead of seeing it for what it was: the imposition of a rigid command hierarchy on the workplace. {22}

Unlike these thinkers, we should include in our simulations all the institutional interactions that historians have uncovered, to correctly model
the hybrid economic-military structure of the antimarket. Perhaps by using
these synthetic models as tools of exploration, as intuition synthesizers, so
to speak, we will also be able to study the feasibility of counteracting the
growth of the antimarket by a proliferation of meshworks of small producers.
Multinational corporations, according to the influential theory of "transaction-
costs", grow by swallowing up meshworks, by internalizing markets either
through vertical or horizontal integration. {23} They can do this thanks to their enormous economic power (most of them are oligopolies), and to their
having access to intense economies of scale. However, meshworks of small
producers interconnected via computer networks could have access to
different, yet equally intense, economies of scale. A well-studied example is the
symbiotic collection of small textile firms that has emerged in an Italian
region between Bologna and Venice. The operation of a few centralized
textile corporations was broken down into a decentralized network of firms,
in which entrepreneurs replace managers and short runs of specialized
products replace large runs of mass-produced ones. Computer networks
allow these small firms to react flexibly to sudden shifts in demand, so that
no firm becomes overloaded while others sit idly with spare capacity. {24}
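
The kind of coordination being claimed for computer networks here is easy to sketch. The firm names, capacities and order sizes below are invented, and the routine illustrates only the principle, not the Italian network's actual software: each incoming order is routed to whichever firm currently has the most spare capacity.

    # Toy order-routing across a network of small firms; all numbers invented.
    import random

    capacity = {"firm_a": 10, "firm_b": 6, "firm_c": 8}   # weekly capacity (units)
    load = {name: 0 for name in capacity}

    random.seed(2)
    for _ in range(6):                                    # a sudden burst of orders
        size = random.randint(1, 2)
        # send the order to the firm with the most spare capacity right now
        chosen = max(capacity, key=lambda f: capacity[f] - load[f])
        load[chosen] += size

    for name in capacity:
        print(name, load[name], "of", capacity[name])
    # no firm is overloaded while another sits idle with spare capacity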

But more importantly, a growing pool of skills is thereby created, and because this pool has not been internalized by a large corporation, it can not
be taken away. Hence this region will not suffer the fate of so many
American company towns, which die after the corporation that feeds them
moves elsewhere. These self-organized reservoirs of skills also explain why
economic development cannot be exported to the third world via large
transfers of capital invested in dams or other large structures. Economic
development must emerge from within as meshworks of skills grow and
proliferate. {25} Computer networks are an important element here, since
the savings in coordination costs that multinational corporations achieve by
internalizing markets, can be enjoyed by small firms through the use of
decentralizing technology. Computers may also help us to create a new
approach to control within these small firms. The management approach
used by large corporations was in fact developed during World War II under
the name of Operations Research. Much as mass production techniques
effected a transfer of a command hierarchy from military arsenals to civilian
factories, management practices based on linear analysis carry with them
the centralizing tendencies of the military institutions where they were born.
Fresh approaches to these questions are now under development by
nonlinear scientists, in which the role of managers is not to impose
preconceived plans on workers, but to catalyze the emergence of
meshworks of decision-making processes among them. {26} Computers, in
the form of embedded intelligence in the buildings that house small firms,
can aid this catalytic process, allowing the firm's members to reach some
measure of self-organization. Although these efforts are in their infancy, they may one day play a crucial role in adding some heterogeneity to a world-
economy that's becoming increasingly homogenized.

FOOTNOTES:

{1} Ilya Prigogine and Isabelle Stengers. Order out of Chaos. (Bantam
Books, New York 1984). p.169.

{2} Stuart A. Kauffman. The Origins of Order: Self-Organization and Selection in Evolution. (Oxford Univ. Press, New York 1993) p.280

{3} Christopher G. Langton. Artificial Life. In C.G. Langton ed. Artificial Life.
(Addison-Wesley, 1989) p.2

{4} Geoff Hodgson. Critique of Wright 1: Labour and Profits. In Ian Steedman ed. The Value Controversy. (Verso, London 1981). p.93

{5} John Kenneth Galbraith. The New Industrial State. (Houghton Mifflin,
Boston 1978) p.24

{6} Fernand Braudel. Civilization and Capitalism, 15th-18th Century. Vol 2. (Harper and Row, New York 1982) p.229

{7} ibid. p.559-561

{8} William H. McNeill. The Pursuit of Power. (University of Chicago Press, 1982) p.49

{9} Merrit Roe Smith. Army Ordnance and the "American system" of Manufacturing, 1815-1861. In M.R. Smith ed. Military Enterprise and Technological Change. (MIT Press, 1987) p.47

{10} Richard Nelson and Sidney Winter. An Evolutionary Theory of Economic Change. (Belknap Press, Cambridge Mass 1982) p.98

{11} Richard Dawkins. The Selfish Gene. (Oxford University Press, New York 1989) ch.11

{12} Stuart Kauffman. The Evolution of Economic Webs. In Philip Anderson, Kenneth Arrow and David Pines eds. The Economy as an Evolving Complex System. (Addison-Wesley, 1988)

{13} Jane Jacobs. Cities and the Wealth of Nations. (Random House, New
York 1984) p.133

{14} The dichotomy Meshwork/Hierarchy is a special case of what Deleuze and Guattari call Smooth/Striated or Rhizome/Tree. Gilles Deleuze and Felix Guattari. 1440: The Smooth and the Striated. In A Thousand Plateaus. (University of Minnesota Press, Minneapolis 1987) ch.14

{15} John R. Munkirs and James I. Sturgeon. Oligopolistic Cooperation: Conceptual and Empirical Evidence of Market Structure Evolution. In Marc R. Tool and Warren J. Samuels eds. The Economy as a System of Power. (Transaction Press, New Brunswick 1989). p.343

{16} Fernand Braudel. op. cit. Vol 3. p.25-38

{17} Peter M. Allen. Self-Organization in the Urban System. In William C. Schieve and P.M. Allen eds. Self-Organization and Dissipative Structures: Applications in the Physical and the Social Sciences. (University of Texas, Austin 1982) p.136

{18} Fernand Braudel. op. cit. Vol 3. p.140-167

{19} J.D. Sterman. Nonlinear Dynamics in the World Economy: the Economic Long Wave. In Peter Christiansen and R.D. Parmentier eds. Structure, Coherence and Chaos in Dynamical Systems. (Manchester Univ. Press, Manchester 1989)

{20} John Galbraith. op. cit. p. 321

{21} Fernand Braudel. op. cit. Vol 2 p. 444

{22} Vladimir Lenin. The Immediate Tasks of the Soviet Government. Collected Works, Vol 27 (Moscow 1965).

{23} Jean-Francois Hennart. The Transaction Cost Theory of the Multinational Enterprise. In Christos Pitelis and Roger Sugden eds. The Nature of the Transnational Firm. (Routledge, London 1991).

{24} Thomas W. Malone and John F. Rockart. Computers, Networks and the Corporation. In Scientific American Vol 265, Number 3, p.131. Also: Jane Jacobs, op. cit. p.40; Fernand Braudel, op. cit. Vol 3, p.630

{25} Jane Jacobs. op. cit. p.148

{26} F. Malik and G. Probst. Evolutionary Management. In H. Ulrich and G. Probst eds. Self-Organization and Management of Social Systems. (Springer Verlag, Berlin 1984) p. 113


THE GEOLOGY OF MORALS

A Neo-Materialist Interpretation

by Manuel De Landa.

The distinction between institutions which emerge from centralized and decentralized decision-making by their human components has come to
occupy center-stage in several different contemporary philosophies.
Economist and Artificial Intelligence guru Herbert Simon, for example, views
bureaucracies and markets as the human institutions which best embody
these two conceptions of control. {1} Hierarchical institutions are the easiest
ones to analyze, since much of what happens within a bureaucracy in
planned by someone of higher rank, and the hierarchy as a whole has goals
and behaves in ways that are more or less consistent with those goals.
Markets, on the other hand, are tricky. Indeed, the term "market" needs to
be used with care because it has been greatly abused over the last century
by theorists on the left and the right. As Simon remarks, the term does not
refer to the world of corporations, whether monopolies or oligopolies, since
in these commercial institutions decision-making is highly centralized, and
prices are set by command.

I would indeed limit the sense of the term even more, to refer exclusively to
those weekly gatherings of people at a predefined place in town, and not to
a dispersed set of consumers catered to by a system of middlemen (as when
one speaks of the "market" for personal computers). The reason is that, as
historian Fernand Braudel has made clear, it is only in markets in the first
sense that we have any idea of what the dynamics of price formation are. In
other words, it is only in peasant and small-town markets that decentralized
decision-making leads to prices setting themselves up in a way that we can
understand. In any other type of market economists simply assume that
supply and demand connect to each other in a functional way, but they do
not give us any specific dynamics through which this connection is effected.
{2} Moreover, unlike the idealized version of markets guided by an "invisible
hand" to achieve an optimal allocation of resources, real markets are not in
any sense optimal. Indeed, like most decentralized, self-organized
structures, they are only viable, and since they are not hierarchical they
have no goals, and grow and develop mostly by drift.

Herbert Simon's distinction between command hierarchies and
decentralized markets may turn out to be a special case of a more general
dichotomy. In the view of philosophers Gilles Deleuze and Felix Guattari,
these more abstract classes, which they call strata and self-consistent
aggregates (or trees and rhizomes), are defined not so much by the locus of
control as by the nature of the elements that are connected together. Strata
are composed of homogenous elements, whereas self-consistent
aggregates, or to use the term I prefer, meshworks, articulate
heterogeneous elements as such. {3} For example, a military hierarchy
allocates people into internally homogenous ranks before joining them
together through a chain of command. Markets, on the other hand, allow for
a set of heterogeneous needs and offers to become articulated through the
price mechanism, without reducing their diversity.

As both Simon and Deleuze and Guattari emphasize, the dichotomy
between hierarchies and markets, or more generally, between strata and
meshworks, should be understood in purely relative terms. In the first place,
in reality it is hard to find pure cases of these two structures: even the most
goal-oriented organization will still show some drift in its growth and
development, and most markets even in small towns contain some
hierarchical elements, even if it is just the local wholesaler which
manipulates prices by dumping (or withdrawing) large amounts of a product
on (or from) the market. Moreover, hierarchies give rise to meshworks and
meshworks to hierarchies. Thus, when several bureaucracies coexist
(governmental, academic, ecclesiastic), and in the absence of a super-
hierarchy to coordinate their interactions, the whole set of institutions will
tend to form a meshwork of hierarchies, articulated mostly through local and
temporary links. Similarly, as local markets grow in size, as in those gigantic
fairs which have taken place periodically since the Middle Ages, they give
rise to commercial hierarchies, with a money market on top, a luxury goods
market underneath and, after several layers, a grain market at the bottom. A
real society, then, is made of complex and changing mixtures of these two
types of structure, and only in a few cases will it be easy to decide to what
type a given institution belongs.

The dichotomy between strata and meshworks can be usefully applied in a
wide variety of contexts. For instance, animal species may be considered
biological instantiations of a stratified structure while ecosystems may be
treated as meshworks. This raises the question of whether some (or most)
of the applications of these terms are purely metaphorical. There is, no
doubt, some element of metaphor in my use of the terms, but behind the
appearance of linguistic analogy there are, I believe, common physical
processes at work in the formation of real meshworks and strata which make
all the different usages of the terms quite literal. These common processes
cannot be captured through linguistic representations alone and we need to
move to the realm of engineering diagrams to specify them.

Perhaps a concrete example will help clarify this rather crucial point. When
we say (as Marxists used to say) that "class struggle is the motor of history"
we are using the word "motor" in a purely metaphorical sense. However,
when we say that "a hurricane is a steam motor" we are not simply making a
linguistic analogy: rather we are saying that hurricanes embody the same
diagram used by engineers to build steam motors, that is, that it contains a
reservoir of heat, that it operates via thermal differences and that it
circulates energy and materials through a (so-called) Carnot cycle. Deleuze
and Guattari use the term "abstract machine" to refer to this diagram shared
by very different physical assemblages. Thus, there would be an "abstract
motor" with different physical instantiations in technological objects and
natural atmospheric processes.

What I would like to argue here is that there are also abstract machines
behind the structure-generating processes which yield as historical products
specific meshworks and hierarchies. Let us begin by discussing the case of
hierarchical structures, and in particular, of social strata (classes, castes).
The term "social stratum" itself is clearly a metaphor, involving the idea that
just as geological strata are layers of rocky materials stacked on top of each
other so classes and castes are like layers of human materials in which
some are higher and some lower. Is it possible to go beyond metaphor and
show that the genesis of both geological and social strata involves the same
engineering diagram? Geological strata (accumulations of sedimentary
rocks like sandstone or limestone) are created through a process involving
(at least) two distinct operations. When one looks closely at the layers of
rock in an exposed mountain side, one striking characteristic is that each
layer contains further layers, each composed of small pebbles which are
nearly homogenous with respect to size, shape and chemical composition.
Since pebbles in nature do not come in standard sizes and shapes, some
kind of sorting mechanism needs to be involved here, some specific device
to take a multiplicity of pebbles of heterogeneous qualities and distribute
them into more or less uniform layers.

Geologists have uncovered one such mechanism: rivers acting as veritable
hydraulic computers (or at least, sorting machines). Rivers transport rocky
materials from their point of origin (a previously created mountain subject to
erosion or weathering) to the place in the ocean where these materials will
accumulate. In this process, pebbles of variable size, weight and shape tend
to react differently to the water transporting them. These different reactions
to moving water are what sorts the pebbles, with the small ones reaching
the ocean sooner than the large ones, for example. Once the raw materials
have been sorted out into more or less homogenous groupings deposited at
the bottom of the sea (that is, once they have become sedimented ), a
second operation is necessary to transform these loose collections of
pebbles into an entity of a higher scale: a sedimentary rock. This operation
consists in cementing the sorted components together into a new entity with
emergent properties of its own, that is, properties such as overall strength
and permeability that cannot be ascribed to the sum of the individual
pebbles. This second operation is carried out by certain substances
dissolved in water (such as silica or hematite in the case of sandstones)
which penetrate the sediment through the pores between pebbles. As this
percolating solution crystallizes, it consolidates the pebbles' temporary
spatial relations into a more or less permanent "architectonic" structure. {4}
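
This double operation is simple enough to sketch in code. The toy
simulation below is my own illustration rather than anything drawn from the
geological literature (the pebble sizes, the number of layers and the
"strength" formula are all invented for the example): it first sorts a
heterogeneous population of pebbles into roughly uniform layers, and then
"cements" each layer into a new higher-scale entity with a property that
belongs to the layer as a whole rather than to any individual pebble.

    import random

    random.seed(1)

    # A heterogeneous multiplicity of "pebbles" (sizes in arbitrary units).
    pebbles = [random.uniform(0.1, 10.0) for _ in range(1000)]

    # First articulation (sedimentation): a sorting device distributes the
    # pebbles into more or less homogeneous layers by size.
    def sort_into_layers(pebbles, n_layers=5):
        ordered = sorted(pebbles)
        size = len(ordered) // n_layers
        return [ordered[i * size:(i + 1) * size] for i in range(n_layers)]

    # Second articulation (cementation): each sorted layer is consolidated
    # into a new entity with properties of its own. The "strength" formula is
    # a made-up stand-in for such an emergent property: the more uniform the
    # layer, the better it binds.
    def cement(layer):
        return {"mean_size": sum(layer) / len(layer),
                "strength": 1.0 / (1.0 + max(layer) - min(layer))}

    rocks = [cement(layer) for layer in sort_into_layers(pebbles)]
    for rock in rocks:
        print(rock)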

Thus, a double operation, a "double articulation" gets us from structures at
one scale to structures at another scale. Deleuze and Guattari call these two
operations "content" and "expression" , and warn us against confusing them
with the old philosophical distinction between "substances" and "forms". The
reason is that each one of the two articulations involves substances and
forms: sedimentation is not just about accumulating pebbles (substance) but
also about sorting them into uniform layers (form); while consolidation not
only effects new architectonic couplings between pebbles (form) but also
yields a new entity, a sedimentary rock (substance). Moreover, these new
entities may themselves accumulate and sort (as in the alternating layers of
schist and sandstone that make up Alpine mountains) and become
consolidated when tectonic forces cause the accumulated layers of rock to
fold and become a higher scale entity, a mountain. {5}

In the model proposed by Deleuze and Guattari these two operations
constitute an engineering diagram and therefore we can expect to find this
"abstract machine of stratification" not only in the world of geology, but also
in the organic and human worlds. For example, according to neo-Darwinism
species form through the slow accumulation of genetic materials, and of the
adaptive anatomical and behavioral traits that those genetic materials yield
when combined with nonlinear dynamical processes (such as the interaction
of cells during the development of an embryo). Genes, of course, do not
merely deposit at random but are sorted out by a variety of selection
pressures which include climate, the action of predators and parasites and
the effects of male or female choice during mating. Thus, in a very real
sense, genetic materials "sediment" just as pebbles do, even if the nonlinear
dynamical system which performs the sorting operation is completely
different in detail. Furthermore, these loose collections of genes can (like
accumulated sand) be lost under some drastically changed conditions (such
as the onset of an Ice age) unless they become consolidated together. This
second operation is performed by "reproductive isolation": when a given
subset of a population becomes incapable of mating with the rest (or as in
the case of horses and donkeys, when their offspring are sterile).
Reproductive isolation acts as a "ratchet mechanism" which conserves the
accumulated adaptation and makes it impossible for a given population to
"de-evolve" all the way back to unicellular organisms. Through selective
accumulation and isolative consolidation, individual animals and plants
come to form a higher scale entity: a new species. {6}

We can also find these two operations (and hence, this abstract diagram) in
the formation of social classes. We talk of "social strata" whenever a given
society presents a variety of differentiated roles to which not everyone has
equal access, and when a subset of those roles (i.e. those to which a ruling
elite alone has access) involves the control of key energetic and material
resources. While role differentiation may be a spontaneous effect of an
intensification in the flow of energy through society (e.g. as when a Big Man
in pre-State societies acts as an intensifier of agricultural production), the
sorting of those roles into ranks along a scale of prestige involves specific
group dynamics. In one model, for instance, members of a group who have
acquired preferential access to some roles begin to acquire the power to
control further access to them, and within these dominant groups criteria for
sorting the rest of society into sub-groups begin to crystallize. "It is from
such crystallization of differential evaluation criteria and status positions that
some specific manifestations of stratification and status differences -such as
segregating the life-styles of different strata, the process of mobility between
them, the steepness of the stratificational hierarchies, some types of stratum
consciousness, as well as the degree and intensity of strata conflict-
develops in different societies." {7}

However, even though most societies develop some rankings of this type,
not in all of them do they become an autonomous dimension of social
organization. In many societies differentiation of the elites is not extensive
(they do not form a center while the rest of the population forms an excluded
periphery), surpluses do not accumulate (they may be destroyed in ritual
feasts), and primordial relations (of kin and local alliances) tend to prevail.
Hence a second operation is necessary beyond the mere sorting of people
into ranks for social classes or castes to become a separate entity: the
informal sorting criteria need to be given a theological interpretation and a
legal definition, and the elites need to become the guardians and bearers of
the newly institutionalized tradition, that is, the legitimizers of change and
delineators of the limits of innovation. In short, to transform a loose
accumulation of traditional roles (and criteria of access to those roles) into a
social class, the latter needs to become consolidated via theological and
legal codification. {8}

My main point can then be stated as follows: sedimentary rocks, species
and social classes (and other institutionalized hierarchies) are all historical
constructions, the product of definite structure-generating processes which
take as their starting point a heterogeneous collection of raw materials
(pebbles, genes, roles), homogenize them through a sorting operation and
then give the resulting uniform groupings a more permanent state through
some form of consolidation. Hence, while some elements remain different
(e.g. only human institutions, and perhaps biological species, involve a
hierarchy of command) others stay the same: the articulation of
homogenous components into higher-scale entities. (And all this, without
metaphor).

What about meshworks? Deleuze and Guattari also offer a hypothetical
diagram for this type of structure, but its elements are not as straightforward
as those involved in the formation of strata. For this reason I will begin the
description of this other abstract machine with some remarks about what
mathematical and computer models of meshworks have revealed about their
formation and behavior, and then attempt to derive their engineering
diagram. Perhaps the best studied type of meshwork is the so-called
"autocatalytic loop", a closed chain of chemical processes involving not only
self-stimulation but also self-maintenance, that is, interconnecting a series of
mutually-stimulating pairs into a structure which reproduces as a whole: a
product that accumulates due to the catalytic acceleration of one chemical
reaction serves as the catalyst for yet another reaction which, in turn,
generates a product which catalyses the first one. Hence, the loop becomes
self-sustaining as long as its environment contains enough raw materials for
the chemical reactions to proceed.
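
A minimal numerical sketch makes the idea of an endogenously generated
rhythm concrete. The model below is the textbook "Brusselator", a standard
autocatalytic reaction scheme chosen here purely as an illustration (it is not
a model taken from Varela and Maturana, and the parameter values are the
conventional ones): integrated forward in time, the two concentrations settle
onto a self-sustained oscillation, a chemical clock, for as long as the inflow
of raw materials represented by the constants a and b is maintained.

    # Brusselator: an autocatalytic scheme whose two concentrations x and y
    # fall onto a limit cycle, an attractor generated from within.
    #   dx/dt = a + x*x*y - (b + 1)*x
    #   dy/dt = b*x - x*x*y
    a, b = 1.0, 3.0          # stand-ins for the steady supply of raw materials
    x, y = 1.0, 1.0          # initial concentrations
    dt = 0.001

    for step in range(200_000):
        dx = a + x * x * y - (b + 1.0) * x
        dy = b * x - x * x * y
        x += dx * dt
        y += dy * dt
        if step % 20_000 == 0:
            print(f"t = {step * dt:6.1f}   x = {x:5.2f}   y = {y:5.2f}")

With a = 1, lowering b below 2 kills the oscillation: the clock is a property
of the parameter regime, not of any external pacemaker.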

Francisco Varela and Humberto Maturana, pioneers in the study of
autocatalytic loops (e.g. their theory of "autopoiesis"), mention two general
characteristics of these closed circuits: they are dynamical systems which
endogenously generate their own stable states (called "attractors" or
"eigenstates"), and they grow and evolve by drift. {9} Examples of the first
characteristic are certain chemical reactions involving autocatalysis (as
well as cross-catalysis) which function as veritable "chemical clocks", in
which the accumulations of materials from the reactions alternate with each
other at perfectly regular intervals. This rhythmic behavior is not imposed on
the system from the outside but generated spontaneously from within (via an
attractor). {10}

The second characteristic mentioned by Varela and Maturana, growth by
drift, may be explained as follows. In the simplest autocatalytic loops there
are only two reactions, each producing a catalyst for the other. But once this
basic two-node network establishes itself, new nodes may insert themselves
into the mesh as long as they do not jeopardize its internal consistency.
Thus, a new chemical reaction may appear (using previously neglected raw
materials or even waste products from the original loop) which catalyses one
of the original ones and is catalyzed by the other, so that the loop now
becomes a three-node network. The meshwork has now grown, but in a
direction which is, for all practical purposes, "unplanned". A new node
(which just happens to satisfy some internal consistency requirements) is
added and the loop complexifies, yet precisely because the only constraints
were internal, the complexification does not take place in order for the loop
as a whole to meet some external demand (such as adapting to a specific
situation). The surrounding environment, as a source of raw materials,
certainly constrains the growth of the meshwork, but more in a proscriptive
way (what not to do) than in a prescriptive one (what to do). {11}
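
Growth by drift can be caricatured in a few lines of code. In the toy sketch
below, which is my own construction with an arbitrary consistency test
(nothing in it comes from Varela's or Maturana's work), candidate reactions
are proposed blindly, and a newcomer is accepted only if it hooks into the
existing loop, being catalyzed by one node while catalyzing another. The
meshwork complexifies, but in no externally prescribed direction.

    import random

    random.seed(2)

    # Nodes are reactions; the dictionary records who catalyzes whom.
    # The minimal loop: reaction 0 catalyzes 1, and 1 catalyzes 0.
    catalyzes = {0: {1}, 1: {0}}

    for _ in range(50):
        # A blindly proposed newcomer: it would be catalyzed by one existing
        # node (helper) and would catalyze another (target).
        helper = random.choice(list(catalyzes))
        target = random.choice(list(catalyzes))
        # The only test is a purely internal, local one; the candidate is not
        # required to serve any external purpose or plan.
        if helper != target:
            new = max(catalyzes) + 1
            catalyzes[helper].add(new)   # an existing node catalyzes the newcomer
            catalyzes[new] = {target}    # the newcomer catalyzes an existing node

    print(f"the loop drifted from 2 nodes to {len(catalyzes)} nodes, unplanned")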

The question now is whether from these and other empirical studies of
meshwork behavior we can derive a structure-generating process which is
abstract enough to operate in the worlds of geology, biology and human
society. In the model proposed by Deleuze and Guattari, there are three
elements in this diagram. First, a set of heterogeneous elements is brought
together via an articulation of superpositions, that is, an interconnection of
diverse but overlapping elements. (In the case of autocatalytic loops, the
nodes in the circuit are joined to each other by their functional
complementarities). Second, a special class of operators, or intercallary
elements, is needed to effect this interlock via local connections. (In our
case, this is the role played by catalysts, inserting themselves between two
other chemical substances to facilitate their interaction). Finally, the
interlocked heterogeneities must be capable of endogenously generating
stable patterns of behavior (for example, patterns at regular temporal or
spatial intervals.) {12} Is it possible to find instances of these three elements
in all different spheres of reality?

Besides the sedimentary type there exists another great class of rocks
called "igneous rocks" (such as granite) which are the outcome of a radically
different process of construction. Granite forms directly out of a cooling
magma, a viscous fluid made out of a diversity of molten materials. Each of
these liquid components has a different threshold of crystallization, that is,
each undergoes the bifurcation towards the solid state at a different critical
point in temperature. This means that as the magma cools down its different
elements will separate as they crystallize in sequence, those that solidify
earlier serving as containers for those which acquire a crystal form later. In
these circumstances the result is a complex set of heterogeneous crystals
which interlock with one another, and this is what gives granite its superior
strength. {13}
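
The same process can be restated as a short cooling-schedule sketch, a toy
of my own in which the mineral names are real enough but the temperatures
are merely illustrative numbers, not measured data: as the temperature
drops, each component crosses its own threshold of crystallization in turn,
and those which solidify first constrain, as containers, the growth of those
which solidify later.

    # A toy cooling magma. Each molten component has its own threshold of
    # crystallization (temperatures are illustrative, not mineralogical data).
    thresholds = {"feldspar": 850, "mica": 800, "quartz": 700}   # degrees C

    temperature = 1000.0
    solidified = []                           # order of solidification

    while temperature > 600:
        temperature -= 5                      # slow cooling
        for mineral, critical_point in thresholds.items():
            if mineral not in solidified and temperature <= critical_point:
                solidified.append(mineral)    # bifurcation to the solid state

    # Earlier crystals act as containers for later ones, so the result is an
    # interlocking of heterogeneous elements rather than a sorted, uniform layer.
    for earlier, later in zip(solidified, solidified[1:]):
        print(f"{earlier} solidified first and serves as a container for {later}")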

The second element in the diagram, intercallary elements, must be defined
more generally than just as catalytic substances, to include anything which
brings about local articulations from within, "densifications, intensifications,
reinforcements, injections, showerings, like so many intercallary events".
{14} The reactions between liquid magma and the walls of an already
crystallized component, nucleation events within the liquid which initiate the
next crystallization, and even certain "defects" inside the crystals (called
"dislocations") which promote growth from within, are all examples of
intercallary elements. Finally, chemical reactions within the magma may also
generate endogenous stable states.

When a reaction like the one involved in chemical clocks is not stirred, the
temporal intervals generated become spatial intervals, forming beautiful
spiral and concentric circle patterns which sometimes can be observed in
frozen form in some igneous rocks. {15}

Thus, granite (as much as a fully formed autocatalytic loop) is an instance of
a meshwork, or in the terms used by Deleuze and Guattari, of a self-
consistent aggregate. Unlike Varela and Maturana, for whom this quality of
self-consistency exists only in the biological and linguistic worlds, for
Deleuze and Guattari "consistency, far from being restricted to complex life
forms, fully pertains even to the most elementary atoms and particles". {16}
Therefore we may say that much as hierarchies (organic or social) are
special cases of a more abstract class, "strata", so autocatalytic loops are
special cases of "self-consistent aggregates". And much as strata are
defined as an articulation of homogenous elements (and do not involve
more specific features of hierarchies such as having a chain of command),
so self-consistent aggregates are defined by their articulation of
heterogeneous elements and do not necessarily involve other, less general,
features (such as growth by drift or internal autonomy). Let us now give
some biological and cultural examples of the way in which the diverse may
be articulated as such via self-consistency.

As I just mentioned, a species (or more precisely, the gene pool of a
species) is a prime example of an organic stratified structure. Similarly, an
ecosystem represents a biological realization of a self-consistent aggregate.
While a species may be a very homogenous structure (especially if selection
pressures have driven many genes to fixation) an ecosystem links together
a wide variety of heterogeneous elements (animals and plants of different
species) which are articulated through interlock, that is, by their functional
complementarities. Given that the main feature of an ecosystem is the
circulation of energy and matter in the form of food, the complementarities in
question are alimentary: prey-predator or parasite-host are two of the most
common functional couplings that make up food webs. In this situation,
symbiotic relations can act as intercallary elements aiding the process of
building food webs, with the most obvious example being the bacteria that
live in the guts of many animals and which allow those animals to digest
their food. Since food webs also display endogenously generated stable
states, all three components of the abstract diagram seem to be realized in
this case.
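
The simplest such coupling, a single prey-predator pair, already shows an
endogenously generated rhythm. The sketch below integrates the classical
Lotka-Volterra equations, a standard textbook model used here only as an
illustration of one functional complementarity in a food web (the coefficients
and initial populations are arbitrary): neither population is driven from
outside, yet their numbers cycle.

    # Lotka-Volterra prey-predator coupling (illustrative coefficients only).
    #   d(prey)/dt     =  1.0*prey - 0.5*prey*predator
    #   d(predator)/dt = -1.0*predator + 0.2*prey*predator
    prey, predator = 10.0, 1.0
    dt = 0.001

    for step in range(60_000):
        d_prey = 1.0 * prey - 0.5 * prey * predator
        d_pred = -1.0 * predator + 0.2 * prey * predator
        prey += d_prey * dt
        predator += d_pred * dt
        if step % 6_000 == 0:
            print(f"t = {step * dt:5.1f}   prey = {prey:7.2f}   "
                  f"predator = {predator:6.2f}")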

I have already mentioned that markets may be considered examples of
cultural meshworks. In many cultures weekly markets have traditionally
been the meeting place for people with heterogeneous needs and offers.
Matching complementary demands (that is, interlocking these people
together by their needs and offers) is an operation which is performed
automatically by the price mechanism. (Prices transmit not only information
about the relative monetary value of different products, but also incentives to
buy and sell). All that is needed for this automatic mechanism to work is that
prices drop in the face of an excess supply and that quantities produced and
offered decline when prices are lowered.
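
Read as an algorithm, that adjustment rule fits in a few lines. The sketch
below is a deliberately crude caricature of price formation (the demand and
supply curves, the starting price and the adjustment factor are all made-up
numbers, not an empirical model): the price falls when supply exceeds
demand, rises in the opposite case, and the quantities offered follow the
price.

    # Decentralized price formation as a simple feedback loop (toy numbers).
    def demand(p):              # buyers want less as the price rises
        return max(0.0, 100.0 - 4.0 * p)

    def supply(p):              # sellers offer less when the price drops
        return 6.0 * p

    price = 15.0
    for week in range(10):                      # one market meeting per week
        excess = supply(price) - demand(price)
        price -= 0.05 * excess                  # prices drop under excess supply
        print(f"week {week:2d}: price = {price:5.2f}   "
              f"supply = {supply(price):6.1f}   demand = {demand(price):6.1f}")

With these made-up numbers the price settles near 10, where the quantities
supplied and demanded meet; no one computes that value in advance.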

Of course, for this to work prices must set themselves, and therefore we
must imagine that there is not a wholesaler in town who can manipulate
prices by dumping (or hoarding) large amounts of a given product into the
market. In the absence of price manipulation, money (even primitive money
such as salt, shells or cigarettes) performs the function of intercallary
element: while with pure barter the possibility of two exactly matching
demands meeting by chance is very low, with money those chance
encounters become unnecessary, and complementary demands may find
each other at a distance, so to speak. Finally, markets also seem to
generate endogenous stable states, particularly when commercial towns
form trading circuits, as can be seen in the cyclic behavior of their prices.
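
The difference money makes can be estimated with a back-of-the-envelope
simulation, again a toy of my own with an arbitrary number of goods: under
pure barter a trade requires a double coincidence of wants, which becomes
rare as the diversity of products grows, while with money only one side of
the coincidence is needed.

    import random

    random.seed(3)
    GOODS = list(range(20))           # twenty heterogeneous kinds of products

    def match_rate(use_money, trials=100_000):
        """How often can two randomly met traders strike a deal?"""
        trades = 0
        for _ in range(trials):
            a_has, a_wants = random.sample(GOODS, 2)
            b_has, b_wants = random.sample(GOODS, 2)
            if use_money:
                # Money as intercallary element: B only needs to want what A sells.
                trades += (b_wants == a_has)
            else:
                # Pure barter: an exact double coincidence of wants is required.
                trades += (b_wants == a_has and a_wants == b_has)
        return trades / trials

    print("barter match rate:", match_rate(use_money=False))
    print("money  match rate:", match_rate(use_money=True))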

Thus, much as sedimentary rocks, biological species and social hierarchies
are all particular cases of a stratified system (that is, they are all historical
products of a process of double articulation), so igneous rocks, ecosystems
and markets are self-consistent aggregates (or meshworks), the result of the
coming together and interlocking of heterogeneous elements. This
conception of very specific abstract machines governing a variety of
structure-generating processes not only blurs the distinction between the
natural and the artificial, but also that between the living and the inert. It
indeed points towards a new form of materialist philosophy in which raw
matter-energy, through a variety of self-organizing processes and an intense
power of morphogenesis, generates all the structures that surround us.
Furthermore, the structures generated cease to be the primary reality, and
matter-energy flows now acquire this special status.

From the point of view of the nonlinear dynamics of our planet, the thin rocky
crust on which we live and which we call our land and home is perhaps its
least important component. Indeed, if we waited long enough, if we could
observe planetary dynamics at geological time scales, the rocks and
mountains which define the most stable and durable traits of our reality
would dissolve into the great underground lava flows of which they are but
temporary hardenings. Indeed, given that it is just a matter of time for any
one rock or mountain to be reabsorbed into the self-organized flows of lava
driving the dynamics of the lithosphere, these geological structures
represent a local slowing-down in this flowing reality. It is almost as if every
part of the mineral world could be defined by specifying its chemical
composition and its speed of flow: very slow for rocks, faster for lava.

Similarly, our individual bodies and minds are mere coagulations or
decelerations in the flows of biomass, genes, memes and norms. Here too
we would be defined both by the materials we are temporarily binding or
chaining into our organic bodies and cultural minds, as well as by the time
scale of the binding operation. Given long enough time scales, it is the flow
of biomass through food webs that matters, as well as the flow of genes
through generations, and not the bodies and species that emerge in these
flows. Given long enough time scales, our languages are also momentary
slowing-downs or thickenings in a flow of norms that can give rise to a
multitude of different structures. The overall world-view that this "geological
philosophy" generates may be put into a nut shell by introducing some
special technical terminology.

First of all, the fact that meshworks and hierarchies occur mostly in mixtures,
makes it convenient to have a label to refer to these changing combinations.
If the hierarchical components of the mix dominate over the meshwork ones
we may speak of a highly stratified structure, while the opposite combination
will be referred to as one with a low degree of stratification. Moreover, since
meshworks give rise to hierarchies and hierarchies to meshworks, we may
speak of a given mixture as undergoing processes of destratification as well
as restratification, as its proportions of homogenous and heterogeneous
components change. Finally, since according to this way of viewing things
what truly defines the real world are neither uniform strata nor variable
meshworks but the unformed and unstructured morphogenetic flows from
which these two derive, it will also be useful to have a label to refer to this
special state of matter-energy-information, to this flowing reality animated
from within by self-organizing processes constituting a veritable non-organic
life: the Body without Organs (BwO):

"The organism is not at all the body, the BwO; rather it is a stratum on the
BwO, in other words, a phenomenon of accumulation, coagulation, and
sedimentation that, in order to extract useful labor from the BwO, imposes
upon it forms, functions, bonds, dominant and hierarchized organizations,
organized transcendences...the BwO is that glacial reality where the
alluvions, sedimentations, coagulations, foldings, and recoilings that
compose an organism -and also a signification and a subject- occur. " {17}

The label itself is, of course, immaterial and insignificant. We could as well
refer to this cauldron of non-organic life by a different name. (Elsewhere, for
instance, I called it the "machinic phylum"). {18} Unlike the name, however,
the referent of the label is of extreme importance, since the flows of lava,
biomass, genes, memes, norms, money (and many others) are crucial for
the emergence of just about any stable structure that we cherish and value
(or, on the contrary, that oppresses and enslaves us). We could define the
BwO in terms of these unformed, destratified flows, as long as we keep in
mind that what counts as destratified at any given time and space scale is
entirely relative. The flows of genes and biomass are "unformed" if we
compare them to any given individual organism, but they themselves have
internal forms and functions. Indeed, if instead of taking a planetary
perspective we adopted a properly cosmic viewpoint, our entire planet
(together with its flows) would itself be a mere provisional hardening in the
vast flows of plasma which permeate the universe.

Human history has involved a variety of Bodies without Organs. First, the
sun, that giant sphere of plasma whose intense flow of energy drives most
processes of self-organization in our planet, and in the form of grain and
fossil fuel, in our civilizations. Second, the body of lava "conveyor
belts" (convection cells) which drive plate tectonics and which are
responsible for the most general geopolitical features of our planet, such as
the breakdown of Pangea into our current continents, and the subsequent
distribution of domesticable species, a distribution that benefited Eurasia
over the rest of the world. Third, the BwO constituted by the coupled
dynamics of the Hydrosphere/Atmosphere, and their wild variety of self-
organized entities: hurricanes, tsunamis, pressure blocks, cyclones, and
wind circuits. (The conquest of the wind circuits of the Atlantic, the trade
winds and the westerlies, is what allowed the transformation of the American
continent into a vast supply zone to fuel the growth of the European urban
economy).

Fourth, the genetic BwO constituted by the more or less free flow of genes
through microorganisms (via plasmids and other vectors), which unlike the
more stratified genetic flow in animals and plants, has avoided human
control even after the creation of antibiotics. Fifth, those portions of the flow
of solar energy through ecosystems (flesh circulating in natural food webs)
which have escaped urbanization, particularly animal and vegetable weeds
or rhizomes (The BwO formed by underground rodent cities, for example).
Finally, our languages, when they formed dialect continua and
circumstances conspired to remove any stratifying pressure, also formed a
BwO, as when the Norman invaders imposed French as the language of the
elites allowing the peasant masses to create the English language out of an
amorphous soup of Germanic norms with Scandinavian spices.

Immanent to the BwO are a set of abstract machines, engineering diagrams
capturing the dynamics of certain structure-generating processes. The two
most general may be those behind the formation of strata and self-
consistent aggregates. But there are others. For instance, when the sorting
device is coupled with the ability to replicate with variation, a new abstract
machine emerges, this time a blind probe head or searching device, capable
of exploring a space of possible forms. These abstract machines may be
viewed as equipped with "knobs" controlling certain parameters which in turn
define the dynamical state for the structure-generating process, and hence,
the nature of the generated structures themselves. Key parameters include
those controlling the strength and thoroughness of the sorting process and
the degree of consolidation or reproductive isolation for the double-
articulation machine; or the degrees of temperature, pressure, volume,
speed, density, connectivity that play the role of control parameters
generating the stable states in meshworks; or the rates of mutation and
recombination which define the speed of the probe head, as well as the
strength of biomass flow and of the coupling between coevolving species,
which define the kind of space that the searching device explores.
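
The probe head itself, a sorting device coupled to replication with variation,
can be sketched with its knobs exposed as literal parameters. Everything in
the sketch below (the target form, the mutation rate, the population size) is
an arbitrary choice made only for illustration, not anything found in Deleuze
and Guattari: turning the mutation-rate knob changes the speed, and the
erratic character, of the search through the space of possible forms.

    import random

    random.seed(4)

    TARGET = "meshwork"                   # an arbitrary region of the space of forms
    ALPHABET = "abcdefghijklmnopqrstuvwxyz"
    MUTATION_RATE = 0.05                  # one knob: the strength of variation
    POPULATION = 60                       # another knob: the breadth of the search

    def fitness(candidate):               # the sorting criterion
        return sum(a == b for a, b in zip(candidate, TARGET))

    def mutate(candidate):                # replication with variation
        return "".join(random.choice(ALPHABET) if random.random() < MUTATION_RATE
                       else c for c in candidate)

    pool = ["".join(random.choice(ALPHABET) for _ in TARGET) for _ in range(POPULATION)]
    for generation in range(300):
        best = max(pool, key=fitness)     # the sorting device keeps the fittest form
        if best == TARGET:
            break
        pool = [mutate(best) for _ in range(POPULATION)]

    print(f"after {generation} generations the probe head reached '{best}'")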

Hence, using these abstract diagrams to represent what goes on in the BwO
is equivalent to using a system of representation in terms of intensities,
since it is ultimately the intensity of each parameter that determines the kind
of dynamic involved, and hence, the character of the structures that are
generated. Indeed, one way of picturing the BwO is as that "glacial" state of
matter-energy-information which results from turning all these knobs to
zero, that is, to the absolute minimum value of intensity, bringing all
production of structured form to a halt:

"A BwO is made in such a way that it can be occupied, populated only by
intensities. Only intensities pass and circulate. Still, the BwO is not a scene,
a place, or even a support upon which something comes to pass... It is not
space, nor is it in space; it is matter that occupies space to a given degree -
to the degree corresponding to the intensities produced. It is nonstratified,
unformed, intense matter, the matrix of intensity, intensity=0 ... Production of
the real as an intensive magnitude starting at zero." {19}

I must end now this brief exploration of what a Neo-Materialist interpretation
of the philosophy of Deleuze and Guattari would be like. No doubt much
detail has been left out. I like to think of my interpretation as a kind of
"pidginization" of their complex ideas, followed by a "creolization" along
original lines. It nevertheless retains its basic geological spirit, a
philosophical stance which rejects ideas of progress not only in human
history but in natural history as well. Living creatures, according to this
stance, are in no way "better" than rocks. Indeed, in a nonlinear world in
which the same basic processes of self-organization take place in the
mineral, organic and cultural spheres, perhaps rocks hold some of the keys
to understand sedimentary humanity, igneous humanity and all their
mixtures.

References:

{1} Herbert Simon. The Sciences of the Artificial. (MIT Press, 1994). p. 32-36

{2} Fernand Braudel. The Wheels of Commerce. (Harper and Row, New
York, 1986). p. 28-47

For the idea that "invisible hand" economics simply assumes that demand
and supply cancel each other out (i.e. that markets clear) without ever
specifying the dynamics that lead to this state see:

Philip Mirowski. More Heat than Light: Economics as Social Physics,
Physics as Nature's Economics. (Cambridge University Press, New York
1991). p. 238-241

Mirowski shows how the concept of the "invisible hand" was formalized in
the nineteenth century by simply copying the form of equilibrium
thermodynamics (hence, in his opinion, this branch of physics provided more
heat than light). He also warns that recent attempts to apply Prigogine's
theories to economics are doing the same thing, for example, assuming the
existence of attractors without specifying just what it is that is being
dissipated (i.e. only energetically dissipative or "lossy" systems have
attractors). See:

Philip Mirowski. From Mandelbrot to Chaos in Economic Theory. In Southern
Economic Journal, Vol. 57, October 1990. p. 302

{3} Gilles Deleuze and Felix Guattari. A Thousand Plateaus. (University of
Minnesota Press, Minneapolis, 1987). p. 335

"Stating the distinction in its more general way, we could say that it is
between stratified systems or systems of stratification on the one hand, and
consistent, self-consistent aggregates on the other... There is a coded
system of stratification wherever, horizontally, there are linear causalities
between elements; and, vertically, hierarchies of order between groupings;
and, holding it all together in depth, a succession of framing forms, each of
which informs a substance and in turn serves as a substance for another
form. [e.g. the succession pebbles-sedimentary rocks-folded mountains,
footnote#5 below]... On the other hand, we may speak of aggregates of
consistency when instead of a regulated succession of forms-substances we
are presented with consolidations of very heterogeneous elements, orders
that have been short-circuited or even reverse causalities, and captures
between materials and forces of a different nature... ".

{4} Harvey Blatt, Gerard Middleton, Raymond Murray. Origin of Sedimentary
Rocks. (Prentice Hall, New Jersey, 1972). p.102 and 353

{5} Gilles Deleuze and Felix Guattari. A Thousand Plateaus. op. cit. p. 41

Actually, here Deleuze and Guattari incorrectly characterize the two
articulations involved in rock-production as "sedimentation-folding". The
correct sequence is "sedimentation-cementation". Then, at a different spatial
scale , "cyclic sedimentary rock accumulation-folding into mountain". In other


words, they collapse two different double-articulations (one utilizing as its
starting point the products of the previous one), into one. I believe this
correction does not affect their underlying argument, and that indeed, it
strenghtens it.

{6} Niles Eldredge. Macroevolutionary Dynamics: Species, Niches and
Adaptive Peaks. (McGraw-Hill, New York 1989). p. 127

{7} S.N. Eisenstadt. Continuities and Changes in Systems of Stratification. In
Stability and Social Change. Bernard Barber and Alex Inkeles, eds. (Little
Brown, Boston 1971). p. 65

{8} ibid. p 66-71

{9} Humberto R. Maturana and Francisco J. Varela. The Tree of Knowledge.
The Biological Roots of Human Understanding. (Shambhala, Boston 1992).
p. 47 and 115.

Other researchers have discovered that as the loop adds new nodes it may
reach a critical threshold of complexity and undergo a bifurcation, a
transition to a new state where complexification accelerates. Since the
states to which a phase transition leads are in no way "directed" or
"progressive", changing and developing by crossing bifurcations is another
way of growing by drift.

{10} Ilya Prigogine and Isabelle Stengers. Order out of Chaos. op. cit. p. 147

{11} Francisco J. Varela. Two Principles of Self-Organization. In Self-
Organization and Management of Social Systems. H. Ulrich, G.J.B. Probst
eds. (Springer Verlag, Berlin 1984) p. 27

{12} Gilles Deleuze and Felix Guattari. A Thousand Plateaus. op.cit. p.329

{13} Michael Bisacre. Encyclopedia of the Earth's Resources. (Exeter
Books, New York 1984). p. 79

{14} Gilles Deleuze and Felix Guattari. ibid. p. 328

The authors constantly refer to catalysis in their theories of meshwork-like
structures (rhizomes, smooth spaces etc.). They sometimes tend to view
catalysis in terms of one specific (albeit very important) type of catalyst:
the allosteric enzymes discovered by Jacques Monod, which are like
programmable catalysts, with two heads:
"... what holds heterogeneities together without their ceasing to be
heterogeneous ... are intercallary oscillations, synthesizers with at least two
heads." ibid. 329

What is needed here is to make the notion of a "catalyst" more abstract so
that the specific functions of a chemical catalyst (to perform acts of
recognition via a lock and key mechanism, to accelerate or decelerate
chemical reactions) are not what matters, but the more general notion of
aiding growth "from within" or "from in-between". One step in this direction
has been taken by Arthur Iberall (a pioneer in the application of ideas from
nonlinear dynamics to human history), by defining catalytic activity as the
ability to force a dynamical system from one attractor to another. In the case
of a chemical catalyst the dynamical system would be the target molecule
(the one to be catalyzed) and the two stable states would be its "unreactive"
and "reactive" states, so that by switching them from one to another the
catalyst accelerates the reaction. See:
Arthur Iberall and Harry Soodak. A Physics for Complex Systems. In Self-
Organizing Systems. The Emergence of Order. Eugene Yates ed. (Plenum
Press, New York 1987). p. 509

Elsewhere, Iberall notes that in this sense, nucleation events and
dislocations may be considered to involve "acts of catalysis". Nucleation
refers to the process through which the structures which appear after a
phase transition (crystals just after the bifurcation to the solid state, for
example) consolidate and grow, as opposed to reverting back to the
previous state. (By crossing the bifurcation in the opposite direction).
Typically, something has to catalyze the growth of structure to a critical
mass (nucleation threshold) after which growth may proceed more or less
spontaneously. This "something" may be anything from a dust particle to a
defect in the container in which the crystallization is happening. If one
carefully removes all particles and defects one can indeed cool down a
liquid past the bifurcation point without crystallization taking place.
(Eventually, as we cool down further, even a microscopic thermal fluctuation
can act as a catalyst and trigger the nucleation). Dislocations, on the other
hand, are line defects within the body of the growing crystals which help
them grow by storing mechanical energy in their misaligned (hence
nonequilibrium) composing atoms. This stored energy allows them to
promote crystal growth by lowering nucleation thresholds. Thus, in this
abstract sense of "catalysis" the intercallary events involved in the creation
of igneous rocks are of the meshwork-generating type. On this see:

Arthur Iberall. Toward a General Science of Viable Systems. (McGraw-Hill,
1972). p. 208

But we can go further. Defined this way, "catalysis" becomes a true abstract
operation: anything that switches a dynamical system (an interacting
population of molecules, ants, humans or institutions) from one stable state
to another is literally a catalyst in this sense. Hence, we may use this
definition not only to move down from chemistry (the field of the literal
application of the term) to physics without metaphor, but also up, to biology,
sociology, linguistics. Cities and institutions, for example, would be
instantiations of this operator to the extent that they arise from matter-
energy flows and decision-making processes, but then react back on these
flows and processes to constrain them in a variety of ways (stimulating them
or inhibiting them). On the other hand, as Iberall himself notes, catalytic
constraints may combine with one another and form language-like systems.
Another physicist, Howard Pattee, has further elaborated the notion of
enzymes (organic catalysts) as syntactical constraints, operating on a
semantic world defined by its stable states.

On biological catalysts as syntactic constraints see:


Howard Pattee. Instabilities and Information in Biological Self-Organization.
In Self-Organizing Systems. op. cit. p. 334

{15} Gregoire Nicolis and Ilya Prigogine. Exploring Complexity. (W.H.
Freeman, New York 1989). p. 29

{16} Gilles Deleuze and Felix Guattari. ibid. p. 335

{17} Gilles Deleuze and Felix Guattari. A Thousand Plateaus. op. cit. p.159

{18} While the term "Body Without Organs" was first used in a philosophical
context by Deleuze (borrowing from Artaud), the almost synonymous
"machinic phylum" seems to have been coined and first used bt Guattari, in:

Felix Guattari. The Plane of Consistency. In Molecular Revolution. (Penguin
Books, New York 1984). p. 120

I do not claim that the two terms are strictly synonymous (although I myself
do use them that way). Rather what seems to be happening is that these
philosophers instead of building one theory are attempting to create a
meshwork of theories, that is, a set of partially overlapping theories. Hence,
key (near synonymous) concepts (BWO, phylum, smooth space, rhizome)
do not exactly coincide in meaning but are slightly displaced from one
another to create this overlapping effect. The point remains that it is the
referents of these labels that matter and not the labels themselves.

{19} ibid. p. 153

UNIFORMITY and VARIABILITY

An Essay in the Philosophy of Matter

by Manuel De Landa.

Uniformity and Variability: The development of the science and engineering
of materials in this century has many aspects which promise to enrich the
conceptual reservoir of the philosopher of matter. In this essay I would like
to explore a few of the philosophical issues raised by new developments in
materials science, particularly the new awareness of the importance of
studying the behaviour of matter in its full complexity. This awareness has,
in turn, resulted in part from the creation and experimentation with materials
which involve a heterogenous meshwork of components, such as fiberglass
and other composites, as opposed to the simpler and more predictable
behaviour of uniform, homogeneous materials such as industrial-quality
steel.

Cyril Stanley Smith, a metallurgist and an expert in the history of materials,
has explored the development of the philosophy of matter in the West, from
the ancient Greeks to the present day, and has concluded that for the most
part, the study of the complexity and variability of behaviour of materials has
always been the concern of empirically oriented craftsmen or engineers, not
of philosophers or scientists. In his own words:

"Through most of history, matter has been a concern of metaphysics more
than physics, and materials of neither. Classical physics at its best turned
matter into mass, while chemistry discovered the atom and lost interest in
properties...[In both metaphysical speculation and scientific research]
sensitivity to the wonderful diversity of real materials was lost, at first
because philosophical thought despised the senses, later because the . . .
new science could only deal with one thing at a time. It was atomistic, or
at least, simplistic, in its very essence." {1}

This author claims that by the time Greek philosophers like Democritus or
Aristotle developed their philosophies of matter, practically everything about
the behaviour of metals and alloys that could be explored with pre-industrial
technology was already known to craftsmen and blacksmiths. For at least a
thousand years before philosophers began their speculations, this
knowledge was developed on a purely empirical basis, through a direct
interaction with the complex behaviour of materials. Indeed, the early
philosophies of matter may have been derived from observation and
conversation with those "whose eyes had seen and whose fingers had felt
the intricacies of the behaviour of materials during thermal processing or as
they were shaped by chipping, cutting or plastic deformation." {2} For
instance, Aristotle's famous four elements, fire, earth, water and air, may be
said to reflect a sensual awareness of what today we know as energy and
the three main states of aggregation of matter, the solid, liquid and gas
states.

As metaphysical speculation gave special meanings to these four
elementary qualities, their original physical meaning was lost, and the
variability and complexity of real materials was replaced with the uniform
behaviour of a philosophically simplified matter about which one could only
speculate symbolically. It is true that sixteenth-century alchemists recovered
a certain respect for a direct interaction with matter and energy, and that
seventeenth-century Cartesian philosophers intensely speculated about the
variable properties of different ways of aggregating material components.
But these early attempts at capturing the complexity of physical
transmutations and of the effect of physical structure on the complex
properties of materials eventually lost out to the emergent science of
chemistry, and its almost total concentration on simple behaviour: that of
individual components (such as Lavoisier's oxygen) or of substances that
conform to the law of definite proportions (as in Dalton's atomic theory).

There was, as Cyril Stanley Smith observes, an "immense gain" in these
simplifications, since the exact sciences could not have developed without
them, but the triumph of chemistry was accompanied by a "not insignificant
loss". In particular, the complete concentration of analysis at the level of
molecules caused an almost total disregard for higher levels of aggregation
in solids, but it is there where most complex properties of interest to today's
materials scientist occur. {3} As is usual in the history of science, there
were several exceptions. Galileo studied the strength of materials in the
sixteenth century, and in the seventeenth, while Newton was reducing the
variability of material behaviour to questions of mass, his arch-enemy
Robert Hooke was developing the first theory of elasticity. As materials
scientist James Edward Gordon has remarked, "unlike Newton, Hooke was
intensely interested in what went on in kitchens, dockyards, and buildings -
the mundane mechanical arenas of life...Nor did Hooke despise craftsmen,
and he probably got the inspiration for at least some of his ideas from his
friend the great London clockmaker Thomas Tompion...". {4} Despite the
important exceptions, I believe it is fair to say that, at least in England, much
more prestige was attached to scientific fields that were not concerned with
these mundane mechanical arenas where materials displayed their full
complex behaviour. This may be one reason why conceptual advances in
the study of materials, such as the key conceptual distinction between stress
and strain (one referring to the forces acting on a material structure, the other
to the behaviour of the structure in response to those forces), were made in
France where applied science was encouraged both officially and socially.
{5}

James Gordon has called the study of the strength of materials the
Cinderella of science, partly because much of the knowledge was developed
by craftsmen, metallurgists and engineers (that is, the flow of ideas often ran
from the applied to the pure fields), and partly because by its very nature,
the study of materials involved an interaction between many scientific
disciplines, an interdisciplinary approach which ran counter to the more
prestigious tradition of "pure" specialization. {6} Today, of course, the
interdisciplinary study of complexity, not only in materials but in many other
areas of science, from physics and ecology to economics, is finally taking its
place at the cutting-edge of scientific research. We are beginning to
understand that any complex system, whether composed of interacting
molecules, organic creatures or economic agents, is capable of
spontaneously generating order and of actively organizing itself into new
structures and forms. It is precisely this ability of matter and energy to self-
organize that is of greatest significance to the philosopher. Let me illustrate
this with an example from materials science.

Long ago, practical metallurgists understood that a given piece of metal can
be made to change its behaviour, from ductile and tough to strong and
brittle, by hammering it while cold. The opposite transmutation, from hard to
ductile, could also be achieved by heating the piece of metal again and then
allowing it to cool down slowly (that is, by annealing it). Yet, although
blacksmiths knew empirically how to cause these metamorphoses, it was
not until a few decades ago that scientists understood their actual microscopic
mechanism. As it turns out, explaining the physical basis of ductility involved
a radical conceptual change: scientists had to stop viewing metals in static
terms, that is, as deriving their strength in a simple way from the chemical
bonds between their composing atoms, and begin seeing them as
dynamical systems. In particular, the real cause of brittleness in rigid
materials, and the reason why ductile ones can resist being broken, has to
do with the complex dynamics of spreading cracks.

A crack or fracture needs energy to spread through a piece of material and
so any mechanism that takes away energy from the crack will make the
material tough. In metals, the mechanism seems to be based on certain
defects or imperfections within the component crystals called dislocations.
Dislocations not only trap energy locally but moreover, are highly mobile and
may be brought into existence in large quantities by the very concentrations
of stress which tend to break a piece of material. Roughly, if populations of
these line defects are free to move in a material they will endow it with the
capacity to yield locally without breaking, that is, they will make the material
tough. On the other hand, restricted movement of dislocations will result in a
stronger but more brittle material. {7} Both of these properties may be
desirable for different tools, and even within one and the same tool: in a
sword or knife, for instance, the body must be tough while the cutting edge
must be strong.

What matters from the philosophical point of view is precisely that toughness
or strength are emergent properties of a metallic material that result from the
complex dynamical behaviour of some of its components. An even deeper
philosophical insight is related to the fact that the dynamics of populations of
dislocations are very closely related to the population dynamics of very
different entities, such as molecules in a rhythmic chemical reaction,
termites in a nest-building colony, and perhaps even human agents in a
market. In other words, despite the great difference in the nature and
behaviour of the components, a given population of interacting entities will
tend to display similar collective behaviour as long as there is some
feedback in the interactions between components (that is, the interactions
must be nonlinear) and as long as there is an intense enough flow of energy
rushing through the system (that is, the population in question must operate
far from thermodynamic equilibrium). As I will argue in a moment, the idea
that many different material and energetic systems may have a common
source of spontaneous order is now playing a key role in the development of
a new philosophy of matter. But for materials scientists this commonality of
behaviour is of direct practical significance since it means that as they begin
to confront increasingly more complex material properties, they can make
use of tools coming from nonlinear dynamics and nonequilibrium
thermodynamics, tools that may have been developed to deal with completly
different problems. In the words of one author:

". . . during the last years the whole field of materials science and related
technologies has experienced a complete renewal. Effectively, by using
techniques corresponding to strong nonequilibrium conditions, it is now
possible to escape from the constraints of equilibrium thermodynamics and
to process totally new material structures including different types of
glasses, nano- and quasi-crystals, superlaticces . . . As materials with
increased resistance to fatigue and fracture are sought for actual
applications, a fundamental understanding of the collective behaviour of
dislocations and point defects is highly desirable. Since the usual
thermodynamic and mechanical concepts are not adapted to describe those
situations, progress in this direction should be related to the explicit use of
genuine nonequilibrium techniques, nonlinear dynamics and instability
theory". {8}

Thus, to the extent that the self-organizing behaviour of populations of
dislocations within ductile metals is basically similar to the spontaneous
collective behaviour in other populations, tools and concepts developed in
very different disciplines may apply across the board, and this may help
legitimize the intrinsically interdisciplinary approach of materials science. As I
just said, however, the common behaviour of different collectivities in
nonlinear, nonequilibrium conditions is of even greater importance to the
philosopher of matter. This is very clear in the philosophy of Gilles Deleuze
and Felix Guattari, who are perhaps the most radical contemporary
representatives of this branch of philosophy. Inspired in part by some early
versions of complexity theory (e.g. Rene Thom's catastrophe theory, and the
theories of technology of Gilbert Simondon), these authors arrived at the idea
that all structures, whether natural or social, are indeed different expressions
of a single matter-energy behaving dynamically, that is, matter-energy in
flux, to which they have given the name of "machinic phylum". In their words:
". . . the machinic phylum is materiality, natural or artificial, and both
simultaneously; it is matter in movement, in flux, in variation. . ." {9}

The term "phylum" is used in biology to refer to the common body-plan of


many different creatures. Human beings, for example, belong to the phylum
"chordata", as do all other vertebrate animals. The basic idea is that of a
common source of form, a body-plan which through different foldings and
stretchings during embryological development, is capable of generating a
wide variety of specific forms, from snakes, to giraffes to humans. Deleuze
and Guattari, aware that nonlinear population processes are common not
only to animals and plants but to metals and other inorganic materials, have
extended this meaning to refer to a common source of spontaneoulsy
generated form across all material entities. I began this essay by quoting the
opinion of a metallurgist, Cyril Stanley Smith, on the historical importance of
sensually aquired knowledge about the complex behaviour of metals and
other materials. And indeed, in Deleuze and Guattariis philosophy of matter,
metallurgists play an important role:

". . . what metal and metallurgy bring to light is a life proper to matter, a vital
state of matter as such, a material vitalism that doubtless exists everywhere
but is ordinarly hidden or covered, rendered unrecognizable . . . Metallurgy
is the consciousness or thought of the matter-flow, and metal the correlate
of this consciousness. As expressed in panmetallism, metal is coextensive
to the whole of matter, and the whole of matter to metallurgy. Even the
waters, the grasses and varieties of wood, the animals are populated by

http://www.t0.or.at/delanda/matterdl.htm (6 z 11)2006-02-23 23:42:27


Zero News Datapool, MANUEL DE LANDA, Uniformity and Variability

salts or mineral elements. Not everything is metal, but metal is


everywhere. . . The machinic phylum is metallurgical, or at least has a
metallic head, as its itinerant probe-head or guidance device." {9}

One aspect of the definition of the machinic phylum is of special interest to
our discussion of contemporary materials science. Not only is the phylum
defined in dynamic terms (that is, as matter in motion) but also as "matter in
continuous variation". Indeed, these philosophers define the term "machinic"
precisely as the process through which structures can be created by
bringing together heterogeneous materials, that is, by articulating the diverse
as such, without homogenization. In other words, the emphasis here is not
only on the spontaneous generation of form, but on the fact that this
morphogenetic potential is best expressed not by the simple and uniform
behaviour of materials, but by their complex and variable behaviour. In this
sense, contemporary industrial metals, such as mild steel, may not be the
best illustration of this new philosophical conception of matter. While
naturally occurring metals contain all kinds of impurities that change their
mechanical behaviour in different ways, steel and other industrial metals
have undergone in the last two hundred years an intense process of
uniformization and homogenization in both their chemical composition and
their physical structure. The rationale behind this process was partly based
on questions of reliability and quality control, but it also had a social
component: both human workers and the materials they used needed to be
disciplined and their behaviour made predictable. Only then could the full
efficiencies and economies of scale of mass-production techniques be
realized. But this homogenization also affected the engineers who designed
structures using these well-disciplined materials. In the words of James E.
Gordon:

"The widespread use of steel for so many purposes in the modern world is
only partly due to technical causes. Steel, especially mild steel, might
euphemistically be described as a material that facilitates the dilution of
skills. . . Manufacturing processes can be broken down into many separate
stages, each requiring a minimum of skill or intelligence. . . At a higher
mental level, the design process becomes a good deal easier and more
foolproof by the use of a ductile, isotropic, and practically uniform material
with which there is already a great deal of accumulated experience. The

http://www.t0.or.at/delanda/matterdl.htm (7 z 11)2006-02-23 23:42:27


Zero News Datapool, MANUEL DE LANDA, Uniformity and Variability

design of many components, such as gear wheels, can be reduced to a


routine that can be looked up in handbooks." {10}

Gordon sees in the spread of the use of steel in the late nineteenth and early
twentieth centuries a double danger for the creativity of structural designers.
The first danger is the idea that a single, universal material is good for all
different kinds of structure, some of which may be supporting loads in
compression, some in tension, some withstanding shear stresses and others
torsional stresses. But as Gordon points out, given that the roles which a
structure may play can be highly heterogeneous, the repertoire of materials
that a designer uses should reflect this complexity. On the other hand, he
points out that, much as in the case of biological materials like bone, new
designs may involve structures with properties that are in continuous
variation, with some portions of the structure better able to deal with
compression while others deal with tension. Intrinsically heterogeneous
materials, such as fiberglass and the newer hi-tech composites, afford
designers this possibility. As Gordon says, "it is scarcely practicable to
tabulate elaborate sets of "typical mechanical properties" for the new
composites. In theory, the whole point of such materials is that, unlike
metals, they do not have "typical" properties, because the material is
designed to suit not only each individual structure, but each place in that
structure." {11}

I do not mean to imply that there are no legitimate roles to be played by
homogeneous materials with simple and predictable behaviour, such as
bearing loads in compression. And similarly for the institutional and
economic arrangements that were behind the quest for uniformity: the
economies of scale achieved by routinizing production and some design
tasks were certainly very significant. As with the already mentioned
homogenizations performed by scientists in their conceptions of matter,
there were undoubtedly some gains. The question is what got lost in the
process. I can think of several things.

First, the nineteenth-century process of transferring skills from the human
worker to the machine, and the task of homogenizing metallic behaviour,
went hand in hand. As Cyril Stanley Smith remarks, "The craftsman can
compensate for differences in the qualities of his material, for he can adjust
the precise strength and pattern of application of his tools to the material's
local vagaries. Conversely, the constant motion of a machine requires
constant materials." {12} If it is true, as I said at the beginning of this essay,
that much of the knowledge about the complex behaviour of materials was
developed outside science by empirically oriented individuals, the deskilling
of craftsmen that accompanied mechanization may be seen as involving a
loss of at least part of that knowledge, since in many cases empirical know-
how is stored in the form of skills.

Second, as I just said, not only was the production process routinized in this
way; so, to a lesser extent, was the design process. Many professionals who
design load-bearing structures lost their ability to design with materials that
are not isotropic, that is, that do not have identical properties in all
directions. But it is precisely those abilities to deal with complex,
continuously variable behaviour that are now needed to design structures
with the new composites. Hence, we may need to nurture again our ability to
deal with variation as a creative force, and to think of structures that
incorporate heterogeneous elements as a challenge to be met by innovative
design.

Third, the quest for uniformity in human and metallic behaviour went beyond
the specific disciplinary devices used in assembly-line factories. Many other
things became homogenized in the last few centuries. To give only two
examples: the genetic materials of our farm animals and crops have become
much more uniform, at first due to the spread of the "pedigree mystique",
and later in this century, through the development and diffusion of miracle
crops, like hybrid corn. Our linguistic materials also became more uniform as
the meshworks of heterogeneous dialects which existed in most countries
began to yield to the spread of standard languages, through compulsory
education systems and the effects of mass media. As before, the question is
not whether we achieved some efficiencies through genetic and linguistic
standardization. We did. The problem is that in the process we came to view
heterogeneity and variation as something to be avoided, as something
pathological to be cured or uprooted, since it endangered the unity of the
nation state.

Finally, as Deleuze and Guattari point out, the nineteenth-century quest for
uniformity may have had damaging effects on the philosophy of matter by
making the machinic phylum effectively unrecognizable. As the behaviour of
metals and other mineral materials became routine, and hence
unremarkable, philosophical attention became redirected to the more
interesting behaviour of living creatures, as in early twentieth-century forms of
vitalism, and later on, to the behaviour of symbols, discourses and texts, in
which any consideration of material or energetic factors was completely lost.
Today, thanks in part to the new theories of self-organization that have
revealed the potential complexity of behaviour of even the humbler forms of
matter-energy, we are beginning to recover a certain philosophical respect for
the inherent morphogenetic potential of all materials. And we may now be in
a position to think about the origin of form and structure, not as something
imposed from the outside on an inert matter, not as a hierarchical command
from above as in an assembly line, but as something that may come from
within the materials, a form that we tease out of those materials as we allow
them to have their say in the structures we create.

References:
1) Cyril Stanley Smith. Matter Versus Materials: A Historical View. In A Search for Structure. (MIT Press, 1992). p. 115
2) ibid. p. 115
3) ibid. p. 120 and 121
4) James Edward Gordon. The Science of Structures and Materials. (Scientific American Library, 1988). p. 18
5) ibid. p. 21 and 22
6) ibid. p. 3
7) ibid. p. 111
8) D. Walgraef. Pattern Selection and Symmetry Competition in Materials Instabilities. In New Trends in Nonlinear Dynamics and Pattern-Forming Phenomena. Pierre Coullet and Patrick Huerre eds. (Plenum Press, 1990). p. 26
9) Gilles Deleuze and Felix Guattari. A Thousand Plateaus. (University of Minnesota Press, 1980). p. 409
10) James Edward Gordon. op. cit. p. 135
11) ibid. p. 200
12) Cyril Stanley Smith. ibid. p. 313


MESHWORKS, HIERARCHIES AND INTERFACES

by Manuel De Landa

The world of interface design is today undergoing dramatic changes which


in their impact promise to rival those brought about by the use of the point-
and-click graphical interfaces popularized by the Macintosh in the early
1980's. The new concepts and metaphors which are aiming to replace the
familiar desk-top metaphor all revolve around the notion of semi-
autonomous, semi-intelligent software agents. To be sure, different
researchers and commercial companies have divergent conceptions of what
these agents should be capable of, and how they should interact with
computer users. But whether one aims to give these software creatures the
ability to learn about the user's habits, as in the non-commercial research
performed at MIT's Autonomous Agents Group, or to endow them with the
ability to perform transactions in the user's name, as in the commercial
products pioneered by General Magic, the basic thrust seems to be in the
direction of giving software programs more autonomy in their decision-
making capabilities.

For a philosopher there are several interesting issues involved in this new
interface paradigm. The first one has to do with the history of the software
infrastructure that has made this proliferation of agents possible. From the
point of view of the conceptual history of software, the creation of worlds
populated by semi-autonomous virtual creatures, as well as the more
familiar world of mice, windows and pull-down menus, have been made
possible by certain advances in programming language design. Specifically,
programming languages needed to be transformed from the rigid hierarchies
which they were for many years, to the more flexible and decentralized
structure which they gradually adopted as they became more "object-

http://www.t0.or.at/delanda/meshwork.htm (1 z 12)2006-02-23 23:42:51


Zero News Datapool, MANUEL DE LANDA, MESHWORKS,HIERARCHIES AND INTERFACES

oriented". One useful way to picture this transformation is as a migration of


control from a master program (which contains the general task to be
performed) to the software modules which perform all the individual tasks.
Indeed, to grasp just what is at stake in this dispersal of control, I find it
useful to view this change as a part of a larger migration of control from the
human body, to the hardware of the machine, then to the software, then to
the data and finally to the world outside the machine. Since this is a crucial
part of my argument let me develop it in some detail.

The first part of this migration, when control of machine-aided processes
moved from the human body to the hardware, may be said to have taken
place in the eighteenth century, when a series of inventors and builders of
automata created the elements which later came together in the famous
Jacquard loom, a machine which automated some of the tasks involved in
weaving patterns in textiles. Jacquard's loom used a primitive form of
software, in which holes punched into cards coded for some of the
operations behind the creation of patterned designs. {1} This software,
however, contained only data and not control structures. In other words, all
that was coded in the punched cards was the patterns to be woven, and not
any directions to alter the reading of the cards or the performance of the
operations, such as the lifting of the warp threads. It was therefore in the
machine's hardware component, which "read" the cards and translated the
data into motion, that control of the process resided. Textile workers at
the time were fully aware that they had lost some control to Jacquard's loom,
and they manifested their outrage by destroying the machines on several
occasions.
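A rough sketch of the arrangement described here, with invented pattern data; the point is only that the "cards" hold data while the fixed loop plays the role of the hardware that alone decides how they are read:

```python
# Minimal sketch of "control in hardware, data in software": the card deck holds only
# pattern data; the fixed loop below stands in for the loom's hardware, which alone
# decides how the cards are read and acted upon. Purely illustrative, not a loom model.

cards = [
    [1, 0, 1, 1, 0, 1, 0, 1],   # each card: which warp threads to lift for one row
    [0, 1, 0, 1, 1, 0, 1, 0],
    [1, 1, 0, 0, 1, 1, 0, 0],
]

def weave(deck):
    """The 'hardware': a fixed cycle that reads each card and lifts threads accordingly.
    Nothing on a card can alter this cycle -- the cards contain data, not control."""
    for row, card in enumerate(deck):
        lifted = [i for i, hole in enumerate(card) if hole]
        print(f"row {row}: lift warp threads {lifted}")

weave(cards)
```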

The idea of coding data into punched cards spread slowly during the 1800's,
and by the beginning of our century it had found its way into computing
machinery, first the tabulators used by Hollerith to process the 1890 United
States census, then into other tabulators and calculators. In all these cases
control remained embodied in the machine's hardware. One may go as far
as saying that even the first modern computer, the imaginary computer
created by Alan Turing in the 1930's, still kept control in the hardware, the
scanning head of the Turing machine. The tape that his machine scanned
held nothing but data. But this abstract computer already had the seed of
the next step, since as Turing himself understood, the actions of the
scanning head could themselves be represented by a table of behavior, and
the table itself could now be coded into the tape. Even though people may
not have realized this at the time, coding both numbers and operations on
numbers side by side on the tape was the beginning of computer software
as we know it. {2} When in the 1950's Turing created the notion of a
subroutine, that is, the notion that the tasks that a computer must perform
can be embodied into separate sub-programs all controlled by a master
program residing in the tape, the migration of control from hardware to
software became fully realized. From then on, computer hardware became
an abstract mesh of logical gates, its operations fully controlled by the
software.
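A toy illustration of that shift, assuming an invented instruction set: the "tape" (memory) holds operations and numbers side by side, and the hardware is reduced to a dumb fetch-and-dispatch loop:

```python
# A toy stored-program machine, sketching the migration of control from hardware to
# software. The instruction names are invented for illustration only.

memory = [
    ("LOAD", 7),      # put 7 in the accumulator
    ("ADD", 5),       # add 5
    ("PRINT", None),  # print the accumulator
    ("HALT", None),
]

def run(mem):
    acc, pc = 0, 0                  # accumulator and program counter
    while True:
        op, arg = mem[pc]           # the "hardware" only fetches and dispatches;
        pc += 1                     # what gets done is decided by the tape itself
        if op == "LOAD":
            acc = arg
        elif op == "ADD":
            acc += arg
        elif op == "PRINT":
            print(acc)
        elif op == "HALT":
            return

run(memory)   # prints 12
```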

The next step in this migration took place when control of a given
computational process moved from the software to the very data that the
software operates on. For as long as computer languages such as
FORTRAN or Pascal dominated the computer industry, control remained
hierarchically embedded in the software. A master program would surrender
control to a subroutine whenever that sub-task needed to be performed,
and the subroutine itself might pass control to an even more basic subroutine.
But the moment the specific task was completed, control would move up the
hierarchy until it reached the master program again. Although this
arrangement remained satisfactory for many years, and indeed, many
computer programs are still written that way, more flexible schemes were
needed for some specific, and at the time, esoteric applications of
computers, mostly in Artificial Intelligence.

Trying to build a robot using a hierarchy of subroutines meant that


researchers had to completely foresee all the tasks that a robot would need
to do and to centralize all decision-making into a master program. But this,
of course, would strongly limit the responsiveness of the robot to events
occurring in its surroundings, particularly if those events diverged from the
predictions made by the programmers. One solution to this was to
decentralize control. The basic tasks that a robot had to perform were still
coded into programs, but unlike subroutines these programs were not
commanded into action by a master program. Instead, these programs were
given some autonomy and the ability to scan the data base on their own.
Whenever they found a specific pattern in the data, they would perform
whatever task they were supposed to do. In a very real sense, it was now
the data itself that controlled the process. And, more importantly, if the data
base was connected to the outside world via sensors, so that patterns of
data reflected patterns of events outside the robot, then the world itself was
now controlling the computational process, and it was this that gave the
robot a degree of responsiveness to its surroundings.
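A bare sketch of this decentralization, with invented module and data names: no master program decides what runs next; each small program watches the shared data base and fires itself when its pattern appears:

```python
# Sketch of data-driven control: instead of a master program calling subroutines in a
# fixed order, each module watches the data base and acts when its pattern shows up.
# Module names and data fields are illustrative assumptions.

world = {"obstacle_ahead": True, "battery_low": False, "goal_visible": False}

def avoid(data):                 # each module pairs a pattern to watch with an action
    if data["obstacle_ahead"]:
        print("turning away from obstacle")
        data["obstacle_ahead"] = False

def recharge(data):
    if data["battery_low"]:
        print("heading to charger")
        data["battery_low"] = False

def approach_goal(data):
    if data["goal_visible"]:
        print("moving toward goal")

modules = [avoid, recharge, approach_goal]

# No hierarchy of commands: the contents of the data base select what actually happens.
for module in modules:
    module(world)
```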

Thus, machines went from being hardware-driven, to being software-driven,


then data-driven and finally event-driven. Your typical Macintosh computer is
indeed an event-driven machine even if the class of real world events that it
is responsive to is very limited, including only events happening to the
mouse (such as position changes and clicking) as well as to other input
devices. But regardless of the narrow class of events that personal
computers are responsive to, it is in these events that much of the control of
the processes now resides. Hence, behind the innovative use of windows,
icons, menus and the other familiar elements of graphical interfaces, there is
this deep conceptual shift in the location of control which is embodied in
object-oriented languages. Even the new interface designs based on semi-
autonomous agents were made possible by this decentralization of control.
Indeed, simplifying a little, we may say that the new worlds of agents,
whether those that inhabit computer screens or more generally, those that
inhabit any kind of virtual environment (such as those used in Artificial Life),
have been the result of pushing the trend away from software command
hierarchies ever further.
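As a rough sketch of what "event-driven" means here, with invented event names and handlers: control resides in the stream of outside events, which select the code that runs:

```python
# A stripped-down event loop. The event types and handlers are illustrative assumptions,
# not a model of any particular windowing system.

from collections import deque

handlers = {}                                   # event type -> list of handlers

def on(event_type, handler):
    handlers.setdefault(event_type, []).append(handler)

on("mouse_move",  lambda pos: print(f"pointer now at {pos}"))
on("mouse_click", lambda pos: print(f"clicked at {pos}"))

events = deque([("mouse_move", (10, 20)), ("mouse_click", (10, 20))])

while events:                                   # the loop itself decides nothing:
    kind, payload = events.popleft()            # incoming events pick the code to run
    for handler in handlers.get(kind, []):
        handler(payload)
```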

The distinction between centralized and decentralized control of a given
process has come to occupy center-stage in many different contemporary
philosophies. It will be useful to summarize some of these philosophical
currents before I continue my description of agent-based interfaces, since
this will reveal that the paradigm-shift is by no means confined to the area of
software design. Economist and Artificial Intelligence guru Herbert Simon
views bureaucracies and markets as the human institutions which best
embody these two conceptions of control. {3} Hierarchical institutions are the
easiest ones to analyze, since much of what happens within a bureaucracy
is planned by someone of higher rank, and the hierarchy as a whole has
goals and behaves in ways that are consistent with those goals. Markets, on
the other hand, are tricky. Indeed, the term "market" needs to be used with

http://www.t0.or.at/delanda/meshwork.htm (4 z 12)2006-02-23 23:42:51


Zero News Datapool, MANUEL DE LANDA, MESHWORKS,HIERARCHIES AND INTERFACES

care because it has been greatly abused over the last century by theorists
on the left and the right. As Simon remarks, the term does not refer to the
world of corporations, whether monopolies or oligopolies, since in these
commercial institutions decision-making is highly centralized, and prices are
set by command.

I would indeed limit the sense of the term even more, to refer exclusively to
those weekly gatherings of people at a predefined place in town, and not to
a dispersed set of consumers catered to by a system of middlemen (as when
one speaks of the "market" for personal computers). The reason is that, as
historian Fernand Braudel has made clear, it is only in markets in the first
sense that we have any idea of what the dynamics of price formation are. In
other words, it is only in peasant and small-town markets that decentralized
decision-making leads to prices setting themselves up in a way that we can
understand. In any other type of market economists simply assume that
supply and demand connect to each other in a functional way, but they do
not give us any specific dynamics through which this connection is effected.
{4} Moreover, unlike the idealized version of markets guided by an "invisible
hand" to achieve an optimal allocation of resources, real markets are not in
any sense optimal. Indeed, like most decentralized, self-organized
structures, they are merely viable, and since they are not hierarchical they
have no goals, and grow and develop mostly by drift. {5}

Herbert Simon's distinction between command hierarchies and markets may


turn out to be a special case of a more general dichotomy. In the view of
philosophers Gilles Deleuze and Felix Guattari, these more abstract classes,
which they call strata and self-consistent aggregates (or trees and
rhizomes), are defined not so much by the locus of control as by the nature
of the elements that are connected together. Strata are composed of
homogeneous elements, whereas self-consistent aggregates articulate
heterogeneous elements as such. {6} For example, a military hierarchy sorts
people into internally homogenous ranks before joining them together
through a chain of command. Markets, on the other hand, allow for a set of
heterogeneous needs and offers to become articulated through the price
mechanism, without reducing this diversity. In biology, species are an
example of strata, particularly if selection pressures have operated
unobstructedly for long periods of time allowing the homogenization of the
species gene pool. On the other hand, ecosystems are examples of self-
consistent aggregates, since they link together into complex food webs a
wide variety of animals and plants, without reducing their heterogeneity. I
have developed this theory in more detail elsewhere, but for our purposes
here let's simply keep the idea that besides centralization and
decentralization of control, what defines these two types of structure is the
homogeneity or heterogeneity of their composing elements.

Before returning to our discussion of agent-based interfaces, there is one


more point that needs to be stressed. As both Simon and Deleuze and
Guattari emphasize, the dichotomy between bureaucracies and markets, or
to use the terms that I prefer, between hierarchies and meshworks, should
be understood in purely relative terms. In the first place, in reality it is hard to
find pure cases of these two structures: even the most goal-oriented
organization will still show some drift in its growth and development, and
most markets even in small towns contain some hierarchical elements, even
if it is just the local wholesaler which manipulates prices by dumping (or
withdrawing) large amounts of a product on (or from) the market. Moreover,
hierarchies give rise to meshworks and meshworks to hierarchies. Thus,
when several bureaucracies coexist (governmental, academic, ecclesiastic),
and in the absence of a super-hierarchy to coordinate their interactions, the
whole set of institutions will tend to form a meshwork of hierarchies,
articulated mostly through local and temporary links. Similarly, as local
markets grow in size, as in those gigantic fairs which have taken place
periodically since the Middle Ages, they give rise to commercial hierarchies,
with a money market on top, a luxury goods market underneath and, after
several layers, a grain market at the bottom. A real society, then, is made of
complex and changing mixtures of these two types of structure, and only in
a few cases will it be easy to decide to what type a given institution belongs.

A similar point may be made about the worlds inhabited by software agents.
The Internet, to take the clearest example first, is a meshwork which grew
mostly by drift. No one planned either the extent or the direction of its
development, and indeed, no one is in charge of it even today. The Internet,
or rather its predecessor, the Arpanet, acquired its decentralized structure
because of the needs of U.S. military hierarchies for a command and
communications infrastructure which would be capable of surviving a
nuclear attack. As analysts from the Rand Corporation made clear, only if
the routing of the messages was performed without the need for a central
computer could bottlenecks and delays be avoided and, more importantly,
could the meshwork put itself back together once a portion of it had been
vaporized in a nuclear strike. But in the Internet only the decision-making behind
routing is of the meshwork type. Decision-making regarding its two main
resources, computer (or CPU) time and memory, is still hierarchical.

Schemes to decentralize this aspect do exist, as in Drexler's Agoric
Systems, where the messages which flow through the meshwork have
become autonomous agents capable of trading among themselves both
memory and CPU time. {7} The creation by General Magic of its Telescript
system, and of agents able to perform transactions on behalf of
users, is one of the first real-life steps in the direction of a true
decentralization of resources. But in the meanwhile, the Internet will remain
a hybrid of meshwork and hierarchy components, and the imminent entry of
big corporations into the network business may in fact increase the amount
of command components in its mix.
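A toy auction in the spirit of the agoric idea mentioned above, in which agents bid against each other for scarce CPU time; this is only a sketch of the general notion, with invented agents, budgets and bidding rules, not the design described by Miller and Drexler:

```python
# Toy resource auction: agents bid for CPU time slices out of their budgets.
# All names, budgets and the bidding rule are illustrative assumptions.

agents = {"renderer": 5.0, "indexer": 2.5, "mailer": 1.0}   # agent -> budget

def auction_slice(bids):
    """Award one CPU slice to the highest bidder and charge its bid."""
    winner = max(bids, key=bids.get)
    return winner, bids[winner]

budgets = dict(agents)
for t in range(3):
    bids = {name: budgets[name] * 0.5 for name in budgets}  # each bids half its budget
    winner, price = auction_slice(bids)
    budgets[winner] -= price
    print(f"slice {t}: {winner} wins at {price:.2f}")
```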

These ideas are today being hotly debated in the field of interface design.
The general consensus is that interfaces must become more intelligent to be
able to guide users in the tapping of computer resources, both the
informational wealth of the Internet, as well as the resources of ever more
elaborate software applications. But if the debaters agree that interfaces
must become smarter, and even that this intelligence will be embodied in
agents, they disagree on how the agents should acquire their new
capabilities. The debate pits two different traditions of Artificial Intelligence
against each other: Symbolic AI, in which hierarchical components
predominate, against Behavioral AI, where the meshwork elements are
dominant. Basically, while in the former discipline one attempts to endow
machines with intelligence by depositing a homogenous set of rules and
symbols into a robot's brain, in the latter one attempts to get intelligent
behavior to emerge from the interactions of a few simple task-specific
modules in the robot's head, and the heterogeneous affordances of its
environment. Thus, to build a robot that walks around a room, the first
approach would give the robot a map of the room, together with the ability to
reason about possible walking scenarios in that model of the room. The
second approach, on the other hand, endows the robot with a much simpler
set of abilities, embodied in modules that perform simple tasks such as
collision-avoidance, and walking-around-the-room behavior emerges from
the interactions of these modules and the obstacles and openings that the
real room affords the robot as it moves.{8}

Translated to the case of interface agents, for instance, personal assistants


in charge of aiding the user to understand the complexities of particular
software applications, Symbolic AI would attempt to create a model of the
application as well as a model of the working environment, including a
model of an idealized user, and make these models available in the form of
rules or other symbols to the agent. Behavioral AI, on the other hand, gives
the agent only the ability to detect patterns of behavior in the actual user,
and to interact with the user in different ways so as to learn not only from his
or her actual behavior but also from feedback that the user gives it. For
example, the agent in question would be constantly looking over the user's
shoulder, keeping track of whatever regular or repetitive patterns it
observes. It then attempts to establish statistical correlations between
certain pairs of actions that tend to occur together. At some point the agent
suggests to the user the possibility of automating these actions, that is, that
whenever the first occurs, the second should be automatically performed.
Whether the user accepts or refuses, this gives feedback to the agent. The
agent may also solicit feedback directly, and the user may also teach the
agent by giving some hypothetical examples. {9}
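A minimal sketch of the learning mechanism just described, under invented thresholds and action names: the agent counts how often one action follows another and proposes automating a recurring pair, treating the user's answer as feedback:

```python
# Sketch of a Behavioral-AI-style interface agent: it watches consecutive user actions,
# counts recurring pairs, and suggests automating them. Thresholds, action names and the
# feedback routine are illustrative assumptions, not the cited systems.

from collections import Counter

class InterfaceAgent:
    def __init__(self, min_count=3):
        self.pair_counts = Counter()
        self.last_action = None
        self.min_count = min_count
        self.automated = set()

    def observe(self, action):
        """Record an action and, if a pair recurs often enough, suggest automating it."""
        if self.last_action is not None:
            pair = (self.last_action, action)
            self.pair_counts[pair] += 1
            if self.pair_counts[pair] >= self.min_count and pair not in self.automated:
                if self.ask_user(pair):            # feedback from the user
                    self.automated.add(pair)
        self.last_action = action

    def ask_user(self, pair):
        print(f"Shall I always do {pair[1]!r} after {pair[0]!r}? (assuming yes)")
        return True

agent = InterfaceAgent()
for a in ["open_mail", "archive", "open_mail", "archive", "open_mail", "archive"]:
    agent.observe(a)
```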

In terms of the location of control, there is very little difference between the
agents that would result, and in this sense, both approaches are equally
decentralized. The rules that Symbolic AI would put in the agent's head,
most likely derived from interviews of users and programmers by a
Knowledge Engineer, are independent software objects. Indeed, in one of
the most widely used programming languages in this kind of approach
(called a "production system") the individual rules have even more of a
meshwork structure than many object-oriented systems, which still cling to a
hierarchy of objects. But in terms of the overall human-machine system, the
approach of Symbolic AI is much more hierarchical. In particular, by
assuming the existence of an ideal user, with homogeneous and unchanging
habits, and of a workplace where all users are similar, agents created by this
approach are not only less adaptive and more commanding; they
themselves promote homogeneity in their environment. The second class of
agents, on the other hand, are not only sensitive to heterogeneities, since
they adapt to individual users and change as the habits of these users
change; they also promote heterogeneity in the workplace by not subordinating
every user to the demands of an idealized model.

One drawback of the approach of Behavioral AI is that, given that the agent
has very little knowledge at the beginning of a relationship with a user, it will
be of little assistance for a while until it learns about his or her habits. Also,
since the agent can only learn about situations that have recurred in the
past, it will be of little help when the user encounters new problems. One
possible solution is to increase the amount of meshwork in the mix and
allow agents from different users to interact with each other in a
decentralized way. {10} Thus, when a new agent begins a relation with a
user, it can consult with other agents and speed up the learning process,
assuming, that is, that what other agents have learned is applicable to the
new user. This, of course, will depend on the existence of some
homogeneity of habits, but at least it does not assume a completely
homogeneous situation from the outset, an assumption which in turn
promotes further uniformization. Besides, endowing agents with a static
model of the users makes them unable to cope with novel situations. This is
also a problem in the Behavioral AI approach but here agents may aid one
another in coping with novelty. Knowledge gained in one part of the
workplace can be shared with the rest, and new knowledge may be
generated out of the interactions among agents. In effect, a dynamic model
of the workplace would be constantly generated and improved by the
collective of agents in a decentralized way, instead of each one being a
replica of each other operating on the basis of a static model centrally
created by a knowledge engineer.
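One crude way to picture the collaborative variant, with an invented voting rule and invented action pairs; a new agent with no history bootstraps itself by polling its peers and adopting what most of them agree on:

```python
# Sketch of collaborative bootstrapping between interface agents. The voting rule and
# the data are illustrative assumptions, not the design of the cited systems.

def consult_peers(peer_knowledge, min_votes=2):
    """Adopt any (action, follow-up) pair recommended by at least min_votes peers."""
    votes = {}
    for pairs in peer_knowledge:
        for pair in pairs:
            votes[pair] = votes.get(pair, 0) + 1
    return {pair for pair, count in votes.items() if count >= min_votes}

peers = [
    {("open_mail", "archive"), ("save", "backup")},
    {("open_mail", "archive")},
    {("open_mail", "archive"), ("compile", "run_tests")},
]

new_agent_knowledge = consult_peers(peers)
print(new_agent_knowledge)   # {('open_mail', 'archive')}
```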

I would like to conclude this brief analysis of the issues raised by agent-
based interfaces with some general remarks. First of all, from the previous
comments it should be clear that the degree of hierarchical and
homogenizing components in a given interface is a question which affects
more than just events taking place in the computer's screen. In particular,
the very structure of the workplace, and the relative status of humans and
machines, is what is at stake here. Western societies have undergone at
least two centuries of homogenization, of which the most visible element is
the assembly line and related mass-production techniques, in which the
overall thrust was to let machines discipline and control humans. In these
circumstances, the arrival of the personal computer was a welcome antidote
to the development of increasingly more centralized computer machinery,
such as systems of Numerical Control in factories. But this is hardly a
victory. After two hundred years of constant homogenization, working skills
have been homogenized via routinization and Taylorization, building
materials have been given constant properties, the gene pools of our
domestic species homogenized through cloning, and our languages made
uniform through standardization.

To make things worse, the solution to this is not simply to begin adding
meshwork components to the mix. Indeed, one must resist the temptation to
make hierarchies into villains and meshworks into heroes, not only because,
as I said, they are constantly turning into one another, but because in real
life we find only mixtures and hybrids, and the properties of these cannot be
established through theory alone but demand concrete experimentation.
Certain standardizations, say, of electric outlet designs or of data-structures
traveling through the Internet, may actually turn out to promote
heterogenization at another level, in terms of the appliances that may be
designed around the standard outlet, or of the services that a common data-
structure may make possible. On the other hand, the mere presence of
increased heterogeneity is no guarantee that a better state for society has
been achieved. After all, the territory occupied by former Yugoslavia is more
heterogeneous now than it was ten years ago, but the lack of uniformity at
one level simply hides an increase of homogeneity at the level of the warring
ethnic communities. But even if we managed to promote not only
heterogeneity, but diversity articulated into a meshwork, that still would not
be a perfect solution. After all, meshworks grow by drift and they may drift to
places where we do not want to go. The goal-directedness of hierarchies is
the kind of property that we may desire to keep at least for certain
institutions. Hence, demonizing centralization and glorifying decentralization
as the solution to all our problems would be wrong. An open and
experimental attitude towards the question of different hybrids and mixtures
is what the complexity of reality itself seems to call for. To paraphrase
Deleuze and Guattari, never believe that a meshwork will suffice to save us.
{11}

Footnotes:

{1} Abbot Payson Usher. The Textile Industry, 1750-1830. In Technology in Western Civilization. Vol. 1. Melvin Kranzberg and Carrol W. Pursell eds. (Oxford University Press, New York 1967). p. 243

{2} Andrew Hodges. Alan Turing: The Enigma. (Simon & Schuster, New York 1983). Ch. 2

{3} Herbert Simon. The Sciences of the Artificial. (MIT Press, 1994). p. 43

{4} Fernand Braudel. The Wheels of Commerce. (Harper and Row, New York, 1986). Ch. I

{5} Humberto R. Maturana and Francisco J. Varela. The Tree of Knowledge: The Biological Roots of Human Understanding. (Shambhala, Boston 1992). p. 47 and 115

{6} Gilles Deleuze and Felix Guattari. A Thousand Plateaus. (University of Minnesota Press, Minneapolis, 1987). p. 335

{7} M.S. Miller and K.E. Drexler. Markets and Computation: Agoric Open Systems. In The Ecology of Computation. Bernardo Huberman ed. (North-Holland, Amsterdam 1988).

{8} Pattie Maes. Behaviour-Based Artificial Intelligence. In From Animals to Animats. Vol. 2. Jean-Arcady Meyer, Herbert L. Roitblat and Stewart W. Wilson eds. (MIT Press, Cambridge Mass, 1993). p. 3

{9} Pattie Maes and Robyn Kozierok. Learning Interface Agents. In Proceedings of the AAAI '93 Conference. (AAAI Press, Seattle WA, 1993). p. 459-465

{10} Yezdi Lashkari, Max Metral and Pattie Maes. Collaborative Interface Agents. In Proceedings of the 12th National Conference on AI. (AAAI Press, Seattle WA, 1994). p. 444-449

{11} Deleuze and Guattari. op. cit. p. 500. (Their remark is framed in terms of "smooth spaces", but it may be argued that this is just another term for meshworks.)


VIRTUAL ENVIRONMENTS
AND THE EMERGENCE OF SYNTHETIC REASON

by MANUEL DE LANDA

At the end of World War II, Stanislaw Ulam and other scientists previously
involved in weapons research at Los Alamos discovered the huge potential
of computers to create artificial worlds, where simulated experiments could
be conducted and where new hypotheses could be framed and tested. The
physical sciences were the first ones to tap into this "epistemological
reservoir", thanks to the fact that much of their accumulated knowledge had
already been given a mathematical form. Among the less mathematized
disciplines, those already taking advantage of virtual environments are
psychology and biology (e.g. Artificial Intelligence and Artificial Life),
although other fields such as economics and linguistics could soon begin to
profit from the new research strategies made possible by computer
simulations.

Yet, before a given scientific discipline can begin to gain from the use of
virtual environments, more than just casting old assumptions into
mathematical form is necessary. In many cases the assumptions
themselves need to be modified. This is clear in the case of Artificial
Intelligence research, much of which is still caught up in older paradigms
of what a symbol-manipulating "mind" should be, and hence has not
benefited as much as it could from the simulation capabilities of computers.
Artificial Life, on the other hand, has the advantage that the evolutionary
biologist's conceptual base has been purged of classical notions of what
living creatures and evolution are supposed to be, and this has put this
discipline in an excellent position to profit from the new research tool
represented by these abstract spaces. Since this is a crucial point, let's take
a careful look at just what this "purging" has involved.

The first classical notion that had to be eliminated from biology was the
Aristotelian concept of an "ideal type", and this was achieved by the
development of what came to be known in the 1930's as "population
thinking". In the old tradition that dominated biological thought for over two
thousand years, a given population of animals was conceived as being the
more or less imperfect incarnation of an ideal essence. Thus, for example, in
the case of zebras, there would exist an ideal zebra, embodying all the
attributes which together make for "zebrahood"(being striped, having hoofs
etc. The existence of this essence would be obscured by the fact that in any
given population of zebras the ideal type would be subjected to a multiplicity
of accidents (of embryological development, for instance) yielding as end
result a variety of imperfect realizations. In short, in this view, only the ideal
essence is real, with the variations being but mere shadows.

When the ideas of Darwin on the role of natural selection and those of
Mendel on the dynamics of genetic inheritance were brought together six
decades ago, the domination of the Aristotelian paradigm came to an end. It
became clear, for instance, that there was no such thing as a preexistent
collection of traits defining "zebrahood".Each of the particular adaptive traits
which we observe in real zebras developed along different ancestral
lineages, accumulated in the population under the action of different
selection pressures, in a process that was completely dependent on specific
(and contingent) historical details. In other words, just as these traits
(camouflage, running speed and so on) happened to come together in
zebras, they may not have, had the actual history of those populations been
any different.

Moreover, the engine driving this process is the genetic variability of zebra
populations. Only if zebra genes replicate with enough variability can
selection pressures have raw materials to work with. Only if enough variant
traits arise spontaneously, can the sorting process of natural selection bring
together those features which today define what it is to be a zebra. In short,
for population thinkers, only the variation is real, and the ideal type (e.g. the
average zebra) is a mere shadow. Thus we have a complete inversion of the

http://www.t0.or.at/delanda/delanda.htm (2 z 25)2006-02-23 23:43:10


Zero News Datapool, MANUEL DE LANDA,THE EMERGENCE OF SYNTHETIC REASON

classical paradigm. 1

Further refinement of these notions has resulted in the more general idea
that the coupling of any kind of spontaneous variation to any kind of
selection pressure results in a sort of "searching device". This "device"
spontaneously explores a space of possibilities (i.e. possible combinations
of traits), and is capable of finding, over many generations, more or less
stable combinations of features, more or less stable solutions to problems
posed by the environment. This "device" has today been implemented in
populations that are not biological. This is the so called "genetic
algorithm" (developed by John Holland) in which a population of computer
programs is allowed to replicate in a variable form, and after each
generation a test is performed to select those programs that most closely
approximate the desired performance. It has been found that this method is
capable of zeroing in on the best solutions to a given programming task. In
essence, this method allows computer scientists to breed new solutions to
problems, instead of directly programming those solutions. 2
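A minimal genetic algorithm in the spirit of the one described above; the population size, mutation rate and the bit-string "programs" are illustrative assumptions rather than Holland's actual formulation:

```python
# Minimal genetic algorithm sketch: a population of candidate bit strings replicates
# with mutation, and each generation keeps the variants that best approximate a target.
# All parameters are illustrative assumptions.

import random

TARGET = [1, 0, 1, 1, 0, 0, 1, 0]          # stands in for "desired performance"

def fitness(candidate):
    return sum(1 for a, b in zip(candidate, TARGET) if a == b)

def mutate(candidate, rate=0.1):
    return [1 - bit if random.random() < rate else bit for bit in candidate]

population = [[random.randint(0, 1) for _ in TARGET] for _ in range(20)]

for generation in range(50):
    # selection: keep the better half, then refill with mutated copies (variation)
    population.sort(key=fitness, reverse=True)
    survivors = population[:10]
    population = survivors + [mutate(random.choice(survivors)) for _ in range(10)]
    if fitness(population[0]) == len(TARGET):
        print(f"solution bred in generation {generation}")
        break
```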

The difference between the genetic algorithm and the more ambitious goals
of Artificial Life is the same as that between the action of human breeding
techniques on domesticated plants and animals, and the spontaneous
evolution of the ancestors of those plants and animals. Whereas in the first
case the animal or plant breeder determines the criterion of fitness, in the
second one there is no outside agency determining what counts as fit. In a
way what is fit is simply that which survives, and this has led to the criticism
that Darwinism's central formula (i.e. "survival of the fittest") is a mere
tautology ("survival of the survivor"). Partly to avoid this criticism this formula
is today being replaced by another one: survival of the stable. 3

The central idea, the notion of an "evolutionary stable strategy "was


formulated with respect to behavioural strategies (such as those involved in
territorial or courtship behaviour in animals) but it can be extended to apply
to the "engineering strategies" involved in putting together camouflage,
locomotive speed and the other traits which come together to form the
zebras of the example above. The essence of this approach is that the
"searching device" constituted by variation and selection can find the optimal
solution to a given problem posed by the environment, and that once the
optimal solution has been found, any mutant strategy arising in the
population is bound to be defeated. The strategy will be, in this sense, stable
against invasion. To put it in visual terms, it is as if the space of possibilities
explored by the "searching device" included mountains and valleys, with the
mountain peaks representing points of optimal performance. Selection
pressures allow the gene pool of a reproductive population to slowly climb
those peaks, and once a peak has been reached, natural selection keeps
the population there.

One may wonder just what has been achieved by switching from the
concept of a "fittest mutant" to that of an "optimal" one, except perhaps, that
the latter can be defined contextually as "optimal given existing constraints".
However, the very idea that selection pressures are strong enough to pin
populations down to "adaptive peaks" has itself come under intense
criticism. One line of argument says that any given population is subjected
to many different pressures, some of them favoring different optimal results.
For example, the beautiful feathers of a peacock are thought to arise due to
the selection pressure exerted by "choosy" females, who will only mate with
those males exhibiting the most attractive plumage. Yet, those same vivid
colors which seduce the females also attract predators. Hence, the male
peacock's feathers will come under conflicting selection pressures. In these
circumstances, it is highly improbable that the peacock's solution will be
optimal and much more likely that it will represent a compromise. Several
such sub-optimal compromises may be possible, and thus the idea that the
solution arrived at by the "searching device" is unique needs to be
abandoned. 4 But if unique and optimal solutions are not the source of
stability in biology, then what is?
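A small numeric illustration of conflicting selection pressures, with functions invented purely for the sake of the example: plumage brightness is rewarded by mate choice but punished by predation, so the value that survives is a compromise, not an optimum for either pressure alone:

```python
# Illustrative only: two conflicting "pressures" on a single trait yield a compromise.

def mating_success(brightness):        # choosy females favour brighter plumage
    return brightness

def predation_risk(brightness):        # predators also find bright males more easily
    return 0.5 * brightness ** 2

def net_fitness(brightness):
    return mating_success(brightness) - predation_risk(brightness)

candidates = [i / 10 for i in range(21)]             # brightness from 0.0 to 2.0
best = max(candidates, key=net_fitness)
print(best)   # 1.0 -- neither maximally bright nor maximally drab
```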

The answer to this question represents the second key idea around which
the field of Artificial Life revolves. It is also crucial to understand the potential
application of virtual environments to fields such as economics. The old
conceptions of stability (in terms of either optimality or principles of least
effort) derive from nineteenth century equilibrium thermodynamics. It is well
known that philosophers like Auguste Comte and Herbert Spencer (author of
the formula "survival of the fittest") introduced thermodynamic concepts into
social science. However, some contemporary observers complain that what
was so introduced (in economics, for example) represents "more heat than

http://www.t0.or.at/delanda/delanda.htm (4 z 25)2006-02-23 23:43:10


Zero News Datapool, MANUEL DE LANDA,THE EMERGENCE OF SYNTHETIC REASON

light". 5

In other words, equilibrium thermodynamics, dealing as it does with systems


that are closed to their environment, postulates that stability can only be
reached when all useful energy has been transformed into heat. At this
point, a static and unique state of equilibrium is reached (heat death). It was
this concept of a static equilibrium that late nineteenth century economists
used to systematize the classical notion of an "invisible hand" according to
which the forces of demand and supply tend to balance each other out at a
point which is optimal from the point of view of society's utilization of
resources. It was partly John Von Neumann's work on Game Theory and
economics that helped entrench this notion of stability outside of physics,
and from there it found its way into evolutionary biology, through the work of
John Maynard Smith. 6

This static conception of stability was the second classical idea that needed
to be eliminated before the full potential of virtual environments could be
unleashed. Like population thinking, the fields that provided the needed new
insights (the disciplines of far-from-equilibrium thermodynamics and
nonlinear mathematics) are also a relatively recent development, associated
with the name of Ilya Prigogine, among others. Unlike the
"conservative"systems dealt with by the old science of heat, systems which
are totally isolated from their surroundings, the new science deals with
systems that are subjected to a constant flow of matter and energy from the
outside. Because this flow must also exit the system in question, that is, the
waste products need to be dissipated, this systems are called "dissipative". 7

For our purposes here what matters is that once a continuous flow of matter-
energy is included in the model, a wider range of forms of dynamic
equilibrium becomes possible. The old static stability is still one possibility,
except that now these equilibrium points are neither unique nor optimal (and
yet they are more robust than the old equilibria). Non-static equilibria also
exist, in the form of cycles, for instance. Perhaps the most novel type of
stability is that represented by "deterministic chaos", in which a given
population can be pinned down to a stable, yet inherently variable,
dynamical state. These new forms of stability have received the name of
"attractors", and the transitions which transform one type of attractor into

http://www.t0.or.at/delanda/delanda.htm (5 z 25)2006-02-23 23:43:10


Zero News Datapool, MANUEL DE LANDA,THE EMERGENCE OF SYNTHETIC REASON

another have been named "bifurcations". Let's refer to the cluster of


concepts making up this new paradigm of stability as "nonlinear dynamics". 8
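A standard toy system, the logistic map, displays the kinds of stability listed here: a fixed-point attractor, a cyclic attractor, and deterministic chaos, with bifurcations between them as a parameter is varied. It is used below only as an illustration, not as part of the argument:

```python
# Logistic map: x -> r*x*(1-x). Different values of r put the "population" on different
# attractors; the parameter values below are standard illustrative choices.

def trajectory(r, x0=0.2, warmup=500, keep=8):
    x = x0
    for _ in range(warmup):          # let the population settle onto its attractor
        x = r * x * (1 - x)
    values = []
    for _ in range(keep):
        x = r * x * (1 - x)
        values.append(round(x, 4))
    return values

print(trajectory(2.8))   # fixed point: the same value repeats (static equilibrium)
print(trajectory(3.2))   # cycle: the values alternate between two levels
print(trajectory(3.9))   # deterministic chaos: bounded but never-repeating variation
```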

One of the most striking consequences of nonlinear dynamics is that any


population (of atoms, molecules, cells, animals, humans) which is stabilized
via attractors, will exhibit "emergent properties", that is, properties of the
population as a whole not displayed by its individual members in isolation.
The notion of an emergent or synergistic property is a rather old one, but for
a long time it was not taken very seriously by scientists, as it was associated
with quasi-mystical schools of thought such as "vitalism". Today, emergent
properties are perfectly legitimate dynamical outcomes for populations
stabilized by attractors. A population of molecules in certain chemical
reactions, for instance, can suddenly and spontaneously begin to pulsate in
perfect synchrony, constituting a veritable "chemical clock". A population of
insects (termites, for instance) can spontaneously become a "nest-building
machine", when their activities are stabilized nonlinearly.

Thus, the "searching-device" constituted by variation coupled to selection,


does not explore an unstructured space of possibilities, but a space "pre-
organized" by attractors and bifurcations. In a way, evolutionary processes
simply follow these changing distributions of attractors, slowly climbing from
one dynamically stable state to another. For example, since in this space
one possible outcome is a chemical clock, the searching-device could have
stumbled upon this possibility which in essence constitutes a primitive form
of a metabolism. The same point applies to other evolutionary stable
strategies, such as the nest-building strategy of the termites.

After this rather long introduction, we are finally in a position to understand


enterprises such as Artificial Life. The basic point is that emergent properties
do not lend themselves to an analytical approach, that is, an approach which
dissects a population into its components. Once we perform this dissection,
once the individuals become isolated from each other, any properties due to
their interactions will disappear. What virtual environments provide is a tool
to replace (or rather, complement) analysis with synthesis, allowing
researchers to exploit the complementary insights of population thinking and
nonlinear dynamics. In the words of Artificial Life pioneer, Chris Langton:


"Biology has traditionally started at the top, viewing a living organism as a


complex biochemical machine, and worked analytically downwards from
there - through organs, tissues, cells, organelles, membranes, and finally
molecules - in its pursuit of the mechanisms of life. Artificial Life starts at the
bottom, viewing an organism as a large population of simple machines, and
works upwards synthetically from there, constructing large aggregates of
simple, rule-governed objects which interact with one another nonlinearly in
the support of life-like, global dynamics. The 'key' concept in Artificial Life is
emergent behavior. Natural life emerges out of the organized interactions of
a great number of nonliving molecules, with no global controller responsible
for the behavior of every part. It is this bottom-up, distributed, local
determination of behavior that Artificial Life employs in its primary
methodological approach to the generation of life-like behaviors ". 9

The typical Artificial Life experiment involves first the design of a simplified
version of an individual animal, which must possess the equivalent of a set
of genetic instructions used both to create its offspring as well as to be
transmitted to that offspring. This transmission must also be "imperfect"
enough, so that variation can be generated. Then, whole populations of
these "virtual animals" are unleashed, and their evolution under a variety of
selection pressures observed. The exercise will be considered successful if
novel properties, unthought of by the designer, spontaneously emerge from
this process.
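
A bare-bones version of such an experiment can be sketched as follows; the bit-string genome, the fitness function and the parameters are illustrative assumptions rather than any published Artificial Life system, but the ingredients are the ones just described: imperfect copying of genetic instructions plus a selection pressure acting on the resulting variation:

# A minimal sketch of the variation-plus-selection "searching device";
# genome encoding, fitness function and parameters are illustrative
# assumptions, not a published Artificial Life model.
import random

random.seed(0)
GENOME_LEN, POP_SIZE, MUTATION_RATE = 20, 60, 0.02

def fitness(genome):
    # The selection pressure: creatures are rewarded for alternating genes,
    # a trait no individual is designed with but which the population finds.
    return sum(1 for a, b in zip(genome, genome[1:]) if a != b)

def reproduce(parent):
    # Imperfect transmission of the genetic instructions (mutation).
    return [1 - g if random.random() < MUTATION_RATE else g for g in parent]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]
for generation in range(101):
    if generation % 25 == 0:
        print("generation", generation, "best fitness:",
              max(fitness(g) for g in population), "of", GENOME_LEN - 1)
    # Fitter creatures leave more offspring (a simple tournament selection).
    population = [reproduce(max(random.sample(population, 3), key=fitness))
                  for _ in range(POP_SIZE)]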

Depending on the point of view of the designer, these emergent properties


may or may not need to match those observed in reality. That is, a current theme in
this field is that one does not have to be exclusively concerned with
biological evolution as it has occurred on planet Earth, since this may have
been limited by the contingencies of biological history, and that there is
much to be learned from evolutionary paths that were not tried out on this
planet. In any event, the goal of the simulation is simply to help "synthesize
intuitions" in the designer, insights that can then be used to create more
realistic simulations. The key point is that the whole process must be bottom-
up, only the local properties of the virtual creatures need to be predesigned,
never the global, population-wide ones.

Unlike Artificial Life, the approach of Artificial Intelligence researchers
remained (at least until the 1980's) largely top-down and analytical. Instead
of treating the symbolic properties they study as the emergent outcome of a
dynamical process, these researchers explicitly put symbols (labels, rules,
recipes) and symbol-manipulating skills into the computer. When it was
realized that logic alone was not enough to manipulate these symbols in a
significantly "intelligent" way, they began to extract the rules of thumb, tricks
of the trade and other non-formal heuristic knowledge from human experts,
and put these into the machine, again as fully formed symbolic structures.
In other words, in this approach one begins at the top, the global behavior of
human brains, instead of at the bottom, the local behavior of neurons. Some
successes have been scored by this approach, notably in simulating skills
such as those involved in playing chess or proving theorems, both of which
are evolutionarily rather late developments. Yet the symbolic paradigm of
Artificial Intelligence has failed to capture the dynamics of evolutionarily
more elementary skills such as face-recognition or sensory-motor control. 10

Although a few attempts had been made during the 1960's to take a more
bottom-up approach to modeling intelligence (e.g. the perceptron), the
defenders of the symbolic paradigm practically killed their rivals in the battle
for government research funds. And so the analytical approach dominated
the scene until the 1980's when there occurred a spectacular rebirth of a
synthetic design philosophy. This is the new school of Artificial Intelligence
known as "connectionism". Here, instead of one large, powerful computer
serving as a repository for explicit symbols, we find a large number of small,
rather simple computing devices (in which all that matters is their state of
activation), interacting with one another either to excite or inhibit each
other's degree of activation. These simple processors are then linked
together through a pattern of interconnections which can vary in strength.

No explicit symbol is ever programmed into the machine since all the
information needed to perform a given cognitive task is coded in the
interconnection patterns as well as the relative strengths of these
interconnections. All computing activity is carried out by the dynamical
activity of the simple processors as they interact with one another (i.e. as
excitations and inhibitions propagate through the network), and the
processors arrive at the solution to a problem by settling into a dynamical
state of equilibrium. (So far point attractors are most commonly used,
although some designs using cyclic attractors are beginning to appear.) 11

If there is ever such a thing as a "symbol" here, or rather symbol-using (rule-


following) behavior, it is as an emergent result of these dynamics. This fact
is sometimes expressed by saying that a connectionist device (also called a
"neuralnet") is not programmed by humans, but trained by them, much as a
living creature would. In the simplest kind of networks the only cognitive task
that can be performed is pattern association. The human trainer presents to
the network both patterns to be associated, and after repeated
presentations, the network "learns" to associate them by modifying the
strength of the interconnections. At that point the network can respond with
the second pattern whenever the first one is presented to it.
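
A minimal sketch of such a pattern associator is given below; the Hebbian learning rule used here is one standard choice (the text does not commit to a particular rule), and the +1/-1 pattern pairs are arbitrary examples:

# A minimal pattern associator trained by modifying connection strengths.
def train(pairs, size):
    # Strengthen each connection in proportion to the co-activity of the
    # units it links across the two patterns being associated (Hebbian rule).
    weights = [[0.0] * size for _ in range(size)]
    for first, second in pairs:
        for i in range(size):
            for j in range(size):
                weights[i][j] += second[i] * first[j]
    return weights

def recall(weights, pattern):
    # Present the first pattern; each unit settles to +1 or -1.
    return [1 if sum(w * x for w, x in zip(row, pattern)) >= 0 else -1
            for row in weights]

pairs = [([1, -1, 1, -1], [1, 1, -1, -1]),
         ([-1, -1, 1, 1], [-1, 1, -1, 1])]
weights = train(pairs, size=4)
for first, second in pairs:
    print(recall(weights, first) == second)   # True: the association was learned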

At the other end of the spectrum of complexity, multilayered networks exhibit


emergent cognitive behavior as they are trained. While in the simple case of
pattern association much of the thinking is done by the trainer, complex
networks (i.e. those using "hidden units") perform their own extraction of
regularities from the input pattern, concentrating on microfeatures of the
input which many times are not at all obvious to the human trainer. (In other
words, the network itself "decides" what traits of the pattern it considers as
salient or relevant.)

These networks also have the ability to generalize from the patterns they
have learned, and so will be able to recognize a new pattern that is only
vaguely related to one they have been previously exposed to. In other
words, the ability to perform simple inductive inferences emerges in the
network without the need to explicitly code into it the rules of a logical
calculus. These designs are also resilient against damage unlike their
symbolic counterparts which are inherently brittle. But perhaps the main
advantage of the bottom-up approach is that its devices can exhibit a degree
of "intentionality".

The term "intentionality" is the technical term used by philosophers to


describe the relation between a believer and the states of affairs his beliefs
are about. That is, an important feature of the mental states of human
beings and other animals (their beliefs and desires) is that they are about
phenomena that lie outside their minds. The top-down, symbolic approach to
Artificial Intelligence sacrifices this connection by limiting its modeling efforts


to relations between symbols. In other words, in the analytical approach only
the syntactic or formal relations between symbols matter (with the exception
of an "internal semantics" involving reference to memory addresses and the
like). Hence, these designs must later try to reconnect the cognitive device
to the world where it must function, and it is here that the main bottleneck
lies (unless the "world" in question is a severely restricted domain of the real
world, such as the domain of chess). Not so in the synthetic approach:

"The connectionist approach to modeling cognition thus offers a promise in


explaining the aboutness or intentionality of mental states. Representational
states, especially those of hidden units, constitute the system's own learned
response to inputs. Since they constitute the system's adaptation to the
input, there is a clear respect in which they would be about objects or events
in the environment if the system were connected, via sensory-motor organs,
to that environment. The fact that these representations are also sensitive to
context, both external and internal to the system, enhances the plausibility of
this claim that the representations are representations of particular states. "
12

So far, the abstract living creatures inhabiting the virtual environments of


Artificial Life have been restricted to rather inflexible kinds of behavior. One
may say that the only kinds of behavior that have been modelled are of the
genetically "hard-wired" type, as displayed by ants or termites. Yet adding
connectionist intelligence to these creatures could endow them with enough
intentionality to allow researchers to model more flexible, "multiple-choice"
behavior, as displayed by mammals and birds. We could then expect more
complex behavioral patterns (such as territorial or courtship behavior) to
emerge in these virtual worlds. Artificial Intelligence could also benefit from
such a partnership, by tapping the potential of the evolutionary "searching-
device" in the exploration of the space of possible network designs. The
genetic algorithm, which exploits this possibility, has so far been restricted to
searching for better symbolic designs (e.g. production rules).

Furthermore, having a virtual space where groups of intentional creatures


interact can also benefit other disciplines such as economics or political
science. A good example of this is Robert Axelrod's use of a virtual
environment to study the evolution of cooperation. His work also exemplifies


the complementary use of synthesis (to generate intuitions) and analysis (to
formally ground those intuitions). In the words of Douglas Hofstadter:

"Can totally selfish and unconscious organisms living in a common


environment come to evolve reliable cooperative strategies? Can
cooperation evolve in a world of pure egoists? Well, as it happens, it has
now been demonstrated rigorously and definitively that such cooperation
can emerge, and it was done through a computer tournament conducted by
political scientist Robert Axelrod. More accurately, Axelrod first studied the
ways that cooperation evolved by means of a computer tournament, and
when general trends emerged, he was able to spot the underlying principles
and prove theorems that established the facts and conditions of
cooperation's rise from nowhere." 13

The creatures that Axelrod placed in a virtual environment to conduct this


round-robin tournament were not full-fledged intentional entities of the type
envisioned above. Rather, the motivations and options of the creatures were
narrowly circumscribed by using the formalism of Game Theory, which
studies the dynamics of situations involving conflict of interest. In particular,
Axelrod's entities were computer programs, each written by a different
programmer, playing a version of the game called " Prisoner's Dilemma". In
this imaginary situation, two accomplices in a crime are captured by the
police and separately offered the following deal: If one accuses his
accomplice, while the other does not, the "betrayer" walks out free, while the
"sucker" gets the stiffest sentence. If, on the other hand, both claim
innocence and avoid betrayal they both get a small sentence. Finally, if both
betray each other, they both get a long sentence. The dilemma here arises
from the fact that even though the best overall outcome is not to betray
one's partner, neither one can trust that his accomplice won't try to get the
best individual outcome (to walk out free) leaving the other with the "sucker
payoff". And because both prisoners reason in a similar way, they both
choose betrayal and the long sentence that comes with it, instead of loyalty
and its short sentence.
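
The logic of the dilemma can be checked mechanically. In the sketch below the payoff numbers are illustrative assumptions (only their ordering matters) and count years of freedom preserved, so that larger is better:

# The dilemma in miniature, with assumed payoff values.
PAYOFF = {                                   # (my move, partner's move) -> my payoff
    ("betray", "stay loyal"): 5,             # temptation: I walk out free
    ("stay loyal", "stay loyal"): 3,         # both claim innocence: short sentences
    ("betray", "betray"): 1,                 # mutual betrayal: long sentences
    ("stay loyal", "betray"): 0,             # the "sucker payoff": stiffest sentence
}

for partner in ("stay loyal", "betray"):
    best = max(("betray", "stay loyal"), key=lambda me: PAYOFF[(me, partner)])
    print("if my partner chooses to", partner + ", my best single move is to", best)
# Whichever move the partner makes, betrayal pays better in a single round,
# even though mutual loyalty (3 each) beats mutual betrayal (1 each).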

In the real world we find realizations of this dilemma in, for example, the
phenomenon known as "bank runs". When news that a bank is in trouble
first comes out, each individual depositor has two options: either to rush to
the bank and withdraw his savings or to stay home and allow the bank to
recover. Each individual also knows that the best outcome for the
community is for all to leave their savings in the bank and so allow it to
survive. But no one can afford to be the one who loses his savings, so all
rush to withdraw their money ruining the institution in the process.
Hofstadter offers a host of other examples, including one in which the choice
to betray or cooperate is faced by the participants not once, but repeatedly.
For instance, imagine two "jungle traders" with a rather primitive system of
trade: each simply leaves a bag of goods at a predefined place, and comes
back later to pick another bag, without ever seeing the trading partner. The
idea is that on every transaction, one is faced with a dilemma, since one can
profit the most by leaving an empty bag and sticking the other with the
"sucker payoff". Yet, the difference is that doing this endangers the trading
situation and hence there is more to lose in case of betrayal here. (This is
called the "iterated Prisoner's Dilemma").

Axelrod's creatures played such an iterated version of the game with one
another. What matters to us here is that after several decades of applying
analytical techniques to study these situations, the idea that "good guys
finish last" (i.e. that the most rational strategy is to betray one's partner) had
become entrenched in academic (and think tank) circles. For example, when
Axelrod first requested entries for his virtual tournament most of the
programs he received were "betrayers". Yet, the winner was not. It was
"nice" (it always cooperated in the first encounter so as to give a sign of
good faith and begin the trading situation), "retaliatory" (if betrayed it would
respond with betrayal in the next encounter) yet "forgiving" (after retaliating it
was willing to reestablish a partnership). As mentioned above, these were
not truly intentional creatures so the properties of being "nice, retaliatory and
forgiving"were like emergent properties of a much simpler design. Its name
was "TIT-FOR-TAT" and its actual strategy was simply to always cooperate
in the first move and there after do what the other player did in the previous
move. This program won because the criterion of success was not how
many partners one beats, but how much overall trade one achieves.
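
A toy reconstruction of such a tournament, with standard illustrative payoffs and only three home-made strategies rather than Axelrod's actual entries, already shows the point: the pure betrayer wins every individual match by a small margin yet ends up with the least overall trade:

# A toy round-robin (not Axelrod's actual tournament); payoffs are the
# usual illustrative values, with "C" for cooperate and "D" for betray.
PAYOFF = {("C", "C"): 3, ("C", "D"): 0, ("D", "C"): 5, ("D", "D"): 1}

def tit_for_tat(opponent_moves):      # cooperate first, then mirror the partner
    return "C" if not opponent_moves else opponent_moves[-1]

def grim_trigger(opponent_moves):     # cooperate until betrayed once, then never again
    return "D" if "D" in opponent_moves else "C"

def always_betray(opponent_moves):
    return "D"

def match(a, b, rounds=200):
    moves_a, moves_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        ma, mb = a(moves_b), b(moves_a)     # each player sees the other's past moves
        score_a += PAYOFF[(ma, mb)]
        score_b += PAYOFF[(mb, ma)]
        moves_a.append(ma)
        moves_b.append(mb)
    return score_a, score_b

players = [tit_for_tat, grim_trigger, always_betray]
totals = {p.__name__: 0 for p in players}
for i, a in enumerate(players):
    for b in players[i + 1:]:
        sa, sb = match(a, b)
        totals[a.__name__] += sa
        totals[b.__name__] += sb
print(totals)
# always_betray "wins" each of its matches by a few points, yet finishes
# with roughly half the total trade of the nice-but-retaliatory strategies.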

Because the idea that "good guys finish last" had become entrenched,
further analysis of the situation (which could have uncovered the fact that
this principle does not apply to the "iterated" version of the game), was
blocked. What was needed was to unblock this path by using a virtual
environment to "synthesize" a fresh intuition. And in a sense, that is just what
Axelrod did. He then went further and used more elaborate simulations
(including one in which the creatures replicated, with the number of progeny
being related to the trading success of the parent), to generate further
intuitions as to how cooperative strategies could evolve in an ecological
environment, how robust and stable these strategies were, and a host of
other questions. Evolutionary biologists, armed with these fresh insights,
have now discovered that apes in their natural habitats play a version of TIT-
FOR-TAT. 14

Thus, while some of the uses of virtual environments presuppose that old
and entrenched ideas (about essences or optimality) have been
superseded, these abstract worlds can also be used to synthesize the
intuitions needed to dislodge other ideas blocking the way to a better
understanding of the dynamics of reality.

Population thinking seems to have banished "essences" from the world of


philosophy once and for all. Nonlinear dynamics, and more specifically, the
notion of an " emergent property" would seem to signal the death of the
philosophical position known as "reductionism" (basically, that all
phenomena can in principle be reduced to those of physics). It is clear now
that at every level of complexity, there will be emergent properties that are
irreducible to the lower levels, simply because when one switches to an
examination of lower level entities, the properties which emerge due to their
interactions disappear. Connectionism, in turn, offers a completely new
understanding of the way in which rule-following behavior can emerge from
a system in which there are no explicit rules or symbols whatsoever. This
would seem destined to end the domination of a conception of language
based on syntactical entities and their formal relations (Saussure's signifiers
or Chomsky's rules). This conception (let's call it "formalism") has entirely
dominated this century, leading in some cases to extreme forms of linguistic
relativism, that is, the idea that every culture partitions the world of
experience in a different way simply because they use different linguistic
devices to organize this experience. If connectionism is correct, then
humanity does indeed have a large portion of shared experience (the basic
intentional machinery linking them to the world) even if some of this
experience can be cast in different linguistic form. 15

Furthermore, once linguists become population thinkers and users of virtual


environments, we could witness the emergence of an entirely different type
of science of language. For instance, about a millennium ago, the population
of Anglo-Saxon peasants inhabiting England, suffered the imposition of
French as the official language of their land by the Norman invaders. In
about two hundred years, and in order to resist this form of linguistic
colonialism, this peasant population transformed what was basically a soup
of Germanic dialects (with added Scandinavian spices) into something that
we could recognize as English. No doubt, in order to arrive at modern
English another few centuries of transformation would be needed, but the
backbone of this language had already emerged from the spontaneous labor
of a population under the pressure of an invading language. 16 Perhaps one
day linguists will be required to test their theories in a virtual environment of
interacting intentional entities, so that the rules of grammar they postulate
for a language can be shown to emerge spontaneously from the dynamics
of a population of speakers (instead of existing in a "synchronic" world,
isolated from the actual interactions of several generations of speakers).

Virtual environments may not only allow us to capture the fluid and changing
nature of real languages, they could also be used to gain insights into the
processes that tend to "freeze" languages, such as the processes of
standardization which many European languages underwent beginning in
the seventeenth century. Unlike the cases of Spanish, Italian and French,
where the fixing of the rules and vocabulary of the language were enforced
by an institution (e.g. an Academy), in England the process of
standardization was carried out via the mass publication of authoritative
dictionaries, grammars and orthographies. Just how these "linguistic
engineering" devices achieved the relative freezing of what was formerly a
fluid "linguistic matter", may be revealed through a computer simulation.
Similarly, whenever a language becomes standardized we witness the
political conquest of many "minority" dialects by the dialect of the urban
capital (London's dialect in the case of English). Virtual environments could
allow us to model dynamically the spread of the dominant dialect across
cultural and geographical barriers, and how technologies such as the
railroad or the radio (e.g. the BBC) allowed it to surmount such barriers. 17

Future linguists may one day look back with curiosity at our twentieth
century linguistics, and wonder if our fascination with a static (synchronic)
view of language could not be due to the fact that the languages where
these views were first formulated (French and English), had lost their fluid
nature by being artificially frozen a few centuries earlier. These future
investigators may also wonder how we thought the stability of linguistic
structures could be explained without the concept of an attractor. How, for
instance, could the prevalence of certain patterns of sentence structure (e.g.
subject-verb-object, or "SVO") be explained, or how could bifurcations from
one pattern to another be modeled without some form of nonlinear
stabilization (e.g. English, which may have switched over a millennium from
SOV to SVO)? 18 Tomorrow's linguists will also realize that, because these
dynamic processes depend on the existence of heterogeneities and other
nonlinearities, the reason we could not capture them in our models was due
to the entrenchment of the Chomskian idea of a homogeneous speech
community of monolinguals, in which each speaker has equal mastery of the
language.

Real linguistic communities are not homogeneous in the distribution of


linguistic competence, and they are not closed to linguistic flows from the
outside (English, for instance, was subjected to large flows of French
vocabulary at several points in its evolution). Many communities are in fact
bilingual or multilingual, and constructive as well as destructive interference
between languages create nonlinearities which may be crucial to the overall
dynamics. As an example of this we may take the case of creole languages.
They all have evolved from the pidgins created in slave plantations, veritable
"linguistic laboratories" where the language of the plantation master was
stripped of its flourishes and combined with particles proceeding from a
variety of slave dialects. It is possible that one day virtual environments will
allow us to map the dynamical attractors around which these rapidly
developing Creole languages stabilized. 19

The discipline of sociolinguistics (associated with the work of linguists like


William Labov) has made many of the important contributions needed to
purge the science of language from the classical assumptions leading to
"formalism", and move it closer to true population thinking. Indeed, the


central concern of sociolinguistics has been the study of stylistic variation in
speech communities. This is a mechanism for generating diversity at the
level of speakers, and as such it could be dismissed as being exogenous to
language. Labov, however, has also discovered that some of the rules of
language (he calls them "variable rules") can generate systematic,
endogenous variation. 20 This provides us with one of the elements needed
for our evolutionary "searching device".

Sociolinguists have also tackled the study of the second element: selection
pressures. The latter can take a variety of forms. In small communities,
where language style serves as a badge of identity, peer pressure in social
networks can act as a filtering device, promoting the accumulation of those
forms and structures that maintain the integrity of the local dialect. On the
other hand, stigmatization of certain forms by the speakers of the standard
language (particularly when reinforced by a system of compulsory
education) can furnish selection pressures leading to the elimination of local
styles. Despite these efforts, formalism is still well entrenched in linguistics
and so this discipline cannot currently benefit from the full potential of virtual
environments. (Which does not mean, of course, that computers are not
used in linguistic investigations, but this use remains analytical and top-
down instead of synthetic and bottom-up.)

Just as linguistics inherited the homogeneous, closed space of classical


thermodynamics, as well as its static conception of stability, so did
mathematical economics. Here too, a population of producers and
consumers is assumed to be homogeneous in its distribution of rationality
and of market power. That is, all agents are endowed with perfect foresight
and unlimited computational skill, and no agent is supposed to exercise any
kind of influence over prices. Perfect rationality and perfect competition
result in a kind of society-wide computer, where prices transmit information
(as well as incentive to buy or sell), and where demand instantly adjusts to
supply to achieve an optimal equilibrium. And much as sociolinguists are
providing antidotes for the classical assumptions holding back their field,
students of organizations and of organizational ecology are doing the same
for the study of the economy. 21


Not only are economic agents now viewed as severely limited in their
computational skills, but this bounded rationality is being located in the
context of the specific organizations where it operates and where it is further
constrained by the daily routines that make up an "organizational memory".
In other words, not only is decision-making within organizations performed
on the basis of adaptive beliefs and action rules (rather than optimizing
rationality), but much of it is guided by routine procedures for producing
objects, for hiring/firing employees, for investing in research and
development and so on. Because these procedures are imperfectly copied
whenever a firm opens up a new plant, this process gives us the equivalent
of variable reproduction. 22 A changing climate for investment, following the
ups and downs of boom years and recessions, provides some of the
selection pressures that operate on populations of organizations. Other
pressures come from other organizations, as in natural ecosystems, where
other species (predators, parasites) are also agents of natural selection.
Here giant corporations, which have control over their prices (and hence are
not subjected to supply/demand pressures) play the role of predators,
dividing their markets along well defined territories (market shares).

As in linguistic research, computer simulation techniques have been used in


economics (e.g. Econometrics) but in many cases the approach has
remained analytic (i.e. top-down, taking as its point of departure macro-
economic principles). On the other hand, and unlike the situation in
linguistics, a bottom-up approach, combining populations of organizations
and nonlinear dynamics, is already making rapid progress. A notable
example of this is the System Dynamics National Model at M.I.T. As in the
case of Artificial Life, one measure of success here is the ability of these
models to synthesize emergent behavior not planned in advance by the
model's designers. One dramatic example is the spontaneous emergence of
cyclic equilibria in this model with a period matching that of the famous
Kondratieff cycle.

That data from several economic indicators (G.N.P., unemployment rate,


aggregate prices, interest rates), beginning in the early nineteenth century,
display an unequivocal periodic motion of approximately 50 years duration,
is well known at least since the work of Joseph Schumpeter. Several
possible mechanisms to explain this cyclic behavior have been offered since
then, but none has gained complete acceptance. What matters to us here is
that the M.I.T. model endogenously generates this periodic oscillation, and
that this behavior emerged spontaneously from the interaction of
populations of organizations, to the surprise of the designers, who were in
fact unaware of the literature on Kondratieff cycles. 23

The key ingredient which allows this and other models to generate
spontaneous oscillations, is that they must operate far-from-equilibrium. In
traditional economic models, the only dynamical processes that are included
are those that keep the system near equilibrium (such as "diminishing
returns" acting as negative feedback).The effects of explosive positive
feedback processes (such as"economies of scale") are typically minimized.
But it is such self-reinforcing processes that drive systems away from
equilibrium, and this together with the nonlinearities generated by imperfect
competition and bounded rationality, is what generates the possibility of
dynamical stabilization. 24

In the M.I.T. model, it is precisely a positive feedback loop that pushes the
system towards a bifurcation, where a point attractor suddenly becomes a
cyclic one. Specifically, the sector of the economy which creates the
productive machinery used by the rest of the firms (the capital goods
sector), is prone to the effects of positive feedback because whenever the
demand for machines grows, this sector must order from itself. In other
words, when any one firm in this sector needs to expand its capacity to meet
growing demand, the machines used to create machines come from other
firms in the same sector. Delays and other nonlinearities can then be
amplified by this feedback loop, giving rise to stable yet periodic behavior. 25
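
A deliberately generic illustration of this mechanism (not the M.I.T. model itself) is the delayed logistic map, in which growth responds to the state of the system one step in the past, much as capital-goods orders respond with a delay; below a critical value of the parameter r the system settles onto a static equilibrium, while above it the delay turns the point attractor into sustained oscillation:

# A generic sketch (assumed example): the delayed logistic map
# x[t+1] = r * x[t] * (1 - x[t-1]), where growth reacts to the past state.
def long_run_range(r, steps=3000):
    x = [0.3, 0.3]
    for t in range(1, steps):
        x.append(r * x[t] * (1 - x[t - 1]))
    late = x[-200:]                    # behavior left after transients die out
    return min(late), max(late)

for r in (1.8, 2.1):
    lo, hi = long_run_range(r)
    print("r = %.1f: long-run values range from %.3f to %.3f" % (r, lo, hi))
# At r = 1.8 the range collapses to a single value (a static equilibrium);
# at r = 2.1 the delay has turned that point attractor into sustained
# oscillation within a band (the bifurcation to a cyclic attractor).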

As we have seen, tapping the potential of the "epistemological reservoir"


constituted by virtual environments requires that many old philosophical
doctrines be eradicated. Essentialism, reductionism and formalism are the
first ones that need to go. Our intellectual habit of thinking linearly, where
the interaction of different causes is seen as additive, and hence global
properties that are more than the sum of the parts are not a possibility, also
needs to be eliminated. So does our habit of thinking in terms of
conservative systems, isolated from energy and matter flows from the
outside. Only dissipative, nonlinear systems generate the full spectrum of

http://www.t0.or.at/delanda/delanda.htm (18 z 25)2006-02-23 23:43:10


Zero News Datapool, MANUEL DE LANDA,THE EMERGENCE OF SYNTHETIC REASON

dynamical forms of stabilization (attractors) and of diversification


(bifurcations).

In turn, thinking in terms of attractors and bifurcations will lead to a radical


alteration of the philosophical doctrine known as "determinism". Attractors
are fully deterministic, that is, if the dynamics of a given population are
governed by an attractor, the population in question will be strongly bound to
behave in a particular way. Yet, this is not to go back to the clockwork
determinism of classical physics. For one thing, attractors come in bunches,
and so at any particular time, a population that is trapped into one stable
state may be pushed to another stable state by an external shock (or even
by its own internal devices). In a way this means that populations have
"choices" between different "local destinies".

Moreover, certain attractors (called "strange attractors" or "deterministic


chaos"), bind populations to an inherently" creative" state. That is, a
population whose dynamics are governed by a strange attractor, is bound to
permanently explore a limited set of possibilities of the space of its possible
states. If a chaotic attractor is "small" relative to the size of this space, then it
effectively pins down the dynamics of a system to a relatively small set of
possible states, so that the resulting behavior is far from random and yet it is
intrinsically variable. Finally, as if this were not enough to subvert classical
determinism, there are also bifurcations, critical points at which one
distribution of attractors is transformed into another distribution. At the
moment this transformation occurs, relatively insignificant fluctuations in the
environment can have disproportionately large effects in the distribution of
attractors that results. In the words of Prigogine and Stengers:

"From the physicist's point of view this involves a distinction between states
of the system in which all individual initiative is doomed to insignificance on
one hand, and on the other, bifurcation regions in which an individual, an
idea, or a new behavior can upset the global state. Even in those regions,
amplification obviously does not occur with just any individual, idea, or
behavior, but only with those that are 'dangerous' - that is, those that can
exploit to their advantage the nonlinear relations guaranteeing the stability of
the preceding regime. Thus we are led to conclude that the same
nonlinearities may produce an order out of the chaos of elementary
processes and still, under different circumstances, be responsible for the


destruction of this same order, eventually producing a new coherence
beyond another bifurcation." 26

This new view of the nature of
determinism may also have consequences for yet another philosophical
school of thought: the doctrine of "free will". If the dynamical population one
is considering is one whose members are human beings (for example, a
given human society), then the insignificant fluctuation that can become
"dangerous"in the neighborhood of a bifurcation is indeed a human
individual (an so, this would seem to guarantee us a modicum of free will).
However, if the population in question is one of neurons (of which the global,
emergent state is the conscious state of an individual) this would seem to
subvert free will, since here a micro-cognitive event may decide what the
new global outcome may be.

At any rate, the crucial point is to recognize the existence, in all spheres of
reality, of the reservoir of possibilities represented by nonlinear stabilization
and diversification (a reservoir I have somewhere else called "the machinic
phylum"). 27
We must also recognize that by their very nature, systems governed by
nonlinear dynamics resist absolute control and that sometimes the machinic
phylum can only be tracked, or followed. For this task even our modicum of
free will may suffice. The "searching device" constituted by genetic variation
and natural selection, does in fact track the machinic phylum. That is,
biological evolution has no foresight, and it must grope in the dark, climbing
from one attractor to another, from one evolutionary stable strategy to
another. And yet, it has produced the wonderfully diverse and robust
ecosystems we observe today. Perhaps one day virtual environments will
become the tools we need to map attractors and bifurcations, so that we too
can track the machinic phylum in search for a better destiny for humanity.

MANUEL DE LANDA

NOTES:

1)


Elliott Sober. The Nature of Selection. (MIT Press, 1987), pages 157-161.

2)

Steven Levy. Artificial Life. (Pantheon Books, NY 1992), pages 155-187.

3)

Richard Dawkins. The Selfish Gene. (Oxford University Press, Oxford 1989),


page 12.

4)

Stuart A. Kauffman. Adaptation on Rugged Fitness Landscapes. In Daniel


Stein ed. Lectures in the Sciences of Complexity. (Addison-Wesley, 1989).

5)

Cynthia Eagle Russett. The Concept of Equilibrium in American Social


Thought. (Yale University Press, New Haven 1968). pages 28-54.

6)

John Maynard Smith. Evolution and the Theory of Games. In Did Darwin
Get it Right?: Essays on Games, Sex and Evolution. (Chapman and Hall, NY
1989).

7)

Ilya Prigogine and Isabelle Stengers. Order Out of Chaos. (Bantam Books,
NY 1984).

8)

Ian Stewart. Does God Play Dice? The Mathematics of Chaos. (Basil
Blackwell, Oxford 1989), pages 95-110.

9)


Christopher G. Langton. Artificial Life. In Christopher G. Langton ed. Artificial


Life. (Addison-Wesley, 1988), page 2.

10)

Andy Clark. Microcognition: Philosophy, Cognitive Science and Parallel


Distributed Processing. (MIT Press, 1990), pages 61-75.

11)

J.A. Sepulchre and A. Babloyantz. Spatio-Temporal Patterns and Network


Computation. In A. Babloyantz ed. Self-Organization, Emergent Properties,
and Learning. (Plenum Press, NY 1991).

12)

William Bechtel and Adele Abrahamsen. Connectionism and the Mind. (Basil


Blackwell, Cambridge Mass. 1991). page 129.

13)

Douglas R. Hofstadter. The Prisoner's Dilemma and the Evolution of


Cooperation. In Metamagical Themas (Basic Books, NY 1985), page 720.

14)

James L Gould and Carol Grant Gould. Sexual Selection. (Scientific


American Library, NY 1989), pages 244-247.

15)

Donald E. Brown. Human Universals. (McGraw-Hill, NY 1991).

16)

John Nist. A Structural History of English. (St. Martin's Press, NY 1966),


chapter 3.


17)

Tony Crowley. Standard English and the Politics of Language. (University of


Illinois Press, 1989).

18)

Winfred P. Lehmann. The Great Underlying Ground-Plans. In Winfred P.


Lehmann ed. Syntactic Typology. (Harvester Press, Sussex UK, 1978),
page 37.

19)

David Decamp. The Study of Pidgin and Creole Languages. In Dell Hymes
ed. Pidginization and Creolization of Languages. (Cambridge University
Press, Oxford UK 1971).

20)

William Labov. Sociolinguistic Patterns. (University of Pennsylvania Press,


Philadelphia 1971), pages 271-273.

21)

Michael T. Hannan and John Freeman. Organizational Ecology. (Harvard


University Press, 1989).

22)

Richard R. Nelson and Sidney G. Winter. An Evolutionary Theory of


Economic Change. (Belknap Press, Cambridge Mass. 1982), page 14.

23)

Jay W. Forrester. Innovation and Economic Change. In Christopher


Freeman ed. Long Waves in the World Economy. (Butterworth, 1983), page
128.

24)

W. Brian Arthur. Self-Reinforcing Mechanisms in Economics. In Philip W.


Anderson, Kenneth J. Arrow and David Pines eds. The Economy as an
Evolving Complex System. (Addison-Wesley, 1988)

25)

J.D. Sterman. Nonlinear Dynamics in the World Economy: the Economic


Long Wave. In Peter L. Christiansen and R.D. Parmentier eds. Structure,
Coherence and Chaos in Dynamical Systems. (Manchester University
Press, Manchester UK, 1989).

26)

Ilya Prigogine and Isabelle Stengers. op. cit. page 190.

27)

Manuel De Landa. War in the Age of Intelligent Machines. (Zone Books, NY


1991).



frontwheeldrive.com: manuel de landa interview


ILLogical Progression
Manuel De Landa

[by Paul D. Miller]

"the more consciousness is intellectualized, the more


matter is spatialized" Henri Bergson, "Creative
Evolution," 1911

What is history? Theories of the substance and form of history come and go
with the frequency of styles and trends - some are flashes across the cultural
landscape that show us for a brief moment another light to view the world
through, while others remix the ways we view time, instinct, and perceive the
warp and weave of the consensual social world around us. Our ideas about
continuity are woven from a strange cloth that for better or worse, is pretty
much anthropocentric, and the threads that hold it all together are, at best,
utterly tenuous. The way Proust or Tolstoy viewed history - compared to
Thucydides or Herodotus, or Hegel and Marx, is vastly different - let's not
forget about the 20th century where the works of people as diverse as Joyce
or Fernand Braudel, Carolyn Merchant or Sadie Plant, William S. Burroughs
and Rupert Sheldrake all share shelf space. For us living in our contemporary
hypermediated world of distributed networks and frequency exchanges, these
people come to us through the filter of media, a place where all expression
has a short shelf life, and at best almost all ideas are phantasms, fading
reputations - rumors of an "historical" existence most readers would be hard
pressed to really verify. Time itself sometimes seems to be debatable and
indeed, it often is. But where does the Western style of history begin, and where,
if there is a beginning, does it end? "Herodotus of Halicarnassus here
displays his story, so that human achievements may not become forgotten in
time, and great and marvellous deeds...may not be without their glory..." we
are told in one of the first accounts of epic history in the West. Back in the
5th century B.C. Herodotus was one of several historians that began the kind
of history we all take for granted today - events for him took place as an
interwoven account of events, dates, and the interactions of different
individuals in a "non-linear" mode - his was a kind of oral history where
words were like hypertext, each phrase a reference to countless other
references, and so on and so on. "I prefer to rely on my own knowledge, and
to point out who it was in actual fact that first [took action]...then I will
proceed with my history, telling the story as I go of small cities of men no
less than of great. For most of those which were great once are small today;
and those which used to be small were great in my own time. Knowing,
therefore, that human prosperity never abides long in the same place, I shall
pay attention to both alike...." His story began with a declaration of who he
was and what he was doing, and ended with a summary of the accounts of
the people he had given narrative.

Beginning, middle, and end - all were signposts within a latticework of


actions and deeds, all took place within the oral tradition of a culture where
things were hybrid, and created to keep a register of the dramas of the past.
It wasn't until the rise of different historical methods that Herodotus was
"formalized" and ossified. This is where Manuel Delanda's work becomes
extremely relevant - it shows us how to look outside the frame.

Back in 1992 art critic and historian Thomas McEvilley wrote in his classic
essay "One Culture of Many Cultures" as part of his larger anthology of
writings "Art and Otherness: Crisis in Cultural Identity" that "the visual arts
have a global social importance today that is quite independent of formalist
notions of esthetic presence. A culture's visual tradition embodies the image
it has of itself. Cumulatively, and with cross-currents, art draws into visibility
from the depths of intuition a culture's sense of its identity and of its value
and place in the world. Seen this way, art - or visual expression or whatever
we want to call it - encompasses far more than esthetics. There are times
when the issue of identity is muted and the esthetic voice speaks loudest. But
not today. Right now the issue of identity has come to the foreground both of
culture in general and of the visual in particular. This is not a stylistic fad; it
involves the deepest meanings of what we call history..." In a sense, Delanda
could be the philosopher of the esthetic and identity issues that McEvilley
was engaged with - with the exception that Delanda's work encompasses
stuff that leaves human beings far behind. History with a capital "H" in the
West has been plagued with problems of a deeply "teleological" nature: the
workers' paradise, Aquinas's city of God, Marx and his gang, etc. etc. - it all
seems pretty old hat in a world of distributed networks and cybernetic modes
of production that make almost all human activity seem like faint shadows of
the present moment. The density of it all is pretty thick - and the weight of
these kinds of viewpoints has distorted almost every aspect of how we view
the world around us. "History without agency!! How is such a thing
possible!!?" Most theorists involved with contemporary debates about culture
and contemporary technology would ask. My reply would simply be that
Delanda provides frameworks to view human action within a context of the
complete environment that humans inhabit: something most historians, in an
echo of the infamous 19th century German historian Leopold von Ranke's
credo of history as "wie es eigentlich gewesen" ("simply to show how it
really was"), reduce to mere fetishisms of facts and documents -
the detritus of human action as the singular teleological narrative (of course,
of European humans, that is) within which to view the "progress" of the world. What
"A Thousand Years" basic premise does is provide a non-anthropcentric view
of the developments of the last millenium's "progression" in a "non-
teleological" context - the tableau that Delanda want's to depict is a place
where "geological, organic, and linguistic materials will all be allowed to 'have
their say'." Is history a thought contagion - a kind of memetic shock or
dispersion of cultural mores that are relative to each era and perspective of
tastes of continuity? It's a difficult question to answer, and there are no
straight forward responses to the enquiry. To look at the topic is to disrupt it
with your ideas of what holds consensual reality together, and at best, in our
late 20th century post everything mode of existence, you'll have a
kaleidoscopic perspective. At worst, you won't remember anything at all of
what's happened in the last several centuries (this happens a lot in America).
Manuel De Landa is a contemporary philosopher of history -- but we live in a
time where most of the basic tenets of how we look at the world of the past
will shape how we can create new forms of thought in the future - this
condition leaves him in a strange, nebulous place in todays cultural
landscape. His work is extremely cogent, and well thought out, and his new
book "One Thousand Years of Non-Linear History" should be a classic
following in the steps of Jameson, Fernand Braudel, and the ubiquitous
Dynamic duo, Deleuze and Guattari -- all of whom have helped pave the way
for new approaches to philosophy and culture to interact.

Manuel de Landa writes from a strange pataphysical world of disjunctions


and fluid transitions - a milieu where writing about ideas becomes a fluid
dialectic switching from steady state to flux and back again in the blink of an
eye, or the turn of a sentence. His style of thinking is like a landscape
made of the crystalline structures of the world of rocks and lavas, magmas and
tectonic plates that dance beneath our feet at every moment. And that
doesn't even get to the shifting magnetic polarities of the planet and the
solar winds and celestial movements that surround our little third stone from
the sun. "One Thousand Years" is kind of like a cognitive labyrinth of false
starts and dead ends, like an M.C. Escher painting or even more accurately,
like Hieronymus Bosch's carnival tableaux bereft of characters - the mise-en-
scene displaces the actors operating within it, and subsumes their identity - a
shift in perspective takes place, creating a world where words act as a bridge
across broken and fractured "times" that exist pretty much simultaneously.
Delanda has given an overview of what he calls "historic
materialism" and how the basic processes of the full environment we live in
have shaped contemporary thought.

I have to say it -- writing about Manuel Delanda's work is difficult. It isn't the
fact that his work is extremely well researched (it is), or the fact that much of
it involves extremely precise investigations into different realms of theoretical
approaches to the way we humans live and think in different multiplex
contexts that themselves are part question part unanswerable - for lack of a
better word - motif. With Manuel Delanda, there's always that sense that one
thing leads to another and basically there is no discrete and stable form of
inquiry: the question changes and configures the answer which again,
reconfigures what was originally asked. And so the loops go on. Delanda's
first book "War in the Age of Intelligent Machines" was an instant classic; it
encapsulated what so many different theorists were trying (without much
success) to achieve - an overview of our time and the different historical
cybernetic developments that created the milieu we live in. Call it morphic
resonance, or material convergence, or hermeneutical fusion, yada yada
yada, but you get the basic idea: that the "real" of the kind that theorists and
philosophers such as Heidegger (of "The Age of the World Picture" essay
fame) and a whole cast of people supporting the ideas and extensions of
European rationality like Francis Fukuyama, and Frederic Jameson have not
only been crippling and distorting frameworks to view human history from,
but have also divorced us from the physical processes of the world of flux
and constant change that we are immersed in. Written from the viewpoint of
a cybernetic historian, De Landa's "War in the Age of Intelligent Machines"
created a sense of "historian as actor as philosopher:" the history of and
relationships between humans and the machines we use to create our
cultures were transformed into a continuum where the line dividing the
organic from the inorganic blurs, and the end result is something altogether
completely hybrid. "One Thousand Years of Nonlinear History" takes up
where "War..." left off: with a critique this time, of the material processes
embedded in the migrations of not only human cultures, but of the geological,
biological, linguistic and memetic systems that have impacted on this planet
and its inhabitants over the last thousand years. Teleology, contemporary
constructions of identity, frameworks of philosophical investigation - in the
flow of time like the old Borges poem "The Hourglass" says: "all are
obliterated, all brought down By the tireless trickle of the endless sand. I do
not have to save myself - I too Am a whim of time, that shifty element..."

"Those who would speak with understanding should base themselves upon
what is common to all (things), just as a polis (does) upon its law - and much
more firmly: for all [=political] laws nourish themselves upon one, the divine
(law); for it prevails as far as it will and is sufficient for all and is still left
over." -- Heraclitus of Ephesus, fragment 114, 6th century B.C.

frontwheeldrive: In A Thousand Years of Nonlinear History you point out


that "human history is a narrative of contingencies not necessities,
of missed opportunities to follow different routes of development."
My question is this: history is always a framework of interpretation;
do you feel that somehow we have moved into our frameworks,
moved into the picture and lost the frame?"

Manuel De Landa: One of the ideas that I attack in my book is precisely the
primacy of "interpretations" and of "conceptual frameworks." Sure, ideas and
beliefs are important, and do play a role in history, but academics of different
brands have reduced all material and energetic processes, and all human
practices that are not linguistic or interpretative (think of manual skills, of
"know-how.") to a "framework." The 20th century has been obsessed with
positioning everything. Every culture, given that it has it's own framework of
beliefs, has become its own "world" and relativism (both moral and
epistemological) now prevails. But once you break away from this outmoded
view, once you accept all the non-linguistic practices that really make up
society (not to mention the non-human elements that also shape it, such as
viruses, bacteria, weeds or non-organic energy and material flows like wind
and ocean currents) then language itself becomes just another material that
flows through a much expanded picture. Language, in my view, is best
thought of as a catalyst, a trigger for energetic processes (think of the words
"begin the battle" triggering an enormous and destructive process). The
question of "missed opportunities" is important, since for most of the
millennium both China and India had in fact a better chance to conquer the
world than did the West, so that the actual outcome, a world dominated by
Western colonialism, was quite contingent. Things could have happened in
several other ways.

What are your thoughts on digital art and its relationship to the
different forms of communication in our dense and continuously
changing world. Is there any return to the "comforts" of a
"homogenous" culture on the horizon?

Here again we have two different answers depending on whether you believe
in "conceptual frameworks" or not. If you do, then you also believe that
there's such a thing as "the bourgeois ideology of the individual," a pervasive
framework within which all artistic production of the last few centuries is
inscribed. But if you do not believe there was ever such a thing, then history
becomes much less homogenous, much less dominated by any one
framework, and hence you begin to look at all the different ways in which art
has escaped the conditions of its production (which admittedly, did include
ruling classes as suppliers of resources). Put differently, once you admit that
history has been much more complex and heterogeneous than we have been
told, then even the "enemy" looks less in control of historical processes than
we thought. In a sense, what I am trying to do is to cut the enemy down to
size, to see all the potential escape routes that we have been overlooking by
exaggerating the importance of "frameworks" or "ideologies." Clearly, if the
enemy was never as powerful as we thought (which is not to say that it did
not have plenty of power), the question of the role of art
(digital or otherwise) in changing social reality acquires new meanings and
possibilities.

How does your philosophy of history differ from those of previous
philosophers? Do you feel affinities with any contemporaries on this
subject? Deleuze and Guattari, maybe, with whom there's a sense
of continuous, vertiginous change - a tacit admission that history is
continuity, but seething, ebbing, and flowing continuity?

There are two main differences between my philosophical ideas about history
and those of previous philosophers. The first one, which is shared by many
these days, is a rejection of Platonic essences as sources of form, you know,
the idea that the form of this mountain here or of that zebra over there
emanates from an essence of "mountain-hood" or of "zebra-hood" existing in
some ideal world or in the mind of the God that created these creatures.
Instead, for each such entity (not only geological and biological entities, but
also social and economic ones), I force myself to come up with a process
capable of creating or producing such an entity. Sometimes these processes
are already figured out by scientists (in those disciplines linked to questions
of morphogenesis, like chaos theory and non-linear dynamics) and so I just
borrow their model; other times I need to create new models using
philosophical resources - and people like Deleuze and Guattari have been
very helpful in this regard. The other difference is my rejection of the
existence of totalities, that is, entities like "Western Society" or the "Capitalist
System." The morphogenetic point of view does allow for the emergence of
wholes that are more than the sum of their parts, but only if specific
historical processes, specific interactions between "lower scale entities," can
be shown to have produced such wholes. Thus, in my view, institutional
organizations like bureaucracies, banks, and stock markets acquire a life of
their own from the interactions of individuals. From the interactions of those
institutions cities emerge, and from the interactions between cities nation
states emerge. Yet, in these bottom-up approaches, all the heterogeneity of
real nation states can be retained: the pockets of minorities, the dialect differences, the
local transience - unlike when history is modeled on totalities (concepts like
"society" or "culture" or "the system"). In this latter situation homogeneity
has to be artificially injected into the model.

One thing everyone seems to agree on is that there are so many
different frameworks of interpretation available today that we have
lost track of the world we inhabit: the "natural" has been displaced
by the human; we as a species have altered the atmosphere of the
planet, changed the composition of the oceans, even created
seismic disruptions. There's an overwhelming sense of
anthropocentric agency, overdetermination: "There is nothing that
man hath not wrought." How do you think this sense of über-
agency so prevalent in philosophical, historical, and political
discourse will change in the future?

I agree that the domination of this century by linguistics and semiotics (which
is what allows us to reduce everything to talk of "frameworks of
interpretation"), not to mention the post-colonial guilt of so many white
intellectuals which forces them to give equal weight to any other culture's
belief system, has had a very damaging effect, even on art. Today I see art
students trained by guilt-driven semioticians or post-modern theorists, afraid
of the materiality of their medium, whether painting, music, poetry or virtual
reality (since, given the framework dogma, every culture creates its own
reality). The key to break away from this is to cut language down to size, to
give it the importance it deserves as a communications medium, but to stop
worshipping it as the ultimate reality. Equally important is to adopt a hacker
attitude towards all forms of knowledge: not only to learn UNIX or Windows
NT to hack this or that computer system, but to learn economics, sociology,
physics, biology to hack reality itself. It is precisely the "can do" mentality of
the hacker, naive as it may sometimes be, that we need to nurture
everywhere.

[from DJ Spooky.com ]


MANUEL DE LANDA

Homes:
Meshwork or Hierarchy?


Imagine having just landed a corporate job which demands that you move to a new city. In this
urban environment the corporation has already found you an apartment and, following the
tradition of its great corporate culture, it has had it decorated so that it embodies the aesthetic
and functional values for which the firm has become famous. No doubt, when you finally move
to this new place it simply won't feel like home, more like a hotel suite, despite the fact that it
offers you shelter and even luxuries that you did not enjoy before. Does this lack of `home
feeling' stem from the fact that everything around you has been planned to the last detail?
Would it feel homier if you shared the corporate values that informed the planning? Wouldn't
you have to live for a while in this place, interacting with its walls and table surfaces by placing
a souvenir here, a memento there, before something like a sense of home began to emerge?

These questions can also be raised even if we eliminate from our scenario the intrusive
presence of an outside planner. Would a place feel like home if every expressive or functional
detail had been exhaustively planned by yourself? No doubt all of us think about the decoration
of our home environment, but do we always have an explicit reason why certain things are
placed where they are? Don't we often place them in a given location because it feels like that is
where they belong, as if our souvenirs and sentimental possessions arranged themselves
through us?

Answering these questions in the case of human beings is rather hard because of the extreme
variability of human culture and, even within a given culture, the great diversity of human
personalities. Besides, I am not aware of any systematic study of these questions regarding
human homes. We do have some information, however, about the creation of home territories
by certain species of animals, which throws some light on the question `Are homes planned or
self-organized?'. In particular, I would like to begin my exploration of these issues with a brief
examination of bird territories and the role that the expressive qualities of song and color play
in their formation.

When the question of how birds create a home territory was first raised (by ethologists like
Lorenz and Tinbergen) the answer given was `Homes are planned', with the remaining
controversy gravitating around the issue of `Who does the planning', genes or brains. Are the
planned strategies pieced together by genetic evolution or are they learned in the bird's
lifetime? In either case, the formation of a home territory was seen to derive from an internal
territorial drive or instinct, with a precise central location in the brain. Out of this `territorial
center' commands would then be issued to other centers in the brain (a nesting center, a
courtship center) and out of this hierarchical mental structure a correct set of actions would
then be implemented and the borders of the territory would then be appropriately marked.

More recently, however, this line of thinking has been increasingly criticised. Philosopher
Daniel Dennett, for example, has convincingly argued that to postulate `brain centers' is to
simply move all the original questions about an animal's behaviour to an `animalculus' inside
the head. Unless this animalculus is `stupid' enough that it does not need to interpret
representations or perform other complex cognitive functions, we are simply answering one
question (how are territories organized by an animal?) with another one of equal complexity
(how are territories organized by an animalculus?). Philosophers Gilles Deleuze and Felix
Guattari have raised essentially the same point, adding that home territories should be
conceived not as emanating from an internal drive but as emerging from the interaction of a
non-hierarchical set of brain functions and the expressive qualities of the territorial markers
themselves, for instance, the color of certain leaves or stems which some birds use to attract
females, or the musical properties of bird songs, or even faeces or urine scented with the
excretions of special glands.




Deleuze and the Use of the Genetic Algorithm in Architecture

Manuel De Landa (2001)

The computer simulation of evolutionary processes is already a well established technique for the study
of biological dynamics. One can unleash within a digital environment a population of virtual plants or
animals and keep track of the way in which these creatures change as they mate and pass their virtual
genetic materials to their offspring. The hard work goes into defining the relation between the virtual
genes and the virtual bodily traits that they generate; everything else—keeping track of who mated with
whom, assigning fitness values to each new form, determining how a gene spreads through a population
over many generations—is a task performed automatically by certain computer programs collectively
known as “genetic algorithms”. The study of the formal and functional properties of this type of
software has now become a field in itself, quite separate from the applications in biological research
which these simulations may have. In this essay I will deal neither with the computer science aspects of
genetic algorithms (as a special case of “search algorithms”) nor with their use in biology, but focus
instead on the applications which these techniques may have as aids in artistic design.
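
To make the division of labor concrete, the bookkeeping that such programs automate can be condensed
into a few lines. The following Python sketch is purely illustrative: the functions fitness, mutate and
crossover are hypothetical placeholders that a user would have to supply, and the selection scheme (keep
the better half, breed the rest) is only one of many possible choices.

import random

def evolve(population, fitness, mutate, crossover, generations=100):
    # Repeatedly score, select, mate and mutate a list of genomes.
    for _ in range(generations):
        scored = sorted(population, key=fitness, reverse=True)
        survivors = scored[:len(scored) // 2]            # selection
        offspring = []
        while len(survivors) + len(offspring) < len(population):
            a, b = random.sample(survivors, 2)           # mating
            offspring.append(mutate(crossover(a, b)))    # inheritance plus variation
        population = survivors + offspring
    return population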

In a sense evolutionary simulations replace design, since artists can use this software to breed new forms
rather than specifically design them. This is basically correct but, as I argue below, there is a part of the
process in which deliberate design is still a crucial component. Although the software itself is relatively
well known and easily available, so that users may get the impression that breeding new forms has
become a matter of routine, the space of possible designs that the algorithm searches needs to be
sufficiently rich for the evolutionary results to be truly surprising. As an aid in design these techniques
would be quite useless if the designer could easily foresee what forms will be bred. Only if virtual
evolution can be used to explore a space rich enough so that all the possibilities cannot be
considered in advance by the designer, only if what results shocks or at least surprises, can genetic
algorithms be considered useful visualization tools. And in the task of designing rich search spaces
certain philosophical ideas, which may be traced to the work of Gilles Deleuze, play a very important
role. I will argue that the productive use of genetic algorithms implies the deployment of three forms of
philosophical thinking (populational, intensive, and topological thinking) which were not invented by
Deleuze but which he has brought together for the first time and made the basis for a brand new
conception of the genesis of form.

To be able to apply the genetic algorithm at all, a particular field of art needs to first solve the problem
of how to represent the final product (a painting, a song, a building) in terms of the process that
generated it, and then, how to represent this process itself as a well-defined sequence of operations. It is
this sequence, or rather, the computer code that specifies it, that becomes the “genetic material” of the
painting, song, or building in question. In the case of architects using computer-aided design (CAD) this
problem becomes greatly simplified given that a CAD model of an architectural structure is already
given by a series of operations. A round column, for example, is produced by a series such as this: 1)
draw a line defining the profile of the column; 2) rotate this line to yield a surface of revolution; 3)
perform a few “Boolean subtractions” to carve out some detail in the body of the column. Some
software packages store this sequence and may even make available the actual computer code
corresponding to it, so that this code now becomes the “virtual DNA” of the column. (A similar
procedure is followed to create each of the other structural and ornamental elements of a building.)
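
To make the idea of such a sequence as "virtual DNA" concrete, one could encode it as data that an
evolutionary program can later manipulate. The following Python sketch is only an illustration; the
operation names and parameters are assumptions of mine and do not correspond to the API of any
particular CAD package.

# A hypothetical "genome" for the round column described above:
# an ordered list of (operation, parameters) pairs.
column_genome = [
    ("draw_profile",     {"height": 4.0, "base_radius": 0.3}),              # step 1
    ("revolve",          {"axis": "z", "degrees": 360}),                     # step 2
    ("boolean_subtract", {"tool": "flute", "count": 12, "depth": 0.02}),     # step 3
]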

At this point we need to bring one of the philosophical resources I mentioned earlier to understand what
happens next: population thinking. This style of reasoning was created in the 1930’s by the biologists
who brought together Darwin’s and Mendel’s theories and synthesized the modern version of
evolutionary theory. In a nutshell, what characterizes this style may be phrased as "never think in terms
of Adam and Eve but always in terms of larger reproductive communities". More technically, the idea is
that despite the fact that at any one time an evolved form is realized in individual organisms, the
population, not the individual, is the matrix for the production of form. A given animal or plant
architecture evolves slowly as genes propagate in a population, at different rates and at different times,
so that the new form is slowly synthesized within the larger reproductive community.-1- The lesson for
computer design is simply that once the relationship between the virtual genes and the virtual bodily
traits of a CAD building has been worked out, as I just described, an entire population of such buildings
needs to be unleashed within the computer, not just a couple of them. The architect must add to the CAD
sequence of operations points at which spontaneous mutations may occur (in the column example: the
relative proportions of the initial line; the center of rotation; the shape with which the Boolean
subtraction is performed) and then let these mutant instructions propagate and interact in a collectivity
over many generations.
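
A minimal sketch of this populational step, assuming a genome encoded as the list of (operation,
parameters) pairs from the previous sketch; which parameters count as mutation points, and the mutation
rate and spread, are arbitrary placeholders.

import copy, random

# The hypothetical genome from the previous sketch.
column_genome = [("draw_profile", {"height": 4.0, "base_radius": 0.3}),
                 ("revolve", {"axis": "z", "degrees": 360}),
                 ("boolean_subtract", {"tool": "flute", "count": 12, "depth": 0.02})]

# The (operation, parameter) pairs at which spontaneous mutations may occur.
MUTABLE = {("draw_profile", "base_radius"), ("revolve", "degrees"),
           ("boolean_subtract", "depth")}

def mutate(genome, rate=0.1, spread=0.1):
    # Copy the genome and randomly perturb some of its mutable numeric parameters.
    genome = copy.deepcopy(genome)
    for op, params in genome:
        for key, value in params.items():
            if (op, key) in MUTABLE and isinstance(value, (int, float)) and random.random() < rate:
                params[key] = value * (1 + random.uniform(-spread, spread))
    return genome

# Unleash an entire population of mutant columns, not just one or two.
population = [mutate(column_genome, rate=1.0) for _ in range(50)]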

To population thinking Deleuze adds another cognitive style which in its present form is derived from
thermodynamics, but which as he realizes has roots as far back as late medieval philosophy: intensive
thinking. The modern definition of an intensive quantity is given by contrast with its opposite, an
extensive quantity. The latter refers to the magnitudes with which architects are most familiar:
lengths, areas, volumes. These are defined as magnitudes which can be spatially subdivided: if one takes
a volume of water, for example, and divides it in two halves, one ends up with two half volumes. The
term “intensive” on the other hand, refers to quantities like temperature, pressure or speed, which cannot
be so subdivided: if one divides in two halves a volume of water at ninety degrees of temperature one
does not end up with two half volumes at forty five degrees of temperature, but with two halves at the
original ninety degrees. Although for Deleuze this lack of divisibility is important, he also stresses
another feature of intensive quantities: a difference of intensity spontaneously tends to cancel itself out
and in the process, it drives fluxes of matter and energy. In other words, differences of intensity are
productive differences since they drive processes in which the diversity of actual forms is produced.-2-
For example, the process of embryogenesis, which produces a human body out of a fertilized egg, is a
process driven by differences of intensity (differences of chemical concentration, of density, of surface
tension).
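
The contrast can be restated as a tiny toy model of the water example above (the numbers are of course
arbitrary): splitting the volume halves the extensive quantity but leaves the intensive one untouched.

# Toy restatement of the example: volume is extensive, temperature is intensive.
volume, temperature = 1.0, 90.0
halves = [(volume / 2, temperature), (volume / 2, temperature)]
assert sum(v for v, _ in halves) == volume          # the half volumes add back up
assert all(t == temperature for _, t in halves)     # each half stays at ninety degrees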

What does this mean for the architect? That unless one brings into a CAD model the intensive elements
of structural engineering, basically, distributions of stress, a virtual building will not evolve as a
building. In other words, if the column I described above is not linked to the rest of the building as a
load-bearing element, by the third or fourth generation this column may be placed in such a way that it
cannot perform its function of carrying loads in compression anymore. The only way of making sure that
structural elements do not lose their function, and hence that the overall building does not lose viability
as a stable structure, is to somehow represent the distribution of stresses, as well as what type of
concentrations of stress endanger a structure’s integrity, as part of the process which translates virtual
genes into bodies. In the case of real organisms, if a developing embryo becomes structurally unviable it
won’t even get to reproductive age to be sorted out by natural selection. It gets selected out prior to that.
A similar process would have to be simulated in the computer to make sure that the products of virtual
evolution are viable in terms of structural engineering prior to being selected by the designer in terms of
their “aesthetic fitness”.
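
A sketch of such a two-stage selection might look as follows; simulate_stresses and aesthetic_fitness are
stand-ins for a structural-engineering package and the designer's judgement respectively, not real APIs,
and the stress limit is a made-up placeholder.

MAX_ALLOWED_STRESS = 1.0   # normalized placeholder limit

def is_viable(genome, simulate_stresses):
    # Structurally unviable individuals are discarded before the designer ever
    # sees them, the analogue of embryos that never reach reproductive age.
    stresses = simulate_stresses(genome)            # e.g. per-element load factors
    return max(stresses) <= MAX_ALLOWED_STRESS

def select(population, simulate_stresses, aesthetic_fitness, keep=10):
    survivors = [g for g in population if is_viable(g, simulate_stresses)]
    return sorted(survivors, key=aesthetic_fitness, reverse=True)[:keep]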

Now, let’s assume that these requirements have indeed been met, perhaps by an architect-hacker who
takes existing software (a CAD package and a structural engineering package) and writes some code to
bring the two together. If he or she now sets out to use virtual evolution as a design tool the fact that the
only role left for a human is to be the judge of aesthetic fitness in every generation (that is, to let die
buildings that do not look esthetically promising and let mate those that do) may be disappointing. The
role of design has now been transformed into (some would say degraded down to) the equivalent of
a prize-dog or a race-horse breeder. There clearly is an aesthetic component in the latter two
activities; one is, in a way, "sculpting" dogs or horses, but hardly the kind of creativity that one identifies
with the development of a personal artistic style. Although today slogans about the "death of the author"
and attitudes against the "romantic view of the genius" are in vogue, I expect this to be a fad and questions
of personal style to return to the spotlight. Will these future authors be satisfied with the role of breeders
of virtual forms? Not that the process so far is routine in any sense. After all, the original CAD model
must be endowed with mutation points at just the right places (and this involves design decisions) and
much creativity will need to be exercised to link ornamental and structural elements in just the right
way. But still this seems a far cry from a design process where one can develop a unique style.

There is, however, another part of the process where stylistic questions are still crucial, although in a
different sense than in ordinary design. Explaining this involves bringing in the third element in
Deleuze’s philosophy of the genesis of form: topological thinking. One way to introduce this other
style of thinking is by contrasting the results which artists have so far obtained with the genetic
algorithm and those achieved by biological evolution. When one looks at current artistic results the most
striking fact is that, once a few interesting forms have been generated, the evolutionary process seems to
run out of possibilities. New forms do continue to emerge but they seem too close to the original ones, as
if the space of possible designs which the process explores had been exhausted.-3- This is in sharp
contrast with the incredible combinatorial productivity of natural forms, like the thousands of original
architectural “designs” exhibited by vertebrate or insect bodies. Although biologists do not have a full
explanation of this fact, one possible way of approaching the question is through the notion of a “body
plan”.

As vertebrates, the architecture of our bodies (which combines bones bearing loads in compression and
muscles bearing them in tension) makes us part of the phylum "chordata". The term "phylum" refers to a
branch in the evolutionary tree (the first bifurcation after animal and plant "kingdoms") but it also
carries the idea of a shared body-plan, a kind of “abstract vertebrate” which, if folded and curled in
particular sequences during embryogenesis, yields an elephant, twisted and stretched in another
sequence yields a giraffe, and in yet other sequences of intensive operations yields snakes, eagles, sharks
and humans. To put this differently, there are “abstract vertebrate” design elements, such as the tetrapod
limb, which may be realized in structures as different as the single-digit limb of a horse, the wing of a
bird, or the hand with opposing thumb of a human. Given that the proportions of each of these limbs, as
well as the number and shape of digits, are variable, their common body plan cannot include any of these
details. In other words, while the form of the final product (an actual horse, bird or human) does have
specific lengths, areas and volumes, the body-plan cannot possibly be defined in these terms but must be
abstract enough to be compatible with a myriad combination of these extensive quantities. Deleuze uses
the term “abstract diagram” (or “virtual multiplicity”) to refer to entities like the vertebrate body plan,
but his concept also includes the “body plans” of non-organic entities like clouds or mountains.-4-

What kind of theoretical resources do we need to think about these abstract diagrams? In mathematics
the kind of spaces in which terms like “length” or “area” are fundamental notions are called “metric
spaces”, the familiar Euclidean geometry being one example of this class. (Non-Euclidean geometries,
using curved instead of flat spaces, are also metric). On the other hand, there are geometries where these
notions are not basic, since these geometries possess operations which do not preserve lengths or areas
unchanged. Architects are familiar with at least one of these geometries, projective geometry (as in
perspective projections). In this case the operation “to project” may lengthen or shrink lengths and areas
so these cannot be basic notions. In turn, those properties which do remain fixed under projections may
not be preserved under yet other forms of geometry, such as differential geometry or topology. The
operations allowed in the latter, such as stretching without tearing, and folding without gluing, leave
only a set of very abstract properties invariant. These topological invariants (such as the dimensionality
of a space, or its connectivity) are precisely the elements we need to think about body plans (or more
generally, abstract diagrams.) It is clear that the kind of spatial structure defining a body plan cannot be
metric since embryological operations can produce a large variety of finished bodies, each with a
different metric structure. Therefore body plans must be topological.
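
A toy way to see the distinction is to encode a body plan purely as connectivity, leaving all metric detail
to a separate, later stage; the segment names and lengths below are invented for illustration only.

# The abstract tetrapod limb: only which segment connects to which (topology).
tetrapod_plan = {"shoulder": ["upper"], "upper": ["lower"],
                 "lower": ["digits"], "digits": []}

# Two very different finished limbs realizing the same plan with different metrics.
horse_limb = {"plan": tetrapod_plan,
              "lengths": {"upper": 0.60, "lower": 0.70, "digits": 0.20}}
bird_wing = {"plan": tetrapod_plan,
             "lengths": {"upper": 0.20, "lower": 0.25, "digits": 0.10}}

assert horse_limb["plan"] == bird_wing["plan"]          # shared topological diagram
assert horse_limb["lengths"] != bird_wing["lengths"]    # divergent metric realizations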

To return to the genetic algorithm, if evolved architectural structures are to enjoy the same degree of
combinatorial productivity as biological ones they must also begin with an adequate diagram, an
“abstract building” corresponding to the “abstract vertebrate”. And it is at this point that design goes
beyond mere breeding, with different artists designing different topological diagrams bearing their
signature. The design process, however, will be quite different from the traditional one which operates
within metric spaces. It is indeed too early to say just what kind of design methodologies will be
necessary when one cannot use fixed lengths or even fixed proportions as aesthetic elements and must
instead rely on pure connectivities (and other topological invariants). But what is clear is that without
this the space of possibilities which virtual evolution blindly searches will be too impoverished to be of
any use. Thus, architects wishing to use this new tool must not only become hackers (so that they
can create the code needed to bring extensive and intensive aspects together) but also be able “to hack”
biology, thermodynamics, mathematics, and other areas of science to tap into the necessary
resources. As fascinating as the idea of breeding buildings inside a computer may be, it is clear that
mere digital technology without populational, intensive and topological thinking will never be enough.


References

-1-

First....the forms do not preexist the population, they are more like statistical results. The
more a population assumes divergent forms, the more its multiplicity divides into
multiplicities of a different nature....the more efficiently it distributes itself in the milieu,
or divides up the milieu....Second, simultaneously and under the same conditions....
degrees are no longer measured in terms of increasing perfection....but in terms of
differential relations and coefficients such as selection pressure, catalytic action, speed of
propagation, rate of growth, evolution, mutation....Darwinism’s two fundamental
contributions move in the direction of a science of multiplicities: the substitution of
populations for types, and the substitution of rates or differential relations for degrees.

Gilles Deleuze and Felix Guattari. A Thousand Plateaus. (University of Minnesota Press, Minneapolis,
1987). Page 48.

-2-

Difference is not diversity. Diversity is given, but difference is that by which the given is
given...Difference is not phenomenon but the noumenon closest to the phenomenon...
Every phenomenon refers to an inequality by which it is conditioned...Everything which
happens and everything which appears is correlated with orders of differences: differences
of level, temperature, pressure, tension, potential, difference of intensity

Gilles Deleuze. Difference and Repetition. (Columbia University Press, New York, 1994). Page 222.

-3- See for example: Stephen Todd and William Latham. Evolutionary Art and Computers. (Academic
Press, New York, 1992).

-4-

An abstract machine in itself is not physical or corporeal, any more than it is semiotic; it is
diagrammatic (it knows nothing of the distinctions between the artificial and the natural
either). It operates by matter, not by substance; by function, not by form... The abstract
machine is pure Matter-Function—a diagram independent of the forms and substances,
expressions and contents it will distribute.

Gilles Deleuze and Felix Guattari. A Thousand Plateaus. Op. Cit. Page 141



A Thousand Marxes
By Eliot Albert

Eliot Albert challenges Manuel De Landa over his perfunctory dismissal of Marxism in his latest book

In the "Conclusions and Speculations" to his A Thousand Years of Nonlinear History1 Manuel De
Landa describes his book as offering a "historical survey of these flows of ’stuff’, as well as with the
hardenings themselves" (259). The stuff referred to is broken down in the structure of the book into
three sections, each corresponding to one of Gilles Deleuze and Felix Guattari’s three major strata, that
is, the "physicochemical, organic and anthropomorphic" (ATP 502/627)2, which for De Landa are
taken to be ’Lavas and Magmas’, ’Flesh and Genes’, and ’Memes and Norms’. The hardenings, or
elsewhere ’slowings down’ of this stuff, or matter-energy in movement, take different forms on the
three strata: the formation of features of the geological landscape by the hardening of the eponymous
lavas and magmas; the coagulation of the flows of biological matter, "biomass, genes, memes and
norms" (258) into human and animal bodies; the extrusion of languages from the "momentary slowing
downs or thickenings in a flow of norms", and the creation of institutions considered as "transitory
hardenings in the flows of money, routines and prestige" (259).

In executing this analysis De Landa builds a conceptual armature out of a weave of two elements. The
first is an historical perspective culled from Fernand Braudel’s magisterial Civilisation and
Capitalism: 15th-18th Century, the fine grain of the micrological secured by the perspective of the
longue durée and an attention to the cycles and flows of economic life: Kondratieff waves altered to
take account of nonlinear dynamics. The second, and perhaps more apparent, is a deployment of
concepts culled from the two volumes of Deleuze and Guattari’s Capitalism and Schizophrenia, in
particular from the second volume, A Thousand Plateaus. The Deleuzo-Guattarian concepts with
which De Landa is most concerned are those of de- and restratification, nonorganic life, the Body
without Organs, and the machinic phylum. The result of the fusion of these two elements, it is not
inconceivable to suggest, turns A Thousand Years into a sustained attempt to demonstrate the validity
of Guattari’s claim in his article "The Plane of Consistency" that, "what makes the thread of history -
from protohistory until the scientific revolutions - is the machinic phylum"3.

But lurking behind De Landa’s work is an unexpected attack upon, and rejection of, Marxism as a tool
of historical analysis. The only overt criticism of Deleuze and Guattari given in A Thousand Years is
precisely for their commitment to Marxism: "[d]espite the fact that their philosophical work represents
an intense movement of destratification, Deleuze and Guattari seem to have preserved their own
stratum, Marxism, which they hardly touch or criticise" (331). De Landa himself seems to want to
understand Marxism as a set of incontrovertible truths rather than as a method, and thus he forms his
criticisms, dismissing Marx on two counts: "the labour theory of value [...] and the built-in teleology in
the traditional Marxist periodisation of history" (281).

However, these are ancient attacks, and Marxism has long ceased to be bothered by them. Regarding
the first, Antonio Negri has written that the redundancy of the labour theory of value is tied "to a
previous and out-dated organisation of labour and accumulation" and goes on to say it is the
conjunction of "post-Fordism as the principal condition of the new social organisation of labour and as
the new model of accumulation, and post-Modernism as the capitalist ideology adequate to this new
mode of production" that together form the assemblage he calls "the real subsumption of society
within capital."4 And Felix Guattari pointed out that it is Marx himself in the Grundrisse who "insisted
on the absurdity and the transitional character of a measure of value based on work time". The simple
reason for this being the growing discrepancy between the machinic, intellectual and manual
components of labour which meant that "human time is increasingly replaced by machinic time."5

De Landa’s second reason for rejecting Marxism, that it is beholden to a predetermined teleological
progression of stages, is simply fatuous. The attack devolves to a position from which De Landa
criticises Marxism for being unable to countenance the fact of ’combined and uneven development’,
an idea of which De Landa has, in his recent talks at the ICA and the Architectural Association in
London, demonstrated a total and resounding ignorance. On this latter point he suggests that only his
model, nominally derived from the application of nonlinear dynamics to economic theory, can account
for a situation such as that in which one sees the existence of an urban agglomeration evincing
nominally capitalist relations of production surrounded by a rural expanse dominated by feudal
relations. It is, however, precisely such assemblages of heterogeneous elements that are accounted for
by the concept of combined and uneven development, as we find in Marxist analyses of the situation in
Tsarist Russia on the eve of the 1917 Revolution. Here, in Trotsky’s writings, is the classic statement
of the principle: "Unevenness, being the most general law of the historic process, reveals itself most
sharply and complexly in the destiny of the backward countries [...] From the universal law of
unevenness thus derives [...] the law of combined development - [...] an amalgam of archaic with more
contemporary forms"6.

Capitalism, contrary to De Landa’s claims, is never characterised in Marxist theory as a smooth space
of homogeneous relations, but rather is marked by the radical coexistence of unevenness. Marxist
economics at its most powerful, in contrast to bourgeois economics, is committed to an anti-Platonism
precisely in the sense that it rejects the possibility of the existence of pure forms - an evolutionary
procession of stages - and therefore is predicated upon the existence of noncapitalist relations.

The problem with De Landa’s position in A Thousand Years is that, by reading Marxism as he does, he
forces himself to reject an axiomatic part of the work of Deleuze and Guattari while simultaneously
claiming to be fully consistent with their project. Their account of capitalism’s spread across the globe
in terms of "(t)he four principal flows that torment the representatives of the world economy, or of the
axiomatic, [which] are the flow of matter-energy, the flow of population, the flow of food, and the
urban flow [...] the axiomatic never ceases to create all of these problems, while at the same time, its
axioms, even multiplied, deny it the means of resolving them" (ATP 468/584) is explicitly given in
terms derived from, and consistent with, the most sophisticated Marxist accounts of the functioning of
the capitalist world machine or axiomatic. One can only agree with Fredric Jameson’s judgement on
this: "Deleuze is alone among the great thinkers of so-called poststructuralism in having accorded
Marx an absolutely fundamental role in his philosophy"7. De Landa’s misreading of Marx thus
becomes, not an academic quibble, but a grotesque misrepresentation of Deleuze and Guattari’s work.

Why so? Because if we return to the question of the nature of the capitalist machine we have to ask
what is a machine to Deleuze and Guattari? And if we do this we find that inseparable from their
concept of the machine is the concept of machinic enslavement. Now, the Marxist concept of the
machine obviously has a direct correlate to this in that of exploitation, but anything even vaguely
resembling such a concept is entirely absent in De Landa; essentially he presents a concept of
capitalism (and indeed of all social regimens) that is devoid of conflict either endemic or accidental.

And the ultimate reason for this lack is his rejection of the theory of surplus value. Just as Negri notes
that "the theory of surplus value [...] is the centre, now and always, of Marxist theory" and the key to
demonstrating the "productive materialisation" of its method8, so the deformations and deployments
of surplus value (principally as surplus value of code, and in the distinction between machinic and
human surplus value) form a critically important element in the Deleuzo-Guattarian conceptual
assemblage. The transfiguration of the theory of surplus value as surplus value of code is to be
understood as the principal mechanism of Deleuzo-Guattarian thought, one that cuts across the strata
operating as follows: "Each chain captures fragments of other chains from which it ’extracts’ a surplus
value, just as the orchid code ’attracts’ the figure of a wasp: both phenomena demonstrate the surplus
value of a code" (ATP 39/47). The ramifications of this are broad, because in Deleuze and Guattari’s
hands the concept of the surplus value of code, the capture of code fragments, gives us first, the
principal mode of understanding deterritorialisation (decoding) processes, and second, the mechanism
whereby philosophy avoids representation (the goal of a nonrepresentational thought): "the wasp in
turn deterritorialises by joining with the orchid: the capture of a fragment of the code, and not the
representation of an image." Thus no theory of surplus value = no theory of machinic surplus value =
no concept of conflict = definitive break with Deleuze and Guattari.

NOTES
1. Manuel De Landa, A Thousand Years of Nonlinear History, Swerve Editions, New York 1997, pp.
257–74. References to this text will henceforth be designated by page number in Arabic numerals.
2. Gilles Deleuze and Felix Guattari, A Thousand Plateaus: capitalism and schizophrenia, trans. Brian
Massumi, Athlone Press, London 1988, pp. 502/627. References henceforth designated by ATP
followed by page numbers.
3. Felix Guattari, La revolution moleculaire, Editions Recherches, Paris 1977, p. 315
4. Antonio Negri, Twenty Theses on Marx: Interpretation of the Class Situation Today, trans. by
Michael Hardt in Makdisi, Casarino and Karl (eds.) Marxism Beyond Marxism, Routledge, New York
1996, p. 154
5. Felix Guattari, "Capital as the Integral of Power Formations", trans. by Charles Wolfe and Sande
Cohen, in Soft Subversions, ed. by Sylvere Lotringer, Semiotext(e), New York 1996, pp. 202–24
6. Leon Trotsky, The History of the Russian Revolution, trans. by Max Eastman, Pathfinder Books,
New York 1980, p. 6
7. Fredric Jameson, "Marxism and Dualism in Deleuze", in A Deleuzian Century?, South Atlantic
Quarterly Special Issue Summer 1997, Vol. 93, No. 3, p. 395.
8. Antonio Negri, Marx Beyond Marx: Lessons on the Grundrisse, trans. by Harry Cleaver, Michael
Ryan, and Maurizio Viano, Pluto Press, London 1991, p. 160
