The user of the computer need never know what is inside. Two computers
can show identical behavior, but contain entirely different physical
parts. A common technique in the computer industry is to simulate
on an existing computer the logical behavior of a new machine that
has yet to be built, possibly using radically different constructional
principles. The user of a computer sees it, not as a collection
of electronic components, but as a logical network to be manipulated
by him. The useful properties of the network arise not so much from
the components themselves as from the way that they are arranged
and interconnected. This is what we mean by complexity, and in a
sense it is what entropy measures.
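
To make the simulation technique concrete, here is a minimal sketch in Python of how an existing computer can imitate the logical behavior of a machine that does not yet physically exist. The three-instruction machine, its opcode names, and the sample program are all hypothetical, invented only for this illustration; real instruction-set simulators follow the same fetch-decode-execute loop at far greater scale.

    # Minimal simulator for a hypothetical machine with three instructions:
    #   LOAD value  -> put a constant into the accumulator
    #   ADD  value  -> add a constant to the accumulator
    #   PRINT       -> write the accumulator to output
    # The "new machine" exists only as this table of behaviors; the host
    # computer reproduces its logic without sharing any of its hardware.

    def run(program):
        accumulator = 0              # the imagined machine's single register
        pc = 0                       # program counter
        while pc < len(program):
            op, *args = program[pc]  # fetch and decode
            if op == "LOAD":
                accumulator = args[0]
            elif op == "ADD":
                accumulator += args[0]
            elif op == "PRINT":
                print(accumulator)
            else:
                raise ValueError(f"unknown opcode: {op}")
            pc += 1                  # advance to the next instruction

    # A sample program for the imagined machine: compute 2 + 3 and print it.
    run([("LOAD", 2), ("ADD", 3), ("PRINT",)])

Run as written, this prints 5; the physical makeup of the host computer is irrelevant to the result, which is exactly the point.
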
Our experience with computers in the past three decades has led
to the realization of an extremely significant principle: The
behavior and properties of any organized system arise not only from
its parts, but also from the manner in which they are arranged.
The whole is greater than the sum of its parts. The more complex
a system becomes, the more its behavior depends on its state of
organization rather than on the nature of its individual
parts. A computer can be thought of as a childishly
simple and primitive model of a living organism: not because the
computer is "alive" in any sense, but because it illustrates at
a low level the importance of integration and organization in determining
the behavior of any complex system.
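
The principle lends itself to a small demonstration. The sketch below, in Python, builds two circuits out of one and the same component, a NAND gate; the gate itself is trivial, yet different arrangements of identical copies produce entirely different behaviors. The wiring and function names are choices made for this illustration, not any standard design.

    # The same part, arranged differently, yields different behavior.
    # The only component here is a NAND gate; everything else is wiring.

    def nand(a, b):
        return int(not (a and b))

    # Arrangement 1: one NAND gate with its inputs tied together acts as NOT.
    def not_gate(a):
        return nand(a, a)

    # Arrangement 2: four identical NAND gates, wired differently,
    # act together as XOR, a behavior no single gate possesses.
    def xor_gate(a, b):
        c = nand(a, b)
        return nand(nand(a, c), nand(b, c))

    for a in (0, 1):
        for b in (0, 1):
            print(a, b, "-> NOT a:", not_gate(a), " a XOR b:", xor_gate(a, b))

The parts list never changes; only the interconnections do, and with them the behavior of the whole.
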