J. theor. Biol. (1987) 124, 111-121
Notes on Order and Complexity
G. KAMPIS AND V. CSÁNYI
Evolutionary Systems Group, Department of Behaviour Genetics,
L. Eötvös University, H-2131 Göd, Jávorka S. u. 14, Hungary
(Received 9 May 1986, and in revised form 7 August 1986)
We suggest that it is the organization of biological systems that is responsible for
the increase in their order and, through it, in their structural complexity. This increase
is shown to be nontrivial in the sense that a solution to the von Neumann complexity
paradox is proposed.
1. Introduction
We are living in a subjectively complex (i.e. "hard-to-understand") world. But it
is important to realize that this "complexity", whatever it may be, emerges
with biological existence in a double sense. It is a characteristic of biological structures and behaviour, and can be observed and recognized only by beings such as
man. Thus it is reasonable to say that complexity is a very interesting feature of
biological systems and, as Saunders & Ho (1981) put it, a major challenge to
theoretical biology.
There are many ideas about complexity in the areas of thermodynamics, information theory, automata theory and others besides biology. This diversity is best
explained by a distinction made by Bunge (1963). He speaks of ontological complexity, as the complexity of material objects "out there", on the one hand, and of
semiotic complexity, the complexity of our descriptions which refer to the objects,
on the other. It seems probable that we cannot exhaust the ontological complexity
of any object because we never know any object perfectly; and in this respect an
atom can be as complex as the Universe.
Complexity is a vague and ill-defined concept, as established by many authors
(e.g. von Neumann, 1966; Löfgren, 1977). There is a question as to whether it can
be properly defined, and thereby measured, or not (Rosen, 1977; Saunders & Ho,
1981). However, this question clearly refers to ontological complexity. Complexity
as a measure always comes with a frame of description, in which it is expressed.
Therefore, it is of a semiotic nature (Kampis & Csányi, 1985). This means that
complexity measures do not refer to the objects directly, instead, they refer to sets
of signs with which the objects are described. In fact a semiotic complexity is always
measurable since our descriptions are finite and unambiguous. We can conclude
that it is possible to speak about biological complexity in a meaningful way, i.e. it
might be measured (and not just used as a metaphor), if we could find the proper
frames of description.
© 1987 Academic Press Inc. (London) Ltd
2. Background: Concepts of Complexity
The intuitive content of complexity (and from now on, we shall speak of semiotic
complexities) is something like the intricacy of a described structure. Certainly, this
property has something to do with the numerosity of parts, with the number of
different parts, and with the various patterns of arrangements of these parts. It is
known, however, that numerosity alone is not enough: for instance, some longer
sequences can be more transparent than a given shorter sequence. There are many
attempts to define complexity more suitably (for a survey, see Löfgren, 1977); there
are, for example, definitions based on statistical properties or on graph complexity.
Perhaps the best such definition is the one by Kolmogorov (1965; introduced independently by Chaitin, 1966), because it is mathematically neutral. It is the minimal length of a program which causes a universal Turing machine to print out a given object (for example, to print out the detailed structural description of a natural object such as a concrete DNA sequence).
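As an illustration (ours, not part of the original paper), the Kolmogorov measure can be approximated from above by any real compressor: the compressed length of a string is the length of one particular program ("decompress this archive") that prints it, and so upper-bounds the true, uncomputable minimum. A minimal Python sketch:

```python
import random
import zlib

def complexity_proxy(s: bytes) -> int:
    """Compressed length: a computable upper bound on the (uncomputable)
    Kolmogorov complexity of s -- the length of one program printing s."""
    return len(zlib.compress(s, 9))

ordered = b"AB" * 500                     # a highly regular sequence
rng = random.Random(0)                    # fixed seed, for reproducibility
noisy = bytes(rng.getrandbits(8) for _ in range(1000))  # random-looking

# The regular sequence admits a far shorter description than the noisy one,
# even though both are 1000 bytes long.
```

The proxy only bounds the Kolmogorov value; no computable procedure recovers the exact minimum.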
Further, according to Löfgren (1977), we have to distinguish the complexity of
the description of an object (d-complexity) and the interpretation of an object, or
of its description (i-complexity). For the latter, an example is the sentence "The least set which is a member of itself"--if interpreted, it certainly becomes more complex than it is as a sequence of letters. Sometimes the converse is true, but the
distinction is important, as we shall see later. In short, the difference is that
d-complexity is dominated by the relationship of the observer to the object while
in i-complexity relationships of the given objects to other, external, objects also enter.
A basic property of complexity is that its amount is relative (Rosen, 1977; Löfgren,
1977; Saunders & Ho, 1981) to the viewpoint of the observer. This relativity has
two parts: both the abilities and the intentions of the observer are reflected in it.
Indeed, sometimes it is necessary to explain this background explicitly. More
formally, the relative complexity (to basis A) is the minimum length of those programs
which print out the object, if another program A is given for the same universal
machine (Löfgren, 1972).
What enables us to choose between possible viewpoints, or description frames?
We have earlier argued (Kampis & Csányi, 1985) that the relevance of particular
viewpoints determines this choice. We can give a more precise definition here. Its
basis is the direction of curiosity of the observer, who decides in advance which
objects are different in their complexity and which are not. For example, a definition
which would put living organisms in the same class as stones is nonsense: in
complexity characterization, we try to describe their differences (note that such
differences do exist). A concept of complexity may be termed adequate if it satisfies
these needs of the observer, that is, if it answers the right questions.
On the other hand, we should follow a method of objectivity: that is, we must
consider all our knowledge of the object. If we neglect some properties which,
however, enter the examined phenomenon, then complexity becomes discretional
and we can find as high or as low values of complexity as we like (complexity is
in this case "in the eye of the observer" (Ashby, 1956)). In principle, we ought to
take our whole knowledge base into account, in order to check whether the chosen
description is admitted. In practical terms, we can simply relate our present description to other established descriptions of the same object. Thus a proper concept of
complexity should be interpretable, that is, it should be linked to and checked against
what we already understand. Now we can say that a complexity characterization is
relevant if it is both adequate and interpretable.
3. Order, Organization and Complexity
After these preliminary remarks we may turn, more specifically, to the problems
of biological complexity and its possible origins. We shall argue that this question
is related to the recognition of order in living systems. First let us recall two different, related
views. Saunders & Ho (1976), although considering some kind of orderliness and
an evolutionary tendency towards increasing order (as opposed to the tendencies
described by the second law of thermodynamics) as the most apparent properties
of living organisms, argue that this order cannot itself be that which is increasing
in evolution; because this would lead to the formation of crystals and not of
organisms. Instead of order, they suggest that structural complexity (Saunders &
Ho, 1976) and "building complexity" (Saunders & Ho, 1981) increase primarily in
evolution, under the action of selection (for the refinement of this selectionist
argument see Saunders & Ho, 1981). The latter complexity term refers to the
complexity of objects, as seen by a particular observer: the builder of the system.
Because evolution manifests itself in a sequence of construction (i.e. build-up) of
new entities, the position of the builder is considered as the preferred description
frame by Saunders & Ho.
Wicken (1979) considered order as opposite to complexity, based on the observation that any order (or regularity) enables abbreviations in a description, the length
of which determines complexity. He suggested that structural complexity increases
due to thermodynamic principles and its increase enables the formation of organization.
Although these arguments are not necessarily invalid, a different approach will
be presented here. For motivation, first note that, although the choice of the builder
as the preferred observer is an attractive one, it gives rise to a serious problem. This
position implies that we have to determine the complexity of instructions which
then cause the system to behave in the given manner. To this end, it would be
necessary to know how hard it is for the system to do something. That is, this
complexity is relative to the knowledge of actual system transformations. These are,
however, generally unknown: we cannot write down the equations of motion of
biological systems that existed in the past, for instance.
Indeed, when it is said (Saunders & Ho, 1981) that the corresponding complexity
of a biological transition with probability p is given by log (1/p)--that is, the less
probable an outcome, the more complex it is--then this is not the complexity of
the instructions but that of the description of manifested actions. This description
is based on the observation of behaviour that has already taken place. Instead of
the d- and i-complexity of instructions, the d-complexity of the resulting process
was applied. There seems to be no simple way to solve this problem (see also later).
On the other hand, it might be accepted that in early prebiotic evolution it was
molecular heterogeneity generated due to thermodynamical reasons that led to an
increase in structural complexity (as suggested by Wicken); but this seems less likely
in more advanced stages of biological evolution.
But what is our original question about biological complexity? In evolution, with
the development of new species, new and more sophisticated organs and other
functional units develop. Thus one of the basic differences among organisms is in
their functional construction--their morphology in terms of working units. There
is a dramatic change in this respect, from simple molecular systems to the cell, to
multicellular organisms, and eventually to mammals and man. It might be said that
this property should be reflected, or at least taken into account, in a model for
biological complexity.
It was proposed recently (Csányi, 1981, 1982; Csányi & Kampis, 1985) that the
basic process of evolution is the construction of functional components in a network
which is essentially, but not perfectly, closed. This network was said to constitute
the organization of the system. Loosely, it is the totality of roles components play
in the production and maintenance of other components. Such a role (or function)
is defined as the interdependence of components and component-producing processes. We suggested that the intuitive concept of biological organization can be covered
by this definition, and that these terms might serve as a basis for a possible language
of biology (for a discussion, see the original sources). Note that here we have
higher-level components and descriptions (as opposed to microphysical ones). Now
there are reasons to think that such higher-level descriptions cannot be reduced to
more basic ones (Rosen, 1978; Kampis, 1986a, b). This means, we think, that
complexity has also to be determined in these terms, in order to fulfil criteria of
adequacy.
Organization is a network that determines which components are produced and
what structure they have. For example, in a cell, it is the whole biochemical network
that produces certain sequences of amino acids but not others. We shall now argue
that this immediately gives rise to an order in the system.
What is order? One possible interpretation is that of Wicken (1979), identifying
it with reducibility of complexity. On the other hand, we can speak of order, in a
somewhat more general sense, as the opposite of chaos. In his foundational book, Riedl (1979) defines order as an observed conformity to some law; the Oxford Dictionary says it is "the way in which things are placed in relation to one another". In this sense we can say: the more lawful (i.e. the more compactly describable by relating it to a law), the more ordered. However, in our opinion, this is not in contradiction to the statement that
the complexity and order of binary sequences are opposite. As is known, a sequence
is considered as complex, or "random" (Löfgren, 1977; for a partial review see Lovász,
1985) if its description cannot be shorter than the sequence itself, while it is called
ordered if it is compressible. Now the point is that the definitions of complexity
("randomness") and order are relative to a different basis in general. Structural
complexity is relative to an empty program or description. Order, as defined above,
is a complexity relative to the description of a known law (and as a special case
we can also speak of order without reference to external laws beyond the ones
recognizable from a given structural pattern alone--this yields the usual order
concept for binary sequences). Thus a given sequence can be ordered and complex
at the same time.
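The point that the two measures use different bases can be made concrete with a toy sketch (our illustration; the function name `law` and its seed are our choices): the output of a pseudo-random generator with a known seed is nearly incompressible relative to the empty basis (high structural complexity), yet relative to the "law"--the generator together with its seed--its description is only a few bytes, i.e. it is highly ordered.

```python
import random
import zlib

def law(n: int, seed: int = 42) -> bytes:
    """The 'known law': a tiny generating program."""
    rng = random.Random(seed)
    return bytes(rng.getrandbits(8) for _ in range(n))

seq = law(4096)

# Relative to the empty basis: a general-purpose compressor finds no
# structure, so the structural complexity is about the full length.
structural = len(zlib.compress(seq, 9))

# Relative to the law: the short call "law(4096, 42)" regenerates seq.
relative_to_law = len(b"law(4096, 42)")
```

The same 4096-byte sequence is thus simultaneously complex (relative to nothing) and ordered (relative to the law that produced it).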
This concept of order may be an important one. It expresses a regularity which
is not a mere coincidence or incidental state in a system; it is always committed to
essential features of objects. For instance, synergetics, the theory of physical "self-organizing" systems (Haken, 1977), uses the concept of order parameter (borrowed
from the theory of phase transitions). Order parameters are the most important
variables in synergetics. They are in close connection with the emergence of patterns
which are the result o f self-organization, and thus also in close relationship to
complexity.
In our case, a biological system consists of its components (in strictly structural
terms) and of its organization. In order to indicate our view on the relationship of
organization (in the sense of Csányi & Kampis, 1985) to order and complexity in
these systems, a very simple formalism, suited to the present discussion will be used.
Let us denote components by s_i ∈ S. The organization consists of transformations of the components, and will be denoted as a family of functions f. For convenience, let us suppose that every f takes two arguments which are components s_i and s_j; the generalization is straightforward. The value of f is, say, a code number telling us which new components (there can be more than one) are produced from the arguments.
In this framework we can define the structural complexity of a component, on the basis of the Kolmogorov definition, as the length |C| of the shortest Turing program C which prints out its structure.
Orderliness of components can be determined as a measure which indicates how determined they are with respect to the system, that is, whether their concrete structure is important in the system transformations. This is related to the ability of mappings to distinguish between different components. In principle, there can be a different f for every different component (that is, for every structure), but for structures having N elements of n types, their number can practically range from zero to n^N. Thus we can define order as the length |R| of the shortest program R that prints out the structure of a component which is functionally equivalent to a given component. More precisely, let R(s_i) be defined by the following property: f(TR(s_i), a) = f(s_i, a) for all a which belong to the domain of f (where T means the execution of the R program); but for any R' with |R'(s_i)| < |R(s_i)|, already f(TR'(s_i), a) ≠ f(s_i, a) for at least one a.
It can be seen that R, as a measure for order, generates equivalence classes over S, the set of possible components: exactly those components belong to the same class whose f-images are identical. The loose content of this measure is that an even more condensed description would not print a component structure within the same equivalence class.
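A hypothetical miniature system may make this concrete (the component set, the particular f, and the length-based proxies below are all our illustrative choices, not the paper's): the mappings induce the equivalence classes just described, and |R| falls below |C| whenever f fails to distinguish two structures.

```python
# Components are strings; the single transformation f sees only lengths
# modulo 3, so it cannot distinguish every structural detail.
S = ["a", "bb", "ccc", "dddd", "e"]

def f(x: str, y: str) -> int:
    return (len(x) + len(y)) % 3          # toy "production code"

def f_image(s: str) -> tuple:
    """The functional role of s: its f-value against every partner in S."""
    return tuple(f(s, a) for a in S)

# Components with identical f-images are functionally equivalent.
classes = {}
for s in S:
    classes.setdefault(f_image(s), []).append(s)

def C(s: str) -> int:
    return len(s)                          # structural-complexity proxy |C|

def R(s: str) -> int:
    """Order proxy |R|: the shortest structure playing the same role as s."""
    return min(C(m) for m in classes[f_image(s)])

# "dddd" and "a" play the same role, so R("dddd") < C("dddd"): most of
# the structural complexity of "dddd" is not expressed in order.
```

In the fully organized limit discussed below, every class shrinks to one member and R coincides with C.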
This concept can be trivially generalized from components to sets of components,
to encompass the description of the whole system: the structure of all components.
With this, we can distinguish organized and organizationless states of the system.
Organization was said to be the network of individual functions, which come from
the interdependence of components and processes. This does not exist trivially.
Thus a completely organizationless system has no f mappings at all, and consequently
no order. At the other extreme, a system with given components is the most organized
if every component has some specific function--obviously the network has the most
connections in this case. At this point every component forms a one-member
equivalence class, therefore there cannot be any shorter description than the detailed
structure of the component. This means R = C, i.e. in the most organized state the
whole structural complexity is expressed in order.
In summary, some part of, or, as a limit point, the whole of structural complexity
is determined by the organization. We think that the possible validity of the presented arguments is not affected by the simplicity of the formalism. In this way, organization
provides a linkage between order and complexity (see also Csányi, 1986). This
relationship is very tight, in our view, as the following may indicate.
It was proposed that the basic tendency of evolution is that the system becomes
more and more organized (Csányi, 1981; Csányi & Kampis, 1985). This process is
autonomous, so that the (not perfectly cycling) organization produces some new
components which may or may not then become members of the already established
organization (for a computer simulation, see Kampis & Csfinyi, 1986). The organization becomes more and more intricate, by the addition of new components.
However, this requires more control and more organization in order to maintain
the viability of the system (cf. Gardner & Ashby, 1970; Wicken, 1979), and this
necessarily results in an increase of order. But order cannot increase beyond the limit
of structural complexity and, in our view, this is what results in the increase in the
latter: in order to become more organized, the system has to become more ordered,
and in order to become more ordered, after a point it has to become more structurally
complex.
Therefore, structural complexity seems to be secondary to structural order (i.e.
the complexity of component structure relative to its participation in organization).
The picture sketched so far suggests that there is indeed an increase in both order
and structural complexity during evolution. However, there are statements (Löfgren, 1977; see also the debate in Kampis & Csányi, 1985) that complexity cannot increase
in any sequence of system transformations. We turn now to the question: is the
increase of order and complexity real, or only illusory as these statements imply?
In other words: is the present conception interpretable or not?
4. The von Neumann Complexity Problem
It was John von Neumann who first formulated the question: Is evolution possible
in a formal system? To this end he introduced and studied cellular automata as
possible models of biological reproduction, growth, and evolution (von Neumann,
1966; Burks, 1971; see also Langton, 1984). The terminology introduced there will
be used throughout this section.
As a criterion for evolution, von Neumann considered an increase of some kind of complexity in the system. Although von Neumann never defined complexity (as Löfgren (1977) points out), he supposed that it decreases in general if an automaton
produces (constructs) another automaton in a cellular space or in the outside world.
The reason is that the first automaton has to contain some description or blueprint
of the second. In fact, all known machines produce simpler gadgets than they
themselves are. It was von Neumann's conjecture that there is a cut point below
which complexity decreases but above which it can also increase. This point is,
von Neumann said, arrived at with the emergence of reproduction. Living creatures are
reproductive machines and they are capable of evolution--which is certainly not
degenerative in complexity if the term has any meaning. Motivated by this and by
other presumptions, von Neumann designed a computation- and construction-universal cellular space and gave a constructive proof of the existence of self-reproducing machines in this space.
However, the conjecture on complexity seems, on closer investigation, not well-founded. If the constructor machine has to contain a description of the constructed one, we have (in its simplest form) the following problem. Let us denote the minimal description of the constructed configuration p by D(p). Then the shortest description of the constructor, whose configuration consists of the D(p) description (which it has to contain) and of some additional part E, takes the form D(E + D(p)) = D(E') + D(p) for some E'. Now |D(E') + D(p)| ≥ |D(p)| because D(D(p)) is already D(p); remember that D(p) was a minimal description and thus it cannot be further shortened. Therefore, the constructor is at least as complex as the constructed automaton, and their complexity is equal only in the special case of self-reproduction, where D(p) becomes the description of the whole constructor.
That is, the relative complexity of E to D(p) is zero in this case. This is sometimes
thus interpreted so that reproduction is not a cut point in complexity but is, rather,
a limit point (Löfgren, 1977); that is, complexity cannot increase but it can be, at
best, preserved.
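The chain of the argument can be restated compactly (our display, using the notation introduced above):

```latex
D\bigl(E + D(p)\bigr) = D(E') + D(p) \quad\text{for some } E',
\qquad
\bigl|\,D(E') + D(p)\,\bigr| \;\ge\; \bigl|\,D(p)\,\bigr|,
```

with equality exactly in the self-reproducing case, where the complexity of E relative to D(p) vanishes.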
It should be noted that, if this argument is correct for real systems, there is no
chance that a reproductive machine is formed during the transformations of the
system; except for the case when it is put into the system at the outset or if it is the
end product of some complexity-degenerating process. What, then, is evolution?
It is known (Myhill, 1964; Löfgren, 1972) that there is a tricky possibility of some
"increase" in complexity within a formal system like a cellular automaton. It is
understood in the special sense that certain reproductive machines can, besides
reproduction, print out theorems of another formal system; and further, their
descendants will print out more true theorems. It would be convenient to think that
here we have an example of emergence, a prerequisite of evolution: in a completely
understood system "new properties" appear. It is worthwhile to note that the printed
theorems need not be trivial or known at all. Therefore, here we are justified in
seeing an irreducible increase of complexity towards us: the set of theorems becomes
more and more complex.
Examining this situation, however, one can observe that this increase is completely
subjective. Of course, it is not the novelty of the new theorems that is questioned; rather, the whole situation, the relationship of the observer to the observed, is queried. What we have,
precisely, is a machine which, besides reproducing (or printing out) its configuration
in the cellular space, prints out some other configurations which will be then
interpreted as a text stating some theorems. An "increasing complexity" is assigned
to these configurations only because we interpret them by an external, discretional
procedure. This is a referential act, and yields i-complexity; while d-complexity is
always nonreferential (Löfgren, 1977; Kampis, 1986b). But if we want to speak
about complexity (either in evolution or otherwise) we have to consider descriptions
of the system, and with it, d-complexity. (Let us recall Stanislaw Lem's Cyberiad,
where an "information burglar" reads out messages from thermal motion--we have
the same situation here with the interpretation of configurations.) It can be easily
verified (details of which are not given here) that the d-complexity of the sequence
of these configurations does not increase. In summary, we have the unsatisfactory
and paradoxical situation at the moment that there seems to be no way in which
complexity could increase.
We have already noted that semiotic complexity can be misleading if the validity
of the given description frame is not carefully checked. Now we shall argue that
the quoted statements on abstract automata (by yon Neumann and others) are not
valid for real machines such as biological entities. That is, the semiotic complexity
of the former is not interpretable for real objects. Therefore, the description of the
latter is free from the discussed paradox and enables the nontrivial increase of complexity.
We shall try to point out that there is a serious flaw in the quoted arguments that
led to a conclusion that any constructor is at least as complex as its products. Our
statement turns around the observation that the process of interpretation of a
description D1 (such as the process of construction that uses a description of the object being constructed) and the description D2 of this very process are different (cf. Löfgren (1986) in a different context). Therefore, it becomes questionable whether we can ensure the identity of D1 and D2, and hence of the complexities |D1| and |D2|, when referring to the same object. In particular, it is not obvious whether
abstract automata constructors and real constructors are the same in this respect.
Let us first examine D1. It was stated that D1(p) = D(p) is the minimal description of the constructed configuration p, and that it is contained in the constructor. However,
this minimal description cannot equal C(p), the minimal structural description of
the configuration (i.e. the description relative to an empty program). This is so
because the constructor or, in general, the subsystem in which the D(p) description
is interpreted, "uses" the transition function of the system (as also recognized by
Pattee (1977)). D(p) must indicate how a given configuration can be built in the
simplest possible way in the given system--that is, D(p) is relative to the invariant
transition function of the system. It is important to note that, for a minimal
description, we cannot choose anything other than this. Whatever measure we
propose, by not taking this relativity into account, it can happen that the description
that is relative to the system is still shorter for a given configuration.
Let us now denote this description of the system (that is, of the invariant transition
function) by C1. Now, confronting C(p) with D(p), which is relative to C1, we can
first recognize that the criterion that the constructor has to contain D(p) of the
constructed configuration p is empty in terms of C. This means that C(e) of the
constructor e can be any small value compared to C(p) of the product; the only
criterion is that the first has to produce the second. Indeed, there are simple cellular
automata configurations which can construct quite intricate patterns (Wolfram,
1984). Rather, the relationship is the converse: C(e) has implications for D(p). If
we observe that one structure constructs another, we can deduce that D(p) of the
product cannot be greater than C(e) of the constructor (which might be more easily
determined).
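The Wolfram (1984) observation is easy to reproduce (the following sketch uses elementary Rule 30, an illustrative choice of ours, not a system from the paper): a constructor whose structural description is a single live cell builds rows far more intricate than itself.

```python
def step(cells, rule=30):
    """One synchronous update of an elementary CA with periodic boundary."""
    n = len(cells)
    return [(rule >> ((cells[(i - 1) % n] << 2)
                      | (cells[i] << 1)
                      | cells[(i + 1) % n])) & 1
            for i in range(n)]

# The "constructor": a minimal seed configuration -- one live cell.
cells = [0] * 41
cells[20] = 1

history = [cells]
for _ in range(18):            # 18 steps fit in width 41 without wrap-around
    cells = step(cells)
    history.append(cells)

# The constructed rows are already far less regular than the seed.
```

Here C(e) of the seed is tiny, yet the produced configurations look intricate; only the bound D(p) ≤ C(e), relative to the transition function, survives.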
To continue the examination of description of the construction process, our
original question now appears in the form: Which of the above complexities is real
(interpretable, in the discussed sense) in a biological, or other natural, system?
Complexity certainly reflects what we know or, more precisely, what we can know
about a system. This ensures objectivity (that is, nondiscretionality). Therefore, the
question in our case is whether we can know C1 in a real system. In the automata
context we clearly know it because we have designed (constructed, sic!) the system
and to this end we must possess its description. If we know C1 then there cannot
be any increase in the d-complexity of the system or of its configurations. However,
and this is the main point, if we cannot know it for some reason, we lose the basis
of relative description D(p), and thus it cannot be ensured that the d-complexity
D2 of a configuration p be as low as D(p), from which the constructor itself
works.
In this question we can recognize the problem of reductionism in some form: it
depends on whether it is possible to infer the laws of a system from the structural
description of its isolated objects. Namely, in a real system it is the description of
the constructor (description of its present structure) that can be determined immediately, and from this we would have to infer C1. However, in all probability this is not possible (Rosen, 1978; for additional aspects see Kampis, 1986b). We find that
the structural complexity C is objective in general, in the sense that there is no
effective way to reduce it by some inferred knowledge of the system transformations
before they actually occur. This means that D2 = C and therefore complexity can
increase nontrivially--there is no formal obstacle to the increase of pure descriptional
complexity in real systems.
Because in general we do not know C1 for a real system, we cannot mention it
when describing the system; it has to be replaced by instantaneous organization,
which is no longer invariant but changes during subsequent constructions.
Our result is that the increase in structural d-complexity of a natural system and,
together with this, in its order (which is relative to the instantaneous organization),
is not illusory. Finally, however, it might be interesting to have a closer look at the
differences between cellular automata and natural systems.
In cellular automata (and in any equivalent formal system) the determination of
new configurations requires second-order predicates in general (Yasuhara, 1971).
Loosely this means that the transformations which are to be applied to the present
configuration may depend on it as an argument. The new configuration can be
determined by the (effective) evaluation of these expressions, which is possible only
if we can enumerate them. This (effective) enumeration tells us, among other facts,
which transformation belongs to which configuration.
In formal systems this enumerability is ensured by a Gödel numbering which assigns a positive integer to every logical expression. In this enumeration it is C1
that is specified. When describing natural systems, however, it becomes known only
via the subsequent iterations of the construction process.
It was von Neumann who first introduced a formal analogue of constructing and reproducing natural systems, and the previous discussion implies that in this analogue the natural configurations and transformations are enumerated in a way similar to a Gödel numbering, so that they can constitute a formal system. This enumeration, which maps real constructors to automata constructors, can therefore be called a von Neumann numbering. It is this operation that in cellular automata obscures the otherwise existing nontrivial complexity changes, possibly increases, in natural systems. In our view, this is a basic source of the inequality of von Neumann's formal system and the suitable descriptions of biological systems, and it is also this which renders complexity increase in the latter interpretable.
Another important observation is that in descriptions of natural constructors, unlike in descriptions of cellular automata, the subsequent configurations are not derivable in one axiom system (from the "axioms" of C1). Instead, we can consider the new configurations as corresponding to new axioms which lead to the next configuration. This might shed some light on the concept of emergence, which might now appear as a possibility for consecutive changes in an axiom system that is not interpreted externally (cf. Myhill's automaton). Instead, it determines the state changes of the system itself. Such axiom systems might exist nontrivially because we can gain knowledge of the closure of the iterative process of real constructions only through this very iteration. Exploration of these properties is left for future papers.
REFERENCES
ASHBY, W. R. (1956). An Introduction to Cybernetics. London: Chapman and Hall.
BUNGE, M. (1963). The Myth of Simplicity. Englewood Cliffs, NJ: Prentice-Hall.
BURKS, A. W. (ed.) (1971). Essays on Cellular Automata. Urbana: University of Illinois Press.
CHAITIN, G. J. (1966). J. ACM 13, 547.
CSÁNYI, V. (1981). General Systems XXVI, 73.
CSÁNYI, V. (1982). General Theory of Evolution. Budapest: Hungarian Academy of Sciences.
CSÁNYI, V. (1986). Evolutionary Systems (in press).
CSÁNYI, V. & KAMPIS, G. (1985). J. theor. Biol. 114, 303.
GARDNER, M. & ASHBY, W. R. (1970). Nature 228, 784.
HAKEN, H. (1977). Synergetics: An Introduction. Berlin: Springer.
KAMPIS, G. (1986a). In: Cybernetics and Systems '86 (Trappl, R. ed.). pp. 36-42. Dordrecht: Reidel.
KAMPIS, G. (1986b). Some Problems of System Descriptions I-II. Int. J. Gen. Systems (submitted).
KAMPIS, G. & CSÁNYI, V. (1985). J. theor. Biol. 115, 467.
KAMPIS, G. & CSÁNYI, V. (1986). J. infer. deduct. Biol. (to appear).
KOLMOGOROV, A. N. (1965). Probl. Inf. Transm. 1, 1.
LANGTON, C. G. (1984). Physica 10D, 135.
LÖFGREN, L. (1972). In: Trends in General Systems Theory (Klir, G. J. ed.). pp. 340-407. New York: Wiley.
LÖFGREN, L. (1977). Int. J. Gen. Systems 3, 197.
LÖFGREN, L. (1986). Bull. Math. Biophys. 30, 415.
LOVÁSZ, L. (1985). The Mathematical Notion of Complexity. Inst. für Ökonometrie u. Operations Research preprint, University of Bonn.
MYHILL, J. (1964). In: Views on General Systems Theory (Mesarovic, M. ed.). pp. 106-118. New York:
Wiley.
NEUMANN, J. VON (1966). Theory of Self-Reproducing Automata (compiled by Burks, A. W.). Urbana:
University of Illinois Press.
PATTEE, H. H. (1977). Int. J. Gen. Systems 3, 259.
RIEDL, R. (1979). Order in Living Organisms. Dordrecht: Reidel.
ROSEN, R. (1977). Int. J. Gen. Systems 3, 227.
ROSEN, R. (1978). Fundamentals of Measurement and Representation of Natural Systems. New York:
North-Holland.
SAUNDERS, P. T. & HO, M. W. (1976). J. theor. Biol. 63, 375.
SAUNDERS, P. T. & HO, M. W. (1981). J. theor. Biol. 90, 515.
WICKEN, J. S. (1979). J. theor. Biol. 77, 349.
WOLFRAM, S. (1984). Nature 311, 419.
YASUHARA, A. (1971). Recursive Function Theory and Logic. New York: Academic Press.