Biocomplexity at all Levels of Biological Organisation
This theme concentrates on complexity and organisational scale. It concerns the way biological systems form as nested hierarchies of structure, and the consequences of that. We all know the basic pattern: molecules, molecular interaction networks, cellular systems, cells, multicellular organisms with tissues, colonies, communities and the ecosystem. Which parts of that are genuine natural entities and which are just the way we have learned, perhaps through convention, to think about biological structure? Is there something fundamental about nested hierarchy? If so, is it just because large organisational structures can only form that way, or is there real function to the structuring (i.e. is more life-enhancing activity possible with this structure than without it)? Is there a continuity from molecule to ecosystem, or are biological levels real, discontinuous and distinct? Anyway, what exactly is complexity and what does it have to do with hierarchical structure?
Complexity and Emergence
In the reductionist scientific paradigm, we believe that any supposed level of biological organisation (e.g. an organism) can be explained entirely in terms of interactions among its component parts. That is, we explain all biology, including the behaviour of dogs, the distribution of species on land and the anatomy of brains, in terms of molecules and their chemical interactions (and can of course further explain those with physics). Some phenomena, however, especially in biology, are (even in principle) inexplicable from a knowledge of only the component parts, so reductionism is at best an incomplete answer.
This leads directly to the idea of emergence - the appearance of phenomena from the organisational structure acting as a whole. Emergence is central to most scientific definitions of complexity, and because we can now independently define emergence, circular arguments using both can be avoided. Emergence is easier to explain because it is a phenomenon with particular identifiable properties, whilst complexity is an attempt to identify the sort of system in which self-organisation spontaneously appears from among the interactions of component parts. Frankly, as an idea, complexity still has many definitions and different meanings for different branches of science. Happily, this diversity of meaning has been catalogued and reviewed by Ladyman, Lambert and Wiesner (2013), among others, and their work is elaborated in the (2020) book "What is a Complex System?". Unhappily, that work is far from comprehensive and, to date, scientific articles about complexity usually start by saying "there is no consensus about the definition of complexity".
We will note in passing that there are several well-established statistical metrics of complexity, such as Kolmogorov complexity (a generally incomputable measure of information compressibility), its algorithmic information theory 'children' including its approximation (Lempel-Ziv), Gell-Mann's 'effective complexity' and related Shannon-entropy-based metrics such as mutual information (among component states), Kullback-Leibler divergence, the Bertschinger and Olbrich (2006) measure of information closure, integrated information theory's Phi (Oizumi et al. 2014) and offshoots such as logical depth and thermodynamic depth (some of these are compared in application by Albantakis (2021)). In other words, lots of complicated maths to quantify, essentially, the amount of coherent pattern forming from the interaction of component parts. Notice I say complicated maths, not complex - we need to distinguish between the two.
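To make one of these metrics concrete, here is a minimal sketch (in Python, purely for illustration) of the Lempel-Ziv idea, using a general-purpose compressor as a crude stand-in for incomputable Kolmogorov complexity: a regular pattern compresses well, random noise does not.

```python
import random
import zlib

def compressibility(data: bytes) -> float:
    """Compressed size / original size: a crude stand-in for
    Kolmogorov complexity (lower = more regular, more compressible)."""
    return len(zlib.compress(data, 9)) / len(data)

ordered = b"AB" * 500   # 1000 bytes of highly regular pattern
random.seed(0)
jumbled = bytes(random.getrandbits(8) for _ in range(1000))  # incompressible noise

print(compressibility(ordered))   # small: the pattern is captured by a short rule
print(compressibility(jumbled))   # near (or above) 1: no shorter description exists
```

Note the well-known caveat: this measure assigns its maximum to pure randomness, which is part of what Gell-Mann's effective complexity was designed to correct.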
Complicated things may have many components which interact in many ways, all at the same time, but can be fully described (and therefore understood) through a strictly reductionist analysis: breaking the components and interactions down into a set of fundamental pieces and rules for interaction. A good approach with merely complicated systems is to abstract them as a hierarchy of organisationally nested levels. Take for example a car: engine, transmission and suspension, braking system, electrics and bodywork; each can be further broken down, e.g. the disc brake into disc, pad, piston, etc., and each of these into components with a specific shape and material, all interacting in ways that are set by that shape and material (the component's form). Most scientifically based medicine treats the human body that way too. The reason it works is that the behaviour of higher levels in the notional hierarchy is fully determined by the form and behaviour of their component parts.
Complex things are necessarily dynamic (they can change in time) and embody information in their structure that influences the dynamics. But the defining characteristic is that new properties emerge from their internal organisation - properties that could not, even in principle, be predicted or understood solely in terms of their component parts. These properties must (like all properties) derive from embodied information - the problem with complex systems is that some of the information is 'hidden' in the whole-system-level pattern of interactions. That is why the metrics mentioned above all attempt to quantify aspects of pattern (information) at the system level.
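The point about whole-system pattern can be made concrete with the simplest of those metrics. In this minimal sketch (an illustration, not any published formulation), two binary components that always agree share one bit of mutual information - pattern that exists only at the level of the pair, since each component alone is just a fair coin:

```python
import numpy as np

def mutual_information(joint: np.ndarray) -> float:
    """I(X;Y) in bits, computed from a joint probability table p(x, y)."""
    px = joint.sum(axis=1, keepdims=True)   # marginal of component X
    py = joint.sum(axis=0, keepdims=True)   # marginal of component Y
    nz = joint > 0                          # avoid log(0) terms
    return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())

# Two binary components that always agree: coherent system-level pattern.
coupled = np.array([[0.5, 0.0],
                    [0.0, 0.5]])
# Identical marginals, but statistically independent parts: no shared pattern.
independent = np.array([[0.25, 0.25],
                        [0.25, 0.25]])

print(mutual_information(coupled))      # 1.0 bit
print(mutual_information(independent))  # 0.0 bits
```

Each component's marginal distribution is identical in the two cases; only the system-level pattern differs - exactly the 'hidden' information just described.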
We can classify things as members of the set of complex systems or of the set of complicated systems, and identify the kinds of order they map to: spontaneous order, fabricated order, or no order at all (a disordered jumble). Complex systems map to spontaneous order and complicated ones to fabricated order, though the merely complicated can also be a disordered jumble. Significantly, only life belongs to both the complex and complicated sets, and accordingly it can generate both spontaneous and fabricated order (examples below). It does so through its substantial amounts of embodied information.
Spontaneous Order - Rayleigh-Bénard turbulence illustrated by Susanne Horn |
Fabricated Order - The Forth Bridge (credit unknown) |
Disorder - a rubbish pile photographed by Stefan Czapski, courtesy Geograph. CC-BY-SA-2.0 |
Defining Complexity Further
"Let us go back to the original Latin word complexus, which signifies 'entwined', 'twisted together'. This may be interpreted in the following way: in order to have a complex you need two or more components, which are joined in such a way that it is difficult to separate them. Similarly, the Oxford Dictionary defines something as 'complex' if it is "made of (usually several) closely connected parts". Here we find the basic duality between parts which are at the same time distinct and connected. Intuitively then, a system would be more complex if more parts could be distinguished, and if more connections between them existed." (Heylighen, 1996)
This idea is especially relevant to living systems, which are readily interpreted as assemblies of different parts interacting through connections, many of which represent mutual dependencies, collectively making up a functioning whole. This applies equally well across the whole range of levels in biological organisation: from interactions among molecules, up to interactions between living processes and the non-living earth-systems for which the Gaia hypothesis is a potential explanation.
In my own view, there are two aspects of complexity. One is the 'richness' of relationships (entwinement): the inter-connectedness of the component parts, supremely elaborated in the brain, of course, but also in ecological communities: it is what Darwin referred to as a "tangled bank". The other aspect of complexity is the number and variety of different kinds of components that, so entwined, make up the whole. This number and variety is measured by diversity and in the biological context, that is of course biodiversity. That is why biodiversity has been given a theme within this project, and we should note that it is not just the number of species, but the number of all system components, including for example genes. In our interpretation it also quantifies the extent to which component parts are different and it includes the variety of interconnections as well. All these aspects can be quantified in terms of different kinds of entropy and as a consequence as information. To put it simply, any physical system is made up of components and the way they are interconnected - these are both quantifiable in terms of information because underlying both is pattern in configuration. The pattern may be called structure and it is this we look at next.
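The entropy view of diversity mentioned here is standard practice in ecology: the Shannon diversity index treats relative abundances as probabilities. A minimal sketch (the abundance numbers are invented for illustration):

```python
import math

def shannon_diversity(counts):
    """Shannon index H' = -sum(p_i * ln p_i) over species abundances."""
    total = sum(counts)
    return -sum((n / total) * math.log(n / total) for n in counts if n > 0)

# Hypothetical four-species communities with the same species count:
even   = [25, 25, 25, 25]   # evenly abundant
skewed = [97, 1, 1, 1]      # one dominant species

print(shannon_diversity(even))    # ln(4) ~ 1.386: maximal for four species
print(shannon_diversity(skewed))  # much lower: little effective diversity
```

The same formula applies unchanged whether the 'components' are species, genes or kinds of interconnection, which is one reason entropy can serve as the common currency suggested above.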
Nested levels of phenomena and emergence
We can start with the idea that everything that happens in the universe does so because patterns in the spatiotemporal configuration of matter and energy impose particular constraints on the general laws of physical forces (the idea presented on this page). So pattern really matters. Remember, a pattern is information (constraint to a single instantiation from all the ways a set of components could be configured). Given a set of patterns, it is possible to make a larger scale pattern from them. Such patterns of patterns, like the mosaic artwork of Anna Halm, show us how their arrangement can produce apparently novel effects at the larger scale that could never be identified from the smaller scale. This is because the arrangement at the larger scale embodies information that is not embodied by the component patterns. The larger scale information is itself a pattern, of course. This is very distinct from the way a jigsaw puzzle works. For that, the larger pattern is the result of particular formal constraints imposed by the form of the component parts. All the information needed to compose the larger scale pattern is already embodied by the individual parts. But in the art mosaic, formal - particular - constraints are additionally embodied at the scale of the larger pattern. That is the simple reason why strict reductionist science does not tell the whole story.
Because a large scale pattern can always be broken down into variations at the small scale (e.g., any shape can be digitised, or spectrally decomposed with a Fourier transform), the mosaic picture is not, in itself, an example of emergence. Emergence refers to properties and behaviours, not patterns. The main indication of emergence is the appearance of properties and behaviours that could not even be conceived of using lower-level descriptions. For that reason, an emergent-level "effective theory" (one that explains phenomena at that level) is needed (Ellis, 2020), in which case the properties - and the levels attributed to them - are considered irreducible. Biological systems present to us as a hierarchy of irreducible levels, as Polanyi (1968) explained.
In Farnsworth (2022), I gave an account of how properties can emerge as irreducible phenomena of larger scale patterns (see blogpost). The idea is explained and illustrated with the example of the ATP synthase complex (that marvellous little dynamo sitting in mitochondrial membranes). I also showed mathematically why true emergence and irreducibility are inevitable consequences of certain kinds of patterns of patterns. The most clearly qualifying arrangements are those that we only find in living systems. In more formal language, the properties of a biological system do not supervene on the properties of its component parts. This contradicts Kim's famous refutation of emergence. The crucial point was that patterns among real physical objects (such as atoms) are not merely geometric: they have associated physical force-fields and thereby exert efficient causes. As the ATP synthase example illustrates, efficient causes, properly attributed to the higher level L, can exert their effect on their lower-level (L-n) components. That, of course, is the same as downward causation.
Emergence and downward causation
In general, we can attribute to any assembly (A) at level L: A(L), a set of potential efficient causes G(L), i.e. those possible effects that the assembly's force field may have on any other assembly. Whenever A(L) interacts with any other assembly B(M) of level M, the set of efficient causes that can actually occur is selected from the mutual interaction of their forcefield patterns (since efficient cause is a relational phenomenon). However, there is in this no restriction that M must be equal to L. If it is equal, we would call the interaction between the two assemblies `same-level' causation; if M>L, we would say upward causation and if M<L, it would be downward causation.
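The level bookkeeping in this definition reduces to a simple comparison, sketched here only to fix the convention (the function name is, of course, invented):

```python
def causation_type(level_l: int, level_m: int) -> str:
    """Classify the causal relation when assembly A(L) acts on assembly B(M)."""
    if level_m == level_l:
        return "same-level"
    return "upward" if level_m > level_l else "downward"

# e.g. a multicellular pattern (level 2) acting on its component cells (level 1):
print(causation_type(2, 1))  # downward
```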
The development of a multicellular organism from its embryo is a good biological example.
The Turing mechanism for pattern formation (important in morphogenesis and body-plan establishment) represents symmetry breaking at an emergent higher level by self-organisation of cells. What emerges from it is the higher-level pattern, but this is not an emergent property in the sense used above unless it informs efficient causation. In embryonic development, cells are predisposed to differentiate into one of several possible cell-types, the destination being determined by gene regulation that is controlled by extracellular signals generated by neighbouring cells. The Turing dynamics of these chemical signals, operating at the multi-cellular level, create a higher-level pattern in their concentration that results in the emerging body-plan of the organism (Landge et al., 2020). In this case the higher-level pattern is clearly causal, since it stimulates the cell differentiation, and that is the emergent property in this case. The causation informed by the higher-level pattern of chemical signals acts upon the lower-level systems (cells) from which it derives, jointly with their relative locations in space: it is a case of downward causation.
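The symmetry breaking itself can be demonstrated with a linear stability sketch. The Jacobian and diffusion coefficients below are invented to satisfy the classic Turing conditions (they are not from Landge's or any other specific biological model): the homogeneous state is stable when well mixed, but a band of spatial wavenumbers grows once the inhibitor diffuses much faster than the activator.

```python
import numpy as np

# Illustrative activator-inhibitor linearisation about the homogeneous state;
# values chosen to satisfy the Turing conditions, not fitted to any embryo.
J = np.array([[1.0, -1.0],
              [2.0, -1.5]])    # local reaction Jacobian
D = np.diag([1.0, 20.0])       # inhibitor diffuses 20x faster than activator

def growth_rate(q: float) -> float:
    """Largest real part of the eigenvalues of J - q^2 D at wavenumber q."""
    return max(np.linalg.eigvals(J - q**2 * D).real)

qs = np.linspace(0.0, 2.0, 400)
rates = [growth_rate(q) for q in qs]

print(growth_rate(0.0))  # negative: no pattern without diffusion
print(max(rates) > 0)    # True: some wavelengths grow, breaking the symmetry
```

The perturbations that grow have a characteristic wavelength, which is what sets the spacing of the emerging body-plan pattern.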
What is really happening in downward causation is that the forcefield generated by a part (e.g., a constituent molecule) interacts with the overall pattern of the combined forcefield of the whole (including its own) in a way that produces an energy-minimising pattern for the part (within the whole) that is different from what it would be if the part were isolated. The mechanism of ATP synthase illustrates this perfectly.
Structure and Identity
By structure, we particularly mean the way the components are connected together by causal relationships (links). More specifically, a structure is an ordered set of material components in which the order (the placement of each in relation to the others) creates a whole (see 'The Ma of Ecology' for further explanation). But what, then, is a whole? One very strong answer is the idea of a 'Kantian whole' - named after the philosopher Immanuel Kant by Stuart Kauffman to represent a system in which "all the parts exist for and as a consequence of the whole". In other words, a system with closure to efficient causation, as Rosen and followers, Hofmeyr and others, would describe it: a system whose components are made by the system, which in turn exists because of the functioning of those same components it made (see Circular Causation for further explanation of this). In this definition, a whole and therefore a structure (which may or may not be complex) is its own cause (see Causal Closure). Such a system has an identity and functions, and with those come the teleological (meaningful, purposeful) attributes that we can only legitimately attach to living things.
Summary
This theme contains the central application of our thinking: to explain how life as a general phenomenon, independent of the scale at which we look at it, is a kind of information processing. In recognising that life’s information is embodied as functional complexity, and using information theory to understand how this naturally builds a hierarchy of functional levels (see here) in which each of life’s defining phenomena plays out, we emphasise the unity over scales and through time. The rules generating complexity apply continuously from atoms to whole ecosystems and integrate life with the wider universe of information dynamics. In a fundamental sense, found through an information-theory perspective, life as a whole is seen to be a single process, much elaborated by its inherent complexity. But complexity is not a vague description of diversity and intricacy: it has a formal and functional definition which supports a deep and robust theory of life and its place in the wider universe.
The core idea here is that:
1) embodied information constrains the action of physical forces among ensembles of interacting components (for example a lot of atoms, or biological cells);
2) the constraint of forces makes efficient causes;
3) when causes are organised (by information embodied in the structure of an ensemble) into functional sets of interactions, the ensemble can become a complex system;
4) a set of complex systems can act as the components of higher-level complex systems, creating a hierarchy;
5) because all that is determined by information embodied in a nested hierarchy of organisational levels, we should be able to quantify biocomplexity and its functions in terms of information.
This theme gathers efforts to do exactly that.
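A starting point for item 5: taking pattern to be constraint (selection of fewer configurations from all the ways a set of components could be arranged), the information embodied by a structure can be counted directly. A toy sketch, assuming equally likely configurations:

```python
import math

def embodied_information(total_configs: int, allowed_configs: int) -> float:
    """Bits embodied by a constraint narrowing `total_configs` equally
    likely configurations down to `allowed_configs`."""
    return math.log2(total_configs) - math.log2(allowed_configs)

# Ten two-state components have 2**10 possible configurations.
# A crystal-like constraint pinning them to a single configuration embodies
# the maximum information; a looser constraint embodies less.
print(embodied_information(2**10, 1))     # 10.0 bits
print(embodied_information(2**10, 2**4))  # 6.0 bits
```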
The physical foundations of complexity
There are four physical forces: the strong and weak nuclear forces, electromagnetism and gravity (which is not quite the same, being a phenomenon of space-time). Still, the only force of much biological significance is the electromagnetic. These forces act upon, and emanate from, elementary particles, atoms (just not so much with the noble gases) and ions. Without constraint they operate in all directions (A), but if constrained by a spatial configuration such as the regular lattice (B), they act coherently and so become effective at the macroscopic scale. The configuration is a limitation on where the particles are placed - very precisely in the crystal lattice shown. Constraint on the placement of items is equivalent to embodying information in the configuration of those items, so (B) embodies information. Crystals are entirely inert and simple - not complex systems. But we do not have to stop there. In (C), atoms of several kinds are arrayed in two different molecules, which have electron-cloud surface shapes such that one fits the other rather well. The atoms at the surface also complement one another's left-over attractive forces, so the two match and join together. The information constraint of each causes the electrical forces of attraction to do some work: information embodied in one effectively recognises the information embodied in the other, and they unite. This is exactly what goes on when a signal molecule, such as a hormone, is recognised by its receptor molecule. Not only that, but the configuration of the receptor might spontaneously change when the connection is made (because of the change in electrical charge distribution that the joining causes). This change can be arranged so that the receptor then releases or attracts another kind of molecule, and we have a signalling system. Or it may change conformation (shape) to open a hole within it and let small molecules through (as in the gated ion channel illustrated in (D)). Either way, the matching of patterns which are embodied information results in a functional unit - something that can perform a useful function in the wider context of a system.
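The recognition step just described - one embodied pattern 'recognising' another - can be caricatured in a few lines. This is a deliberately cartoonish model (surfaces as tuples of local charges, binding only when every position pairs opposite charges), not a simulation of real molecular docking:

```python
def binds(surface_a, surface_b) -> bool:
    """Two 'surfaces' bind only if every position pairs a + with a - charge."""
    return len(surface_a) == len(surface_b) and all(
        a + b == 0 and a != 0 for a, b in zip(surface_a, surface_b))

hormone   = (+1, -1, -1, +1)
receptor  = (-1, +1, +1, -1)   # the complementary pattern: recognition occurs
bystander = (-1, -1, +1, -1)   # one mismatch: no binding

print(binds(hormone, receptor))   # True
print(binds(hormone, bystander))  # False
```

The 'meaning' of each pattern exists only relative to the other: recognition is mutual constraint between two pieces of embodied information.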
The idea that information is essential to life is familiar, but it has been largely confined to the molecular scale (considered in depth by the Molecular Biology theme). Here we extend the concept of life as an information phenomenon to apply at every level of organisation, from molecules to the global ecological system. Our synthesis arrives at the conclusion that living is information processing: the transformation of information by logical operations, together with its transmission (in communication and reproduction) and storage. Memory is maintained by both molecular states and ecological states, as well as the more obvious nucleic acids; more generally, information is stored by life through embodying it in structure at multiple scales of organisation, from the shape of biomolecules to the networks of interaction among the populations of an ecosystem. The two main means life uses to process information are filtration (as in cognition), which selects by context, and synthesis, especially the combining of information at lower levels of organisation to appear at higher levels in complex systems (emergence). This information processing has one overall function: to perpetuate itself, as that is the ultimate function of life.
Life’s information is instantiated as pattern in form, embodied by living structures such as molecular and cellular structures. The corresponding pieces of information are combined by the creation of mutual context among forms: one form ‘means’ something to another, such that a process may take place when they encounter one another (for example when a hormone meets its receptor). This context results in apparently new information, but it is not in fact new: it is ‘revealed’ by the process as an emergent property of the system. This constructive process forms arbitrarily large complexes of information, the combined effects of which include the functions of life.
In terms of a computer analogy, life is both the data and the program
and its biochemical structure is the way the information is embodied. A
cell can be seen as a set of algorithms running on biochemistry; an
organism as a set of algorithms running on a community of cells and an
ecosystem as a set of algorithms running on a community of organisms.
This idea supports the seamless integration of life at all scales with
the physical universe.
References
Albantakis, L. (2021). Quantifying the autonomy of structurally diverse automata: a comparison of candidate measures. Entropy. 23: 1415. https://doi.org/10.3390/e23111415
Bertschinger, N., Olbrich, E., Ay, N., Jost, J. (2006). Information and closure in systems theory. In: Explorations in the Complexity of Possible Life (7th German Workshop on Artificial Life, Jena, July 26-28, 2006), 9-19.
Ellis, G.F.R. (2020). The causal closure of physics in real world contexts. Foundations of Physics. 50: 1057-1097.
Farnsworth, K.D. (2022). How an information perspective helps overcome the challenge of biology to physics. Biosystems. 217: 104683. https://doi.org/10.1016/j.biosystems.2022.104683
Heylighen, F. (1996). What is complexity? Principia Cybernetica Web. pespmc1.vub.ac.be/COMPLEXI.html
Hofmeyr, J.-H.S. (2021). A biochemically-realisable relational model of the self-manufacturing cell. Biosystems. 207: 104463. https://doi.org/10.1016/j.biosystems.2021.104463
Kauffman, S.A. and Clayton, P. (2006). On emergence, agency and organization. Biology and Philosophy. 21: 501-521.
Ladyman, J., Lambert, J., Wiesner, K. (2013). What is a complex system? European Journal for Philosophy of Science. 3: 33-67.
Ladyman, J., Wiesner, K. (2020). What is a Complex System? Yale University Press.
Landge, A.N., Jordan, B.M., Diego, X., Mueller, P. (2020). Pattern formation mechanisms of self-organizing reaction-diffusion systems. Developmental Biology. 460: 2-11. https://doi.org/10.1016/j.ydbio.2019.10.031
Oizumi, M., Albantakis, L., Tononi, G. (2014). From the phenomenology to the mechanisms of consciousness: Integrated Information Theory 3.0. PLoS Computational Biology. 10: e1003588.
Polanyi, M. (1968). Life's irreducible structure. Science. 160: 1308-1312.
This Theme seeks to:
- Define complexity in the biological context and show how it can be quantified;
- Develop a theory of how complexity at each level of organisation produces the phenomena of the next level, in terms that are independent of organisational level (see this page for new insights);
- Develop this into an integrated information-based
understanding of the complexity of life.
The Theme is led by
Dr Keith Farnsworth