The “Individual” in the light of information theories

Reading time: 8 minutes

Translation by AB – January 20, 2021


Last month, the online magazine Quanta Magazine published an article entitled “What Is an Individual? Biology Seeks Clues in Information Theory”1. While information theory governs many scientific syntheses, it seems to us that the individual living being studied by biology should nevertheless “resist”2. So, will the “Individual” (in the broad sense) also yield? David Krakauer, evolutionary theorist and president of the Santa Fe Institute, attempts a seductive approach that nevertheless presents some of the epistemological flaws typical of information science. We will easily recognize them and recall that the Individual, especially the living one, still has some specificities.

Biology

Science identifies its objects with varying degrees of success. Thus, biology deals with living things. But in the course of their analytical work, biologists observe that there are no living beings in the sense of individuated Individuals, but rather embeddings, interlockings and couplings of persisting “phenomena”, which one can call, at first analysis, “Individuals”. Thus:

Worker ants and bees can be nonreproductive members of social-colony “superorganisms.” Lichens are symbiotic composites of fungi and algae or cyanobacteria. Even humans contain at least as many bacterial cells as “self” cells, the microbes in our gut inextricably linked with our development, physiology and survival. These organisms are “so intimately connected sometimes that it’s unclear whether you should talk about one or two or many,” said John Dupré, a philosopher of science at the University of Exeter and director of Egenis, the Center for the Study of Life Sciences.

Biology has therefore not yet really resolved the question of its objects3, which cannot be easily identified. This empirical field is now (again) in search of a new conceptual underpinning. Melanie Mitchell, professor of computer science at the Santa Fe Institute, suggests this starting point:

In a way, [biology] is a science of individuality.

Certainly, but isn’t it disturbing that this a priori attractive reformulation is proposed by an information theorist rather than by a biologist?

Information

David Krakauer thus introduces his work (emphasis added)4:

We want to allow for the possibility that microbes and loosely bound ecological assemblages such as microbial mats and cultural and technological systems, when viewed with a mathematical lens, qualify as individuals even though their boundaries are more fluid than the organisms we typically allow.

It is no longer a question of empirically delineating an Individual by its borders (skin, cell membrane, etc.) but, more generally, of characterizing it as a universal phenomenon continually detached from a background that constitutes its environment. This “de-essentialization” in a way extends the ideas of Francisco Varela (Francisco Varela, the Heterodox) or Gilbert Simondon (Gilbert Simondon, “philosopher of information?”). At the root of this principle lies an intuition, now well supported by observation, of a dynamic co-determination of organisms and their environments, in other words of their “coupling”. The environment “of” the microbe and the environment “of” the human are not equivalent to one and the same Outer World in which they would be immersed. We should rather speak of the environment “which makes” the microbe, or of that other one “which makes” the human.

Krakauer and his colleagues embrace this principle but, by interpreting it in the light of mathematical theories of information, “sterilize” the subject, so to speak. This conceptual scheme is reminiscent of the work of Giulio Tononi on consciousness (About Artificial Consciousness) or of Stuart Russell in the field of ethics (Being Stuart Russell – The comeback of Moral Philosophy). These apparently innocuous epistemological choices are not without consequences for all of us, because the widespread application of the mathematical tools and principles of information science, in this case to the question of individuality, always diffuses the same “toxins” (a provisional term) into our belief system.

Toxins

The “toxin of the mathematical continuum”, on which we will dwell a little here, consists in making any concept-object measurable on a continuous scale, along a graduated ruler. Thus, for Giulio Tononi, any system can be more or less conscious (the famous function Φ), just as, according to David Krakauer, any system or process, including “cultural or technological” ones, can be more or less individuated:

Individuality can be continuous, with the possible surprising result that some processes possess greater individuality than others.

Nothing prevents anything, a priori, from attaining a certain degree of individuality: obviously the “empirical” living beings, but also social organizations (anthills, termite mounds, cities…), cultural and political systems (democracies, authoritarian systems…), named processes (why not the covid-19 epidemic…) or even, of course, technological systems (social networks, bots…). This principle is not scientific in the sense that it proceeds a priori, in a synthetic manner, without resulting from any prior analysis of the processes and systems to which it claims to apply. In particular, it is therefore not refutable and cannot lead to any new prediction. To be more or less individuated is simply to be more or less “high” on the Krakauer scale. We could thus “commensurate” the degree of individuality of a human being and of the covid-19 epidemic…
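To make the shift explicit (in our notation, not Krakauer’s), individuality ceases to be a predicate and becomes a quantity:

Ind(x) ∈ {0, 1}: an entity is, or is not, an individual (the classical view);
Ind(x) ≥ 0: an entity is individuated to some degree, exactly as Tononi’s Φ assigns every system a degree of consciousness.

Once such a scale exists, any two entities whatsoever become commensurable on it.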

The “continuum toxin” therefore works as follows. Take any problematic concept that one seeks to control, for example, following Stuart Russell, that of “human value” (Being Stuart Russell – The comeback of Moral Philosophy). Devise a mathematical formula that measures the “degree” of this concept. This presupposes a) a set of basic axioms or beliefs and b) instruments for measuring the concept. In the case of “human values”, Stuart Russell explains that a) these human values are integrally identifiable in our behavior, and b) our behavior is measurable through the data we now leave en masse each time we act (the “formula” could in this case be computed by a neuromimetic algorithm). Last step: the measurement turns into an absolute criterion of the concept. In other words, if the algorithm determines that a given action is more or less in conformity with human values, then that action really has this degree of conformity with human values. This subject concerns us all, because States and powerful private digital organizations use this kind of work to commensurate our actions, even our being: we are more or less in accordance with national ethics (social credit rating in China) or with private ethics (social network ratings), more or less conscious and now, thanks to Krakauer, more or less individuated.

Finally, let us note that the “continuum toxin” has been at work on social networks since the early 2000s, consolidating in our belief system, for example, a “fluidity” of gender (which is in fact a contiguity of classes rendered watertight), no longer only sexual but now universal, as well as the legitimacy of anything to access any concept (the “consciousness” of plants, for example). The “living”, the “conscious”, the “Individual”, the “ethical” become degrees that can be applied to any type of creature, system or process, whether natural or artificial, and in particular virtual. We have nothing in principle against these doctrines but, from a political point of view, their application is only possible in a mathematized society equipped with means of measurement and control.

The other two “toxins” are more familiar here, and we will not dwell on them.

The “informational agent toxin” consists in considering any object as a “Floridi-style” information agent (From the infosphere to a “gaseous” ethics). Krakauer thus proposes, before delving into the calculations, that “individuals are aggregates that ‘propagate’ information from the past to the future and have temporal integrity”. All the old mathematical paraphernalia of information theory thus comes readily to hand, presaging nothing really new.
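To give an idea of what this “propagation” looks like once formalized (a minimal sketch in our own notation, not the exact decomposition used by Krakauer and his colleagues), write S_t for the state of the candidate individual and E_t for the state of its environment at time t, and compare two mutual informations:

I(S_{t+1} ; S_t | E_t): the information about the future carried by the system itself, over and above its environment;
I(S_{t+1} ; E_t): the information about the future carried by the environment alone.

The more the first quantity dominates, the more the aggregate “propagates” its own information forward in time, and therefore the higher it sits on such a scale of individuality, whatever its material nature.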

Finally, the “data toxin” consists in confusing, deliberately or not, data, which are fictitious mathematical transformations, with reality (in French: Données et traces numériques (sous rature)). This toxin invades hyper-digitized scientific fields, such as biology and neuroscience. Data is not the “New Black Gold”, as has sometimes been suggested, but the New Reality or, to play a little with words, the New “Essence” (in French, essence also means fuel). It is therefore, in principle, a considerable source of ecological disorders.

Back to the Living

Quanta Magazine at the same time relayed some criticisms of Krakauer’s work, notably those of Maxwell Ramstead, a researcher in the Department of Psychology at McGill University. According to Ramstead, the fact that Krakauer’s work can be applied to any type of system, natural or artificial, does not work in its favor: biology is ultimately not served by the promise of a conceptual basis specific to its objects. While Ramstead approves of the informational premises of this work (decidedly…), it lacks, according to him, the ingredients needed to distinguish biological entities (living things) from other organized systems (hurricanes, epidemics…). Just as the Tononi function does measure “something”, Krakauer’s mathematical developments also measure “something” that may be interesting, but which is only a mathematical object foreign (“orthogonal”) to the Living. Ramstead points out that what is missing in particular is a reflection on how an Individual maintains the border that delimits it:

“Organisms aren’t just individuated,” he said. “They have access to information about their individuation” […] “It’s not clear to me that the organism could use these information metrics that they define in a way that would allow it to preserve its existence”.

We intuitively agree with Ramstead’s idea that an organism (a self-organizing system) constantly makes predictions about its environment and seeks to minimize errors:

For organisms, that means in part that they are constantly measuring their sensory and perceptual experiences against their expectations. “You can literally interpret the body of an organism as a guess about the structure of the environment”.
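This is the language of predictive processing and of the “free energy principle” in which Ramstead works. As a rough sketch (our formulation, not a quotation), the organism maintains an internal model of its environment and registers at each instant a prediction error

e_t = o_t − ô_t, where o_t is the sensory state actually observed and ô_t the state predicted by the internal model,

which it seeks to keep small over time by acting and by revising its model (or, more generally, by minimizing its “surprise”, −log p(o_t)). It is precisely this self-referential use of information about its own individuation that, according to Ramstead, is missing from Krakauer’s metrics.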

In this way, the organism defines itself as an individual that maintains its separation from its environment. On the other hand, traces of the “information agent toxin” remain in these interesting remarks by Ramstead: the agent exchanges information with its environment, information which must then be processed in a “computing machine”.

Back to Language

Let us leave the last word to Melanie Mitchell:

“It’s a different way of thinking about individuals,” said Mitchell […] “As kind of a verb, instead of a noun”.

Absolutely! An individual is a process rather than a completed being. It is, said Gilbert Simondon, “the individuation which carries the ontological load”, and we had even proposed a generalization of this principle to the whole of Language, in this “note kept for later” (Body and Language Games):

Moreover, no concept should essentialize (like “Will”, “Infinity”, “Data” …) but should always be the name of an ongoing process in a language game (the “will to”, the “infinitization”, the “reflection” …).

It remains to specify the meaning of the modality “should”…


1. Jordana Cepelewicz / Quanta Magazine – July 16, 2020 – What Is an Individual? Biology Seeks Clues in Information Theory.
2. See Miguel Benasayag and the question of the “living”.
3. It should be noted in passing that the neurosciences have the same difficulty at the neuronal level, the neuron ultimately seeming far from being individuated, even if it appeared so at first analysis (in French: Recomprendre le neuromimétisme).
4. David Krakauer, Nils Bertschinger, Eckehard Olbrich, Jessica C. Flack & Nihat Ay / Theory in Biosciences 139, 209–223 (2020) – March 24, 2020 – The information theory of individuality.
