John Johnston - The Allure of Machinic Life: Cybernetics, Artificial Life, and the New AI - 2008
Created: October 12, 2017 / Updated: November 2, 2024 / Status: in progress / 13 min read (~2435 words)
- Is it possible to create some sort of hierarchy or organization of all life by looking at DNA as a form of code shared by all organisms?
- What is the smallest organism that has DNA?
- What comes after machines?
- In strong theories of ALife these machines are understood not simply to simulate life but to realize it, by instantiating and actualizing its fundamental principles in another medium or material substrate
- These forms of machinic life are characterized not by an exact imitation of natural life but by complexity of behavior
- Thinking in terms of the complexity of automata, whether natural or artificial, rather than in terms of a natural biological hierarchy is part of the legacy of cybernetics
- While contemporary biologists have reached no consensus on a definition of life, there is wide agreement that two basic processes are involved: some kind of metabolism by which energy is extracted from the environment, and reproduction with a hereditary mechanism that will evolve adaptations for survival
- Theoretical biologist Stuart Kauffman has suggested that thinking of the development of an organism as a program consisting of serial algorithms is limiting and that a "better image of the genetic program - as a parallel distributed regulatory network - leads to a more useful theory"
- The genetic program works by means of a parallel and highly distributed rather than serial and centrally controlled computational mechanism
- This echoes the observation made by Christopher Langton that computation in nature is accomplished by large numbers of simple processors that are only locally connected
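A Kauffman-style random Boolean network makes both points concrete: every node is a trivial processor wired to a few neighbors, and all nodes update in parallel with no central controller. The sketch below is illustrative (N, K, and the seed are arbitrary choices, not from the book):

```python
import itertools, random

random.seed(1)
N, K = 8, 2
# each node reads K randomly chosen nodes (local wiring, no central control)
inputs = [random.sample(range(N), K) for _ in range(N)]
# each node gets its own random Boolean rule over its K inputs
tables = [{bits: random.randint(0, 1) for bits in itertools.product((0, 1), repeat=K)}
          for _ in range(N)]

state = tuple(random.randint(0, 1) for _ in range(N))
for t in range(6):
    print(t, state)
    # synchronous update: every node computes its next value in parallel
    state = tuple(tables[i][tuple(state[j] for j in inputs[i])] for i in range(N))
```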
- A technical system forms when a technical evolution stabilizes around a point of equilibrium concretized by a particular technology
- A computational assemblage comprises a material computational device set up or programmed to process information in specific ways together with a specific discourse that explains and evaluates its function, purpose, and significance
- Computational assemblages give rise to new ways of thinking about the relationship between physical processes (most importantly, life processes) and computation, or information processing
- Every isolated determinate dynamic system obeying unchanging laws will develop "organisms" that are adapted to their "environment"
- Wolfram presented a seminal demonstration of how the dynamic behavior of cellular automata falls into four distinct universality classes:
- One that halts after a reasonable number of computations
- One that falls into a repetitive loop or periodic cycle
- One that generates a chaotic, random mess
- One (the most complex) that produces persistent patterns that interact across the local spaces of the grid
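The four classes are easy to reproduce with the elementary one-dimensional automata Wolfram studied; below is a minimal runner (a sketch, with rule numbers in Wolfram's standard 8-bit encoding). Try rule 0 (class 1, dies out), rule 4 (class 2, static), rule 30 (class 3, chaotic), or rule 110 (class 4, persistent interacting patterns):

```python
# Minimal elementary cellular automaton of the kind Wolfram classified.
def step(cells, rule):
    n = len(cells)
    # neighborhood (left, center, right) indexes one of the rule's 8 bits
    return [(rule >> (4 * cells[(i - 1) % n] + 2 * cells[i] + cells[(i + 1) % n])) & 1
            for i in range(n)]

cells = [0] * 31
cells[15] = 1                      # single seed cell
for _ in range(15):
    print("".join(".#"[c] for c in cells))
    cells = step(cells, 110)       # rule 110: class 4 behavior
```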
- Our human capacity as toolmakers (homo faber) has made us the vehicle and means of realization for new forms of machinic life
- This strand of thinking has given rise to two conflicting cultural narratives, the adversarial and the symbiotic
- Adversarial: Human beings will completely lose control of the technical system, as silicon life in the form of computing machines performs what Hans Moravec calls a "genetic take-over" from carbon life
- Symbiotic: Human beings will gradually merge with the technical system that defines and shapes the environment in a transformative symbiosis that will bring about and characterize the advent of the posthuman
- The real questions are how global properties and behaviors emerge in a system from the interactions of computational "primitives" that behave according to simple rules and how these systems are enchained in dynamic hierarchies that allow complexity to build on complexity
- One significant current in ALife research asserts that complexity (or complex adaptive systems) rather than "life" (and thus the opposition to nonlife) is the conceptually more fruitful framework
- Lamarck's theory: acquired traits are passed down to subsequent generations through hereditary mechanisms
- Artifacts with similar purposes may be designed to very different specifications and chosen for very different reasons
- Herbert Spencer's concept of evolution: evolution is a process giving rise to increasing differentiation (specialization of functions) and integration (mutual inter-dependence and coordination of function of the structurally differentiated parts)
- Computationalism: all physical processes can be viewed or understood as computations
- One widely accepted example is the view that evolution itself is simply a vast computational process, a slow but never-ending search through a constantly changing "real" fitness landscape for better-adapted forms
- The living organism: a heat engine, burning glucose or glycogen or starch, fat, and proteins into carbon dioxide, water and urea
- The organism's body is very far from a conservative system; its component parts work in an environment where the available power is much less limited than we have taken it to be
- Whereas for Shannon information measures uncertainty, or entropy, for Wiener it measures a gain in certainty; information, therefore, he considered to be a measure of negative entropy, or "negentropy"
- Ashby's approach to machines: a machine is that which behaves in a machinelike way, namely, that its internal state, and the state of its surroundings, uniquely defines the next state it will go to
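Read this way, a machine is just a deterministic transition table over (internal state, surroundings); a minimal sketch with arbitrary illustrative states:

```python
# A machine in Ashby's sense: current state plus the state of the
# surroundings uniquely determines the next state (example values are
# arbitrary, not from the book).
transitions = {
    ("a", 0): "a", ("a", 1): "b",
    ("b", 0): "a", ("b", 1): "c",
    ("c", 0): "c", ("c", 1): "a",
}

state = "a"
for surroundings in [1, 1, 0, 1, 0]:
    state = transitions[(state, surroundings)]   # uniquely determined
    print(surroundings, "->", state)
```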
- How can the differences that underlie the logic of organization in biological as opposed to artificial entities be used to build more reliable machines? More specifically, how can unreliable components be organized to become highly reliable for a machine or automaton as a whole? What are the conditions that would enable simple automata - understood as information-processing machines that exhibit self-regulation in interaction with the environment - to produce more complex automata?
- When we talk mathematics, we may be discussing a secondary language, built on the primary language truly used by the nervous system
- For simple automata, it is easier to describe the behavior itself than exactly how this behavior is produced or effectuated
- Above a certain threshold of complexity the description of the structure would be simpler than a description of the behavior
- First, the logic of automata would have to be continuous rather than discrete, analytical rather than combinatorial
- Second, it would have to be a "probabilistic logic which would handle component malfunction as an essential and integral part of automata operation"
- Third, it would most likely have to draw on the resources of thermodynamics and information theory
- By his own calculations, the neurons in the brain are some 5,000 times slower than the vacuum tubes used as switching devices in the first electronic calculators, yet they are far more reliable. This is simply because they are far more numerous, and their connections more complicated
- This flexibility (having a high degree of error tolerance), von Neumann speculates, probably requires an "ability of the automaton to watch itself and reorganize itself"
- Turning to the problem of error, von Neumann introduces the idea of "multiplexing," that is, of carrying a single message simultaneously on multiple lines, and demonstrates statistically that by using large bundles of lines any degree of reliability for a circuit can be insured
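The statistical core of the argument is easy to check (illustrative numbers): with independent per-line error probability $p$, the probability that a majority of an $n$-line bundle errs falls off sharply as $n$ grows:

```python
from math import comb

# Majority voting over a bundle of n lines: the message is wrong only
# if more than half the lines fail at once.
def majority_error(n, p):
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

for n in (1, 9, 99):
    print(n, majority_error(n, 0.1))   # error shrinks rapidly with bundle size
```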
- When changes in the environment occur, an organism must adapt itself to the new conditions in order to survive
- In Ashby's approach, the environment - designated as E - is also a transducer, or operator, in the sense that "it converts whatever action comes from the organism into some effect that goes back to the organism"
- The brain of the organism must therefore act as an inverse operator $E^{-1}$ capable of reacting in such a way that the environmental disturbance is followed by an action that returns the organism to the proper values of its own variables
- Ashby explains that the homeostat "is really a machine within a machine"
- This is necessary because it must deal with two kinds of variables
- The continuously fluctuating type: the machine responds with small corrective movements
- The step-change type: when it is unable to restore itself, the machine changes from one set of feedbacks, which it has found to be unstable, to another set
- The machine's capacity to model a particular natural process does not exhaust its interest
- With no evident or agreed upon understanding of how the process of learning works and what it entails, the group is not yet equipped to assess the machine's value in these terms
- Viewed in terms of what it does, the homeostat is simply a machine that adapts to changing environmental conditions by repeatedly changing and testing its own design until it reaches a state of equilibrium
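A toy rendering of this adapt-by-redesign loop (a sketch under assumed dynamics, not Ashby's electro-mechanical circuit): a feedback gain g pulls the essential variable x toward 0; small disturbances get small corrections, but if the current feedback proves unstable and x exceeds its limits, the machine switches to a new randomly chosen feedback and tests that instead:

```python
import random

x, g = 1.0, random.uniform(-2.0, 2.0)          # start with an arbitrary feedback
for t in range(100):
    x = x - g * x + random.gauss(0, 0.05)      # feedback plus small disturbance
    if abs(x) > 10:                            # essential variable out of bounds
        g = random.uniform(-2.0, 2.0)          # step-change to a new feedback set
        x = 1.0
        print(f"t={t}: feedback unstable, trying new gain g={g:.2f}")
print(f"final x={x:.3f} with gain g={g:.2f}")
```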
- It is not so much the "multiplicity of units [that] is ... responsible for the elaboration of cerebral functions as the richness of their interconnections
- Extreme plasticity cannot be gained without some loss of stability
- The more learning circuits or paths of association, the more unstable the system as a whole
- Connecting simple elements in multiple ways generates complexity
- In Grey Walter's model of the brain agency is fully embodied in a material set of parts and connections
- A system that spontaneously - that is, without external guidance or control - moves from a random, or less unorganized, state to one that exhibits a more orderly pattern of behavior
- Doubts arise from the apparent contradiction: how can the system be both "a strictly determinate physico-chemical system and ... undergo 'self-induced' internal reorganizations resulting in changes of behavior"
- The system or machine will have to contain two distinct organizations, "each of which is absolute (i.e., completely determined) if considered by itself". What connects them is a single step-function of time with two values
- During a first period of time the system has one organization, and during a second it has another
- It is no longer a Newtonian machine but "lives" in Bergsonian time
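This arrangement can be written compactly (illustrative notation, not the book's): a step-function of time selects which of two fully determined transition rules is in force,

$$
s(t) = \begin{cases} 1, & t < t_0 \\ 2, & t \ge t_0 \end{cases}
\qquad
x(t+1) = T_{s(t)}\big(x(t)\big)
$$

where each $T_i$ taken alone is absolute (completely determined); the apparently "self-induced" reorganization is nothing more than the switch from $T_1$ to $T_2$.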
- Two meanings of the term "self-organizing system":
- A system can be said to be self-organizing if it encompasses parts that are separate and independent and that then join
- A system can also be said to be self-organizing if it changes from a bad organization to a good one
- A system would be self-organizing if it takes a flat, even distribution of states into a peaked, non-uniform one
- In other words, the entropy of a self-organizing system would have to decrease
- How is the boundary between the system and the environment to be defined and located?
- It can be defined at any instant of time as the envelope of that region in space which shows the desired increase in order
- How should order be measured?
- Use Claude Shannon's definition of "redundancy" in a communication system $R = 1 - H/H_m$
- $R$ is the measure of redundancy and $H/H_m$ the ratio of the entropy $H$ of an information source to its maximum value $H_m$
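A quick worked instance of this measure (the probabilities are illustrative): a source whose $n$ symbols are equally likely has $H = H_m = \log_2 n$ and hence $R = 0$; any peaking of the distribution lowers $H$ and raises $R$:

```python
from math import log2

# Shannon redundancy R = 1 - H/H_m for a discrete source.
def redundancy(probs):
    H = -sum(p * log2(p) for p in probs if p > 0)   # entropy of the source
    H_max = log2(len(probs))                        # maximum entropy: flat distribution
    return 1 - H / H_max

print(redundancy([0.25, 0.25, 0.25, 0.25]))  # 0.0: flat, no order
print(redundancy([0.7, 0.1, 0.1, 0.1]))      # ~0.32: peaked, more "ordered"
```

On this criterion, a system counts as self-organizing over an interval if $R$ increases during it.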
- Schrödinger remarks that there are two mechanisms that produce order:
- a statistical mechanism producing "order from disorder"
- a less familiar mechanism that produces "order from order"
- While it is true that a random or meaningless concatenation of symbols still contains a measurable amount of information, information acquires its status and value as information only because there is an assumed correlation between a message composed from a set of discrete symbols and physical events and processes in the world (i.e., the symbols in themselves are not meaningless)
- In Braitenberg's presentation two basic ideas come into play
- The first is what he calls "the law of uphill analysis and downhill invention"
- Essentially this means that building vehicles that work and do things - especially things that are unplanned - is usually easier than analyzing from external observation the internal structures that make this behavior possible
- This is because, as Braitenberg explains, induction is slow and requires a search for solutions
- Above a certain number of active elements and cross-connections in his vehicle's brain, its behavior becomes unpredictable to a human observer, even though that behavior is completely determined
- The second is the mimicking of the process of reproduction, copy errors, and selection that we recognize as Darwinian evolution
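A Braitenberg-style vehicle in the spirit of his Vehicle 2 (a sketch, not code from the book) shows the "downhill" point: two light sensors drive two wheels directly, and the crossed wiring steers the vehicle toward the light; building this takes a dozen lines, while inferring the wiring "uphill" from the observed trajectory alone would be far harder:

```python
import math

light = (5.0, 5.0)
x, y, heading = 0.0, 0.0, 0.0

def sensor(px, py):
    d2 = (px - light[0]) ** 2 + (py - light[1]) ** 2
    return 1.0 / (1.0 + 0.1 * d2)                # brighter when closer

for _ in range(40):
    # sensors sit slightly to the left and right of the vehicle's nose
    lx, ly = x + math.cos(heading + 0.5), y + math.sin(heading + 0.5)
    rx, ry = x + math.cos(heading - 0.5), y + math.sin(heading - 0.5)
    left_wheel, right_wheel = sensor(rx, ry), sensor(lx, ly)   # crossed wiring
    heading += 4.0 * (right_wheel - left_wheel)  # differential drive steers it
    speed = left_wheel + right_wheel
    x, y = x + speed * math.cos(heading), y + speed * math.sin(heading)

print(f"distance to light: {math.hypot(x - light[0], y - light[1]):.2f}")
```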
- As Lacan asserts in his lecture "Psychoanalysis and Cybernetics," cybernetics was a new kind of "conjectural science" that for the first time made it possible to understand the autonomy of symbolic processes
- "The machine is much freer than the animal," which is really a "jammed machine," where "certain parameters are no longer capable of variation"
- "It is inasmuch as, compared to the animal, we are machines, that is to say something decomposed, that we possess greater freedom, in the sense in which freedom means the multiplicity of possible choices"
- Turing proposed that if the problem can be expressed as an algorithm, or a precise set of formal instructions for arriving at the specified solution, then it can be computed mechanically by a machine
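A minimal interpreter makes the claim concrete: the "machine" is nothing but a finite table of instructions applied mechanically to a tape (the increment program below is an arbitrary illustration, not from the book):

```python
# Tiny Turing-machine interpreter; this program increments a binary number.
def run(tape, rules, state="right", head=0):
    while state != "halt":
        symbol = tape.get(head, "_")
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += {"L": -1, "R": 1}[move]
    return tape

# (state, symbol) -> (write, move, next_state)
rules = {
    ("right", "0"): ("0", "R", "right"),   # scan to the end of the number
    ("right", "1"): ("1", "R", "right"),
    ("right", "_"): ("_", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),   # 1 + carry -> 0, carry continues
    ("carry", "0"): ("1", "L", "halt"),    # 0 + carry -> 1, done
    ("carry", "_"): ("1", "L", "halt"),    # new most-significant bit
}

tape = run(dict(enumerate("1011")), rules)           # 1011 = 11 in binary
print("".join(tape[i] for i in sorted(tape)).strip("_"))   # -> 1100 (= 12)
```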
- Three differential orders, or registers, of experience that Lacan calls the symbolic, the imaginary, and the real
- Lacan associates the ego with the imaginary order
- Dynamic systems are systems whose state changes as a function of time; their behavior is usually described by differential equations that can be solved by analytic methods
- Nonlinear dynamical systems, on the other hand, have exponential or other functions that make them intractable, or nearly so; parts of these systems interact in ways that produce disproportionate and strange effects
- Two strands of chaos science:
- The emergence of order out of chaos
- The application of nonlinear mathematics to diverse phenomena like weather prediction and turbulent flow, population growth, and the rise and fall of market prices
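Both strands are visible in the logistic map, the standard toy model of nonlinear population growth (an illustration, not an example from the book): one deterministic rule passes from stable equilibrium through periodic cycles into chaos as a single parameter grows:

```python
# Logistic map x -> r*x*(1-x): the long-run behavior depends sharply on r.
def orbit(r, x=0.2, skip=100, keep=4):
    for _ in range(skip):
        x = r * x * (1 - x)        # discard the transient
    vals = []
    for _ in range(keep):
        x = r * x * (1 - x)
        vals.append(round(x, 4))
    return vals

for r in (2.8, 3.2, 3.5, 3.9):
    print(r, orbit(r))             # fixed point, 2-cycle, 4-cycle, chaos
```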
- While all physical processes are determined by the laws of physics, once an element is introduced that has the capacity to direct, alter, or constrain these processes, the laws of physics alone are no longer adequate to account for the outcome
- On the one hand, what is unique about every specific form of biological organization is symbolically encoded in its genetic information
- On the other hand, while this genetic information "spells out" the organism's identity, it is not fully given in the organism's DNA
- Some four billion years ago a molecule that could copy itself emerged by accident from a primeval chemical soup. This was the first replicator, and soon the soup would have been filled with identical replicating molecules, each producing more copies of itself
- Copying errors would inevitably have led to variations and then competition among the variant replicators for the necessary building blocks
- The replicators that survived were the ones that built survival machines for themselves to live in, and over time these survival machines only got bigger and more elaborate
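The selection logic of this account reduces to a few lines (a sketch: "fitness" here is an arbitrary number standing in for copying success, and nothing chemical is modeled):

```python
import random

random.seed(42)
soup = [0.1] * 20                                    # each number = a variant's fitness

for generation in range(200):
    parent = random.choices(soup, weights=soup)[0]   # fitter variants copy more often
    child = parent
    if random.random() < 0.3:                        # copying error -> variation
        child = max(0.01, parent + random.gauss(0, 0.05))
    soup[random.randrange(len(soup))] = child        # finite building blocks:
                                                     # each copy displaces an old one
print(f"mean fitness after selection: {sum(soup) / len(soup):.3f}")
```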
- The genetic code is the collective assemblage of enunciation; the physical body, its machinic assemblage
- Johnston, John. The Allure of Machinic Life: Cybernetics, Artificial Life, and the New AI. Cambridge, Mass. & London: The MIT Press, 2008.