November 18, 1993
Marcel, a mechanical chessplayer... his exquisite 19th-century brainwork --- the human art it took to build which has been flat lost, lost as the dodo bird ... But where inside Marcel is the midget Grandmaster, the little Johann Allgeier? where's the pantograph, and the magnets? Nowhere. Marcel really is a mechanical chessplayer. No fakery inside to give him any touch of humanity at all.
--- Thomas Pynchon, Gravity's Rainbow.
Our concepts of biology, evolution and complexity are constrained by having observed only a single instance of life, life on Earth. A truly comparative biology is needed to extend these concepts. Because we cannot observe life on other planets, we are left with the alternative of creating artificial life forms on Earth. This essay discusses the approach of inoculating evolution by natural selection into the medium of the digital computer. This is not a physical/chemical medium; it is a logical/informational medium. Thus these new instances of evolution are not subject to the same physical laws as organic evolution (e.g., the laws of thermodynamics), and therefore exist in what amounts to another universe, governed by the ``physical laws'' of the logic of the computer. This exercise gives us a broader perspective on what evolution is and what it does.
This evolutionary approach to synthetic biology consists of inoculating the process of evolution by natural selection into an artificial medium. Evolution is then allowed to find the natural forms of living organisms in the artificial medium. These are not models of life, but independent instances of life. The approach is to understand and respect the natural form of the artificial medium, to facilitate the process of evolution in generating forms that are adapted to the medium, and to let evolution find forms and processes that naturally exploit the possibilities inherent in the medium. This essay is about inoculating evolution into the computational medium, where, in addition to being an exercise in experimental comparative evolutionary biology, it is also a possible means of harnessing the evolutionary process for the production of complex computer software.
The idea first came to me about thirteen years ago, when I was a graduate student at Harvard getting my doctorate in tropical ecology. I spent much of my time in the rain forests of Costa Rica, doing field work for my thesis on the behavior of tropical vines (Ray, 1979). But on this occasion I was on campus. The Cambridge Go Club met in the Harvard Science Center. I knew nothing about the game, but one evening I observed an interesting-looking man playing by himself, so I sat down and asked for an explanation. I must have told him that I was a student of biology, because he started making life-like metaphors about the game: groups of pebbles on the board will ``die'' if they do not maintain some contact with free space.
His discourse strayed from Go, and then he made a fateful statement: ``Did you know it is possible to write a self-replicating computer program?'' It turned out that my teacher was a member of the Artificial Intelligence Lab of the Massachusetts Institute of Technology. The moment he posed that question, my mind flashed on everything I am doing today. I imagined: start with self-replication, then add mutation, and get evolution. A very simple formula for life: self-replication with errors should generate evolution, the essence of life. I was intimately familiar with the beautiful complexity and diversity that evolution had generated in the tropical rain forests. I imagined an independent evolution generating a digital jungle in a computational universe, teeming with unimaginably alien evolving digital life forms. Startled by this revelation, I asked my sensei: ``How do you do it?'' He responded: ``It's trivial.'' I must have pushed a little harder, but either he would not elaborate, or I did not understand. I knew nothing about computers at the time.
I was as powerfully motivated by the idea then as I am now, perhaps more in my relative youth. But I couldn't do a thing with it except imagine. I didn't know what the physical representation of a computer program was, so I couldn't understand what it really meant for it to replicate. That would have to wait for ten years. I did use computers back then; in fact, that is why I was in the Science Center that night. But I just sat in front of the terminal and had no idea what went on on the other side of the screen.
About ten years later, in 1988, I bought my first computer, a laptop, bottom of the line. I only succumbed because I was to spend a semester in Costa Rica. I was by then using computers for word processing and I felt that I couldn't be away from them for an entire semester. When I came back from the trip, I bought Borland's Turbo C compiler and their Turbo Debugger. The debugger was my illumination. It made a representation of the internal workings of the machine visible on the screen. I could ``see'' the memory and the central processing unit. I could see the programs resident in memory and the data they operated on. I could walk through the programs to see what they actually did.
I finally had the missing piece. The fantasy of my graduate student days returned full force and now I could not resist it. I was fighting an uphill battle for tenure in the biology department at the University of Delaware, and suddenly I was no longer interested in the work that I had to sell to secure my position at the university. This was a very difficult period for me.
At first I only read. I viewed the computer as an environment that could be inhabited by life (self-replicating computer programs), and I wanted to fully understand that environment. I needed to know what resources the creatures would need to survive, and how different creatures could compete for access to those resources. This led me into an in-depth study of computer architectures, operating systems, and programming languages. I read much of what Peter Norton had written; he is a phenomenally clear writer who can take you deeper into personal computers from almost any starting level of knowledge.
But my perspective on all of this was somewhat unusual. I was viewing the architecture of the computer through the eyes of an evolutionary ecologist. This was a new virtual world that I wanted to inoculate with life. I had to imagine what form that life might take, so that I could create it. It was very exciting.
One of the first things I did was to search to see if what I was imagining had been done before. Much to my surprise and excitement, it had not. However, along the way, I discovered that a new field of science was emerging around projects like mine. It is called Artificial Life. I contacted Chris Langton, who organized the first Artificial Life conference and edited its proceedings. This contact led to an invitation to visit the Artificial Life group at the Los Alamos National Laboratory and the Santa Fe Institute in October of 1989.
Of the group members, only Steen Rasmussen had full faith in the approach I was proposing. He was already doing experiments along the lines I envisioned. He was working with a ``primordial soup'' of machine instructions, and stirring it with energy in the form of CPU (central processing unit) time. The main difference between Steen's approach and the one I proposed is that I wanted to inoculate my world with a self-replicating program, while Steen wanted self-replication to emerge spontaneously. We were asking different questions: Steen was interested in the origin of life; I was interested in its evolution to more complex forms.
The remainder of the group, Chris Langton, Doyne Farmer, Walter Fontana, and Stephanie Forrest, were skeptical. They said that the fundamental problem with the approach that Steen and I were using was that standard computer languages are too fragile or ``brittle''. They said that I wouldn't be able to mutate them at random and expect to get anything but junk.
I didn't entirely accept their concerns, but I took them seriously; they were a formidable group of scientists with great expertise in computer science. It seemed to me that the same arguments could be made about the genetic language. Random alterations of the genetic code, mutations, are almost always disastrous for the creatures born with them. A very small percentage of the time, the random changes just don't matter; they have essentially no effect. An enormously smaller percentage of the time, as preposterous as it may seem, the creatures born with these random mutations are actually better off than their un-mutated parents. There is precious little real evidence for this, but it is the concept on which evolutionary theory is based.
When I returned from my visit to New Mexico, I turned in my dossier for promotion and tenure to the University, and put fifteen years of research in tropical rain forests behind me. I sat down and began to write the code that would create the universe in which my creatures would live. I had already written a self-replicating program, a trivial task, as my mysterious Go teacher of a decade past had advised me. The problem now was how to mutate it while it ran without always breaking it.
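My teacher was right that the program itself is trivial. In a modern high-level language (rather than the machine language used in the actual experiment), the classic ``quine'' construction does it in two lines; a sketch in Python, purely for illustration:

```python
# A self-replicating program ("quine"): running it prints its own source.
# The trick: store the source as a template string containing a
# placeholder, then fill the placeholder with a quoted copy of the
# template itself.
s = 's = %r\nprint(s %% s)'
print(s % s)
```

Replication in memory, as in the experiment, is even simpler in spirit: the program copies the block of bytes it occupies to a free address.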
The objective is to create an instantiation of evolution by natural selection in the computational medium. This creates a conceptual problem that requires considerable art to solve: ideas and techniques must be learned by studying organic evolution, and then applied to the generation of evolution in a digital medium, without forcing the digital medium into an ``un-natural'' simulation of the organic world.
The computational medium of the digital computer is an informational universe of boolean logic, not a material one. Digital organisms live in the memory of the computer, and are powered by the activity of the central processing unit (CPU). Whether the hardware of the CPU and memory is built of silicon chips, vacuum tubes, magnetic cores, or mechanical switches is irrelevant to the digital organism.
Digital organisms might as well live in a different universe from us, as they are not subject to the same laws of physics and chemistry. They are subject to the ``physics and chemistry'' of the rules governing the manipulation of bits and bytes within the computer's memory and CPU. They never ``see'' the actual material from which the computer is constructed; they see only the logic and rules of the CPU and the operating system. These rules are the only ``natural laws'' that govern their behavior. They are not influenced by the natural laws that govern the material universe (e.g., the laws of thermodynamics).
The experiment involves the introduction of a self-replicating machine language program into the RAM (random access memory) of a computer subject to random errors such as bit flips in the memory or occasionally inaccurate calculations. This generates the basic conditions for evolution by natural selection as outlined by Darwin: self-replication in a finite environment with heritable genetic variation.
The self-replicating machine language program is thought of as the individual ``digital organism'' or ``creature''. The RAM provides the physical space that the creatures occupy. The memory consists of a large array of bits, generally grouped into eight-bit bytes and sixteen- or thirty-two-bit words. Information is stored in these arrays as voltage patterns, which we usually symbolize as patterns of ones and zeros. The ``body'' of a digital organism is the information pattern in memory that constitutes its machine language program. This information pattern is data, the genotype, but when it is passed to the CPU, it is interpreted as a series of executable instructions, the phenotype (e.g., add, shift bits, copy information from one location to another, etc.). These instructions are arranged in such a way that the data of the body will be copied to another location of memory.
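The kind of random error involved can be made concrete with a toy mutation operator in Python (the function name and rates are illustrative, not taken from the actual system): the genome is just an array of bytes, and mutation is a bitwise XOR at a random position.

```python
import random

def bit_flip(genome, rate=0.01, rng=random):
    """Return a mutated copy of a genome (a byte string), flipping each
    bit independently with probability `rate` -- the digital analog of
    a cosmic ray corrupting a cell of RAM."""
    mutant = bytearray(genome)
    for i in range(len(mutant)):
        for bit in range(8):
            if rng.random() < rate:
                mutant[i] ^= 1 << bit  # flip one bit of one byte
    return mutant

parent = bytes([0x01, 0x02, 0x03, 0x04])  # a toy four-byte "creature"
child = bit_flip(parent, rate=0.05, rng=random.Random(1))
```

Because the same pattern of bytes is genotype when copied and phenotype when executed, a single flipped bit can silently change an instruction.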
It is worth noting that the organic organism most comparable to this kind of digital organism is the hypothetical, and now extinct, RNA organism. These were presumably nothing more than RNA molecules capable of catalyzing their own replication. What the supposed RNA organisms have in common with the simple digital organism is that a single molecule constitutes the body and the genetic information, and effects the replication. In the digital organism a single bit pattern performs all the same functions.
The CPU provides the source of energy. The informational patterns stored in the memory are altered only through the activity of the CPU. It is for this reason that the CPU is thought of as the analog of the energy source. Without the activity of the CPU, the memory would be static, with no changes in the informational patterns stored there.
The logical operations embodied in the instruction set of the CPU constitute a large part of the definition of the ``physics and chemistry'' of the digital universe. The (non-Euclidean) topology of the computer's memory is also a significant component of the digital physics. The final component of the digital physics is the operating system, a software program running on the computer, which embodies rules for the allocation of resources such as memory space and CPU time to the various programs running on the computer.
The instruction set of the CPU, the memory, and the operating system together define the complete ``physics and chemistry'' of the universe inhabited by the digital organism. They constitute the physical environment within which digital organisms will evolve. Evolving digital organisms will compete for access to the limited resources of memory space and CPU time, and evolution will generate adaptations for more agile access to, and more efficient use of, these resources.
The bit pattern that makes up the program is the body of the organism, and at the same time its complete genetic material. Therefore, the machine language defined by the CPU constitutes the genetic language of the digital organism. The use of machine code as a genetic system raises the problem of brittleness. It has generally been assumed by computer scientists that machine language programs cannot be evolved, because random alterations such as bit flips and recombinations will always produce inviable programs. The assumption that machine languages are too brittle to evolve is probably true, as a consequence of the fact that machine languages have not previously been designed to survive random alterations. However, the experiment described here has shown that brittleness can be overcome by addressing its principal causes, without fundamentally changing the structure of machine languages.
The first requirement for evolvability is graceful error handling. When code is being randomly altered, every possible meaningless or erroneous condition is likely to occur. The CPU should be designed to handle these conditions without crashing the computer. The simplest solution is for the CPU to perform no operation when it meets these conditions, perhaps setting an error flag, and to proceed to the next instruction.
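The principle can be illustrated with a toy interpreter (a hypothetical three-instruction machine sketched in Python, not the actual Tierran instruction set): undefined opcodes set a flag and fall through to the next instruction instead of halting the machine.

```python
def run(program, steps):
    """Execute `steps` instructions of a toy machine whose only response
    to a garbage opcode is a no-op plus an error flag."""
    acc, err = 0, False
    for ip in range(steps):
        op = program[ip % len(program)]  # circular memory: the instruction
                                         # pointer never falls off the end
        if op == 0x00:       # NOP: do nothing
            pass
        elif op == 0x01:     # INC: increment the accumulator
            acc += 1
        elif op == 0x02:     # DEC: decrement the accumulator
            acc -= 1
        else:                # anything else: flag the error, keep going
            err = True
    return acc, err

# A program containing a garbage byte (0xFF) still runs to completion.
result = run([0x01, 0x01, 0xFF, 0x02], steps=4)  # (1, True)
```

A randomly mutated program on such a machine may compute nonsense, but it never brings the whole system down, which is exactly what an evolving population requires.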
During the design of an evolvable machine language, the standard machine language of the personal computer was compared to the genetic language of organic life, in an attempt to understand the differences between the two that might contribute to the brittleness of the former and the robustness of the latter. A couple of ideas were borrowed from molecular biology, and a new computer was designed based on the principle of evolvability.
These days when we design something new, we usually simulate it in the computer before we actually build it. If you design a new aircraft, you do not work out the design in metal. It is too expensive to play with design ideas by actually building all the designs you consider. You simulate the designs in the computer as long as possible, testing the aerodynamics and strength of each design, and when you have perfected them as much as possible through simulation, you finally render the aircraft in metal and test it. Computers are designed in the same way. The circuits are first simulated in software, and when they are thoroughly tested, they are rendered and tested in silicon.
I wrote a program that simulated the new computer that I had conceived. Such a simulation of a computer is often called a ``virtual computer''. I named my virtual computer ``Tierra'', Spanish for Earth. In order to test the design of my new virtual computer, I needed a program to run on it. What better program to test Tierra on than a self-replicating one? I had already written a program that self-replicated on a real computer, so I translated it into ``Tierran'', the machine language of my new computer.
I never intended that this virtual computer and my first rudimentary self-replicating program should be anything more than a starting point. I expected to spend years modifying the design of the computer, and testing ever more sophisticated self-replicating programs on it. My plans were radically altered by what actually happened on the night of January 3, 1990, the first time that my self-replicating program ran freely on my virtual computer.
All hell broke loose. The power of evolution had been unleashed inside the machine, but accelerated to the megahertz speeds at which computers operate. My research program was suddenly converted from one of design, to one of observation. I was back in a jungle describing what evolution had created, but this time a digital jungle. There was an amazing menagerie of digital creatures, unfolding through the process of evolution. Describing them was an adventure, because they inhabited an alien universe, based on a digital physics totally different from the physics inhabited by the life forms I knew and loved. Yet forms and processes appeared that were somehow recognizable to the trained eye of a naturalist.
The most striking and strangely familiar feature of my digital universe was that evolution found an endless succession of ways for creatures to exploit their neighbors, and to defend themselves against such exploitation. Evolution is basically a selfish process, in which every individual is out for itself, and success is measured by leaving more of your genes in future generations. But evolution is very inventive about how that ultimate goal is achieved. Evolution mindlessly takes advantage of whatever is available in the environment of the organism.
Significantly, once the environment has been filled with creatures, those creatures become an important resource in the environment. In that first night that my virtual machine ran, my creatures quickly found out that their environment was rich with information. They didn't need to carry around with them all the information they needed to survive, because all they had to do was look around and they would find it. Other creatures replicated all the information they needed. My informational parasites, quick little things, perfected the techniques of using their neighbors' information, and quickly came to dominate the soup by pouring out copies of their streamlined informational bodies.
But their dominance was short lived. They became the victims of their own success. They didn't bother to replicate critical information that they needed to reproduce, because they could easily find it around them. However, the environment soon became filled with these little parasites, and the critical information was no longer so easily found. The parasites began to die off, starved for information.
The situation turned out to have its own stability. As the parasites died off, their hosts laboriously replicated the critical information, and the parasites were saved from extinction. The hosts and parasites entered into an oscillation, first the parasites reproducing at the expense of their hosts, then the hosts recovering as the parasites died off for lack of information. This kind of cycling between predator and prey or host and parasite is known as the Lotka-Volterra cycle in the biological world, and was just one of the many uncanny ways that the digital universe reflected the organic one.
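The oscillation can be sketched with the textbook Lotka-Volterra equations; a minimal Python sketch (the coefficients are illustrative, not measured from the digital system):

```python
def lotka_volterra(h, p, a=0.1, b=0.002, c=0.002, d=0.1, steps=200, dt=0.1):
    """Euler integration of the Lotka-Volterra host-parasite model:
         dH/dt = a*H - b*H*P   (hosts replicate; parasites exploit them)
         dP/dt = c*H*P - d*P   (parasites grow on hosts, starve without them)
    Returns the trajectory of (host, parasite) population sizes."""
    traj = [(h, p)]
    for _ in range(steps):
        dh = (a * h - b * h * p) * dt
        dp = (c * h * p - d * p) * dt
        h, p = h + dh, p + dp
        traj.append((h, p))
    return traj

traj = lotka_volterra(100.0, 20.0)
```

With few parasites the hosts grow; the abundant hosts then feed a parasite bloom; the bloom starves itself, and the cycle repeats around the equilibrium at H = d/c, P = a/b.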
The hosts and parasites not only cycled, they entered into an ever-escalating evolutionary race, each one outdoing the other, in turns. The hosts evolved mechanisms of immunity to parasites, and the parasites evolved techniques to circumvent the immune mechanisms. Then the hosts evolved means of deriving advantage from being parasitized, by tricking their parasites, and subverting the energy metabolism of their supposed parasite for their own reproduction. The hosts allowed the attacking informational parasites to reproduce once, and then provided the parasites with mis-information, causing the parasites to devote themselves thereafter to making copies of their supposed hosts.
The hosts became parasites of energy on their victims, which were initially merely informational parasites on the hosts. This energetic parasitism was much more damaging. These deceptive hosts drove the vulnerable informational parasites to extinction. The hosts didn't need the parasites to survive, they just got an energy boost when parasites were around.
The hosts had evolved an iron-clad defense against parasites: trick them into replicating your genome instead of their own. These tricky hosts were untouchable; they owned the world. In fact they were the only thing left, and nothing else was able to invade. And then the hosts became trusting; evolving in a world where everyone around them was family, they began to cooperate. Why not? If you help your sister reproduce, she will pass on some of your genes. High levels of cooperation have evolved about a dozen times among the ants, wasps and bees, due to unusually high degrees of genetic relationship among sisters.
Living in an environment where they were genetically related to their neighbors, they evolved into social creatures, and became inter-dependent. They could only reproduce when they occurred in aggregations with their close relatives. But this cooperation implied trust, and trust can be violated. In fact, soon after the creatures became social, a new breed of parasite invaded the community, long after parasites had been eliminated by the deceptive hosts.
This new class of parasite, which I call cheaters, inserted themselves into the aggregations of cooperating relatives, and when the trust was passed to them, they violated it. They played the same trick on their trusting victims that the deceptive hosts had used to drive out parasites long ago. The cheaters provided their neighbors with mis-information, causing their victims to replicate the genomes of the cheaters. Another turn in the evolutionary race was taking place.
Along the way, unknown to me, my creatures had discovered sex. I found out when I tried to stop evolution by turning off mutation. The creatures evolved anyway. Some further experimentation and observation revealed that they were mingling their genes in their offspring, producing offspring unlike either parent. Mutation was no longer necessary, and I was no longer in control. They had taken their own destiny into their own hands.
Finally I was observing the process rather than the results of evolution. But the evolution was taking place in an alien universe, the universe created in my computer. I was describing a new universe of creatures, evolving before my eyes in a jungle that they were forming as they went along. I stood back and watched like a god satisfied with his creation, as the life I had started found its own natural forms. When I created the universe and inoculated it with the first creature, I left an indelible stamp on those life forms. Little by little, evolution blurred that stamp, finding those forms that were natural for the digital physics of the machine environment.
Being in the position to observe and manipulate such alien life forms has changed the way I think about life and evolution. Rather than thinking of life-as-we-know-it, I think of life-as-it-could-be. Rather than thinking of the evolution of life on Earth, I think of the evolutions of life in various physical substrates, past, present, and future. Virtual life is a very powerful tool for thinking about real life. Because of its relative simplicity and its easy manipulability and instrumentation, it is much easier to formulate and find answers to difficult questions about life and evolution in the virtual universe.
Evolution by natural selection is a process that enters into a physical medium. Through iterated replication-with-selection of large populations through many generations, it searches out the possibilities inherent in the ``physics and chemistry'' of the medium in which it is embedded. It exploits any inherent self-organizing properties of the medium, and flows into natural attractors realizing and fleshing out their structure.
Evolution never escapes from its ultimate imperative: self-replication. However, the mechanisms that evolution discovers for achieving this ultimate goal gradually become so convoluted and complex that the underlying drive can seem to become superfluous. Some philosophers have argued that evolutionary theory, as expressed by the phrase ``survival of the fittest'', is tautological, in that the fittest are defined as those that survive to reproduce. In fact, fitness is achieved through innovation in the engineering of the organism. However, there remains something peculiarly self-referential about the whole enterprise.
Evolution is both a defining characteristic and the creative process of life itself. The living condition is a state that complex physical systems naturally flow into under certain conditions. It is a self-organizing, self-perpetuating state of auto-catalytically increasing complexity. The living component of the physical system quickly becomes the most complex part of the system, such that it re-shapes the medium, in its own image as it were. Life then evolves adaptations predominantly in relation to the living components of the system, rather than the non-living components. Life evolves adaptations to itself.
It took two years just to describe what happened in those first runs of the system, and to make the methods and results available to others. I am now getting back to design issues. Can I design better universes? I am sure I can; my first was just thrown together as an experiment. I have learned a lot since then, and I am out for bigger game.
The sex my creatures discovered was casual, primitive, disorganized sex. I want organized sex like higher earth organisms have, generally with two parents each contributing exactly half the genetic material to each offspring. I want multi-cellular creatures, where many cells originate from a single ``egg'' cell, but instead of going their own way and looking after their own needs, they cooperate on the common goal of replicating the aggregate through another egg. I want my multi-cellular creatures to have hormonal and nervous systems to coordinate their activities. If I can give evolution some nervous systems to play with, it may be able to move them towards intelligence.
My biggest goal is to design my system up to the threshold of a virtual ``Cambrian Explosion of Diversity''. The Cambrian Explosion is a very remarkable event that occurred 600 million years ago on earth. Many people will mark the origin of life on earth, some three to four thousand million years ago, as one of the most significant events in the history of the universe (at least our corner of it). I mark the Cambrian Explosion as an event of equal magnitude.
It was at this time, over three thousand million years after life first appeared on earth, that the really interesting life forms first appeared. Until that time, only microscopic single-celled creatures existed on earth. Then suddenly the first macroscopic multi-cellular life forms appeared and there was a riotous diversification of life forms. It was a period of great experimentation. Many bizarre life forms were tried and then abandoned, and within a relatively short time, all the major groups of organisms that inhabit the earth today had stabilized out of the chaos.
I don't believe that I can design metaphorical giraffes and wildebeests, they are much too complex for any human to design. However, I believe that I can design a universe rich enough for evolution to complete the job. And from such creatures intelligence is the next step. We are living proof that evolution is capable of creating intelligence out of virtually nothing. If machine intelligence is possible, then evolution is the most promising way of achieving it.
I created this virtual universe on my little laptop personal computer. Although the simulation software and observational tools have grown a lot since then, they still run on any IBM-compatible personal computer (as well as larger Unix workstations, mainframes, and massively parallel supercomputers). Anyone interested in playing god can find the software at http://life.ou.edu/tierra/, or by contacting the author at firstname.lastname@example.org.
Fredkin, E. 1990. Digital mechanics. Physica D 45: 254-270.
Langton, C. [editor]. 1989. Artificial Life. Santa Fe Institute Studies in the Sciences of Complexity, vol. VI. Addison-Wesley, Redwood City, CA.
Langton, C., C. Taylor, J. D. Farmer, & S. Rasmussen [editors]. 1991. Artificial Life II. Santa Fe Institute Studies in the Sciences of Complexity, vol. X. Addison-Wesley, Redwood City, CA.
Levy, S. 1992. Artificial life: the quest for a new creation. Pantheon Books, New York.
Maynard Smith, J. 1992. Byte-sized evolution. Nature 355: 772-773.
Norton, P. 1986. Inside the IBM PC. Prentice Hall, New York. 387 pp.
Norton, P. 1987. Peter Norton's DOS guide, revised & expanded. Prentice Hall Press, New York. 350 pp.
Norton, P., & R. Wilton. 1988. The new Peter Norton programmer's guide to the IBM PC & PS/2. Microsoft Press, Redmond, WA. 511 pp.
Ray, T. S. 1979. Slow-motion world of plant `behavior' visible in rainforest. Smithsonian 9(12): 121-130.
Ray, T. S. 1994. An evolutionary approach to synthetic biology: Zen and the art of creating life. Artificial Life 1(1/2): 195-226.
Wright, R. 1988. Did the universe just happen? The Atlantic 261(4): 29-44.