The most highly constrained form of collaboration is typified by the field of ``genetic algorithms'' (GA) [3,1]. Generally, genetic algorithms use bit strings to encode candidate solutions to some engineering problem. By mutating and crossing over bit strings in a large population, repeatedly evaluating the ``fitness'' of the solutions, and preferentially replicating the most fit solutions, the genetic algorithm can search for optimal solutions.
During the design of the GA, the manner in which the bit string encodes the solution space is determined. Normally this involves a fixed-length bit string, with successive portions of the string assigned to the representation of predetermined quantities, such as the coefficients in an equation. Thus the form of the solution is determined in advance, and does not take part in the evolutionary process.
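This fixed-field encoding can be sketched as follows. The 8-bit field width and the mapping of each field to a coefficient in the unit interval are illustrative assumptions, not a prescribed layout:

```python
def decode(bits, field_width=8):
    """Split a fixed-length bit string into equal-width fields and map
    each field's integer value to a coefficient in [0, 1).

    The field layout is decided by the GA's designer in advance; the
    evolutionary process only varies the bits, never the layout."""
    assert len(bits) % field_width == 0
    coeffs = []
    for i in range(0, len(bits), field_width):
        field = bits[i:i + field_width]
        coeffs.append(int(field, 2) / (2 ** field_width))
    return coeffs

print(decode("111111110000000010000000"))  # → [0.99609375, 0.0, 0.5]
```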
Although researchers in the GA field freely use the phrase ``natural selection'' to describe their evolutionary process, the GA absolutely does not use natural selection. It uses artificial selection. The designer of the GA writes a ``fitness function'' algorithm that determines which members of the population of bit strings will be favored through replication.
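The artificial character of this selection can be made concrete: the criterion of success is chosen by the designer, not by any environment the strings inhabit. In this hypothetical sketch, fitness is the negated squared error of the decoded coefficients against a designer-chosen target (the target values are purely illustrative):

```python
# The designer, not the "environment", decides what counts as fit.
TARGET = [0.25, 0.5, 0.75]  # designer-chosen goal (illustrative values)

def fitness(coeffs):
    """Higher is better; a perfect match to TARGET scores 0."""
    return -sum((c - t) ** 2 for c, t in zip(coeffs, TARGET))
```

Strings whose decoded coefficients lie closest to `TARGET` receive the highest scores, and it is these scores — not survival or reproduction in any environment — that determine which strings are replicated.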
It is also worth noting that the strings in a GA do not self-replicate. They are copied by the simulation system, after evaluation by the fitness function. They may be copied precisely, or with some ``mutations'' in the form of bit flips, or, in the case of crossover, only a portion of the bit string is copied, in combination with a complementary portion from another favored bit string.
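A single generation of this system-driven replication can be sketched as follows. The fitness-proportionate selection, single-point crossover, and per-bit mutation rates are common textbook choices, assumed here for illustration:

```python
import random

def replicate(population, fitness, mutation_rate=0.01, crossover_rate=0.7):
    """Produce the next generation. Note that the strings never copy
    themselves: the system evaluates them, picks parents in proportion
    to fitness, and performs all copying, crossover, and mutation."""
    scores = [fitness(s) for s in population]

    def pick():
        # Fitness-proportionate ("roulette wheel") parent selection.
        return random.choices(population, weights=scores, k=1)[0]

    next_gen = []
    while len(next_gen) < len(population):
        child = pick()
        if random.random() < crossover_rate:
            # Single-point crossover: copy a portion of one parent
            # combined with the complementary portion of another.
            mate = pick()
            cut = random.randrange(1, len(child))
            child = child[:cut] + mate[cut:]
        # Mutation: flip each bit independently with small probability.
        child = "".join(
            b if random.random() >= mutation_rate else ("1" if b == "0" else "0")
            for b in child
        )
        next_gen.append(child)
    return next_gen
```

Iterating `replicate` with a fitness function that, say, counts 1-bits drives the population toward all-ones strings — the optimum the designer built into the fitness function, not one the strings discovered for themselves.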
The GA represents one extreme in the spectrum of control: total control. The GA software totally controls the form of the solution, the fitness function, the nature of the genetic operators, and the method of replication. This approach makes minimal use of evolution's creative potential.