The experiment is envisaged as consisting of:
The processor will cycle constantly, executing the contents of the memory, and at first doing very little. Gradually, owing to the noise on the data bus being fed to the CPU for execution, the RAM will start to fill up with rubbish. After some time, through the effects of self-modifying code, an active computer program should emerge, and settle down into something stable.
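The cycle described above might be sketched as follows. This is a toy Python model, not the intended hardware: the instruction set, memory size and mutation rate are all invented for illustration. The essential features are that noise corrupts only the byte seen by the CPU (not the RAM itself), and that RAM changes only through the instructions actually executed.

```python
import random

random.seed(0)          # reproducibility only; the real rig would use true noise

MEM_SIZE = 65536        # hypothetical 64 KiB address space
MUTATION_RATE = 0.01    # chance that noise corrupts any one fetch

memory = [0] * MEM_SIZE  # RAM starts quiescent (all zeros)
pc = 0                   # program counter

def execute(opcode, pc):
    """Toy stand-in for a real instruction set: odd opcodes write
    themselves to a pseudo-random address (crude self-modification);
    even opcodes are no-ops."""
    if opcode % 2:
        memory[(pc * 31 + opcode) % MEM_SIZE] = opcode
    return (pc + 1) % MEM_SIZE

def step():
    """One fetch-execute cycle, with noise injected on the data bus.
    The noise corrupts the byte as seen by the CPU; the RAM location
    itself is untouched by the noise."""
    global pc
    opcode = memory[pc]
    if random.random() < MUTATION_RATE:
        opcode ^= 1 << random.randrange(8)   # flip one bus line
    pc = execute(opcode, pc)

for _ in range(200_000):    # the real experiment would run indefinitely
    step()
```

Even with this crude instruction set, corrupted fetches gradually seed the memory with executable rubbish, which is then itself fetched and executed.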
Such a stable pattern should develop precisely because it is stable; by definition, all unstable patterns should quickly disappear, through being unstable.
The most stable programs should be those that somehow managed to evolve immunity to the occasional mutation of their code. This they could achieve by various means, perhaps:
In fact, it is not obvious that the junk-code idea would make any difference in this context. If the mutation rate, m, is the fractional proportion of fetch events affected (between 0 and 1), and the proportion of exon code to the total code is n, then the rate at which exon code is hit by mutations is still governed by m alone, and is independent of the value of n.
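A quick simulation illustrates the point (a hypothetical Python sketch; the parameter values are arbitrary). Because the mutation is applied at the fetch, the probability that any one executed exon instruction is corrupted is m, whatever fraction n of the executed code is exon.

```python
import random

def exon_hit_rate(m, n, fetches=200_000, seed=1):
    """Simulate instruction fetches in which a fraction n of executed
    code is exon (functional) and each fetch is mutated with
    probability m. Returns mutations per exon fetch."""
    rng = random.Random(seed)
    exon_fetches = exon_hits = 0
    for _ in range(fetches):
        is_exon = rng.random() < n      # this fetch lands in exon code
        mutated = rng.random() < m      # noise corrupts this fetch
        if is_exon:
            exon_fetches += 1
            if mutated:
                exon_hits += 1
    return exon_hits / exon_fetches

# The per-exon mutation rate tracks m, whatever the exon fraction n:
for n in (0.2, 0.5, 0.9):
    print(round(exon_hit_rate(m=0.05, n=n), 3))   # each close to 0.05
```

Padding the genome with junk would dilute mutations only if the noise struck memory locations at random; striking fetches at random leaves the exon hit rate unchanged.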
The name of the experiment is an oblique reference to the Miller-Urey experiment of 1952, in which electric sparks or ultraviolet light were passed continuously through a bottle containing an atmosphere of ammonia, methane, hydrogen and water vapour. Gradually, over a period of days, amino acids and other building blocks of living cells formed spontaneously.
The experiment would be left running continuously, day and night, but with some provision for allowing the observer to look for patterns in the memory contents.
This provision could either be on-line or off-line, but must be non-invasive. That is, the observer could either monitor the patterns whilst the program is running, or pause the program long enough to probe the contents of the memory. In the latter case, though, the program should be able to resume as if no interruption had occurred.
The monitoring might either be static (such as the contents of memory plotted against memory addresses), or dynamic (such as the activity on the data bus plotted against that on the address bus).
The format of the display is also up to the student to choose (for example, a 2D raster driven by the high and low halves of the address; a 3D representation; use of colour, symbols; etc.). The aim is anything that helps the user to identify the presence of any evolved structure in the program.
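For instance, the 2D raster mentioned above could be driven by splitting each address into its high and low halves (a minimal sketch; the 16-bit address width is an assumption):

```python
def raster_view(memory, addr_bits=16):
    """Arrange a flat memory image as a 2D raster: the high half of
    each address selects the row, the low half the column."""
    half = addr_bits // 2
    side = 1 << half
    return [[memory[(row << half) | col] for col in range(side)]
            for row in range(side)]

# Example: a 16-bit address space becomes a 256 x 256 grid.
grid = raster_view(list(range(65536)))
assert grid[1][0] == 256   # address 0x0100 -> row 1, column 0
```

Each row of the grid is then one 256-byte page of memory, so structures that respect page boundaries show up as horizontal features.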
One simple possibility would be to scan the memory, looking for the emergence of two or more identical copies of a suitably long sequence. Miller's bottle generated only amino acids, so it might be optimistic to hope for the emergence of a-life organisms rather than just components of them. But why not? Finding two copies would certainly be proof that something interesting had happened; not finding them, though, would not necessarily be proof that nothing interesting had happened.
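Such a scan might look like this (a Python sketch; the memory contents, sequence length and planted pattern are invented for illustration):

```python
import random

def find_duplicate(memory, length=32):
    """Return the start addresses of the first two non-overlapping,
    identical sequences of the given length, or None."""
    seen = {}
    for start in range(len(memory) - length + 1):
        key = bytes(memory[start:start + length])
        if key in seen and start - seen[key] >= length:
            return seen[key], start
        seen.setdefault(key, start)
    return None

# Plant two copies of a 32-byte "organism" in otherwise random memory.
rng = random.Random(42)
mem = [rng.randrange(256) for _ in range(4096)]
pattern = [rng.randrange(256) for _ in range(32)]
mem[100:132] = pattern
mem[2000:2032] = pattern
print(find_duplicate(mem))   # two addresses holding identical sequences
```

In the real experiment the scan would run over a snapshot taken while the program was paused, so as to stay non-invasive.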
The student would need to be familiar with microcomputer hardware design and construction. The project is heavily centred on the assembly, testing, and frequent redesign of the monitor and mutation mechanisms.
Since genetic algorithms (GAs) and genetic programs (GPs) are so good at being attracted to pattern when they find it, this experiment could be adapted for use as a pattern detector.
The signal from some continuously running apparatus (such as a seismic detector, a neutrino detector, or the traffic in the comp.ai.philosophy newsgroup) would be connected to an input port on the computer. At first, this signal would affect the program randomly, like the noise generator described earlier. Eventually, structures should evolve in the program that harmonise (in some way) with the incoming signal, as a defence against being disrupted by it. The structures that evolve in the memory, therefore, should somehow mirror any structure that exists in the incoming signal.
The simplest idea would be to use the signal source in place of the random noise generator. However, the computer might also need to be given voluntary access to the signal (via an easily located input port, for example), as well as still incurring occasional forced mutation during instruction fetches. It should also be possible to disconnect this signal source and reconnect the random noise generator, to act as a control experiment against which the results obtained with the signal source can be compared.
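One way to arrange this interchangeability is to hide both sources behind a common interface, so that the noise generator and the external signal can be swapped without touching anything else. This is a hypothetical Python sketch; the class names and the toy execution loop are invented for illustration.

```python
import random

class NoiseSource:
    """Control condition: uncorrelated random bytes."""
    def __init__(self, seed=0):
        self._rng = random.Random(seed)
    def read(self):
        return self._rng.randrange(256)

class SignalSource:
    """External apparatus (seismometer, newsgroup traffic, ...) seen
    as a byte stream; here just a canned repeating sequence."""
    def __init__(self, samples):
        self._samples = list(samples)
        self._i = 0
    def read(self):
        value = self._samples[self._i % len(self._samples)]
        self._i += 1
        return value

def run(source, memory, steps, mutation_rate=0.01, seed=0):
    """Toy fetch-execute loop in which the pluggable source supplies
    the corrupting byte; swapping sources changes nothing else."""
    rng = random.Random(seed)
    pc = 0
    for _ in range(steps):
        opcode = memory[pc]
        if rng.random() < mutation_rate:
            opcode ^= source.read()   # disruption comes from the source
        memory[pc] = opcode           # crude self-modification
        pc = (pc + 1) % len(memory)
    return memory

control = run(NoiseSource(), [0] * 256, 10_000)                 # control run
experiment = run(SignalSource([1, 2, 3, 4]), [0] * 256, 10_000) # signal run
```

Running the same loop twice, once per source, gives directly comparable control and signal conditions.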