

How Might the Next Machine-Class be Built?

Although the previous chapter was on the prospect of implementing artificial consciousness, the overall thesis is on the prospects for someone soon inventing the next machine-class (MC), to spark the next industrial revolution (the next member of the sequence consisting of the wheel, steam-engine and computer, which are the generic classes of machine that sparked previous industrial revolutions).

Continuing on from the second chapter, this chapter considers the implications of cycles within cycles, and the interplay between convergence and divergence.

At the core of all of the models of program execution is the notion that the hardware must make provision for communication, memory and processing. Of these, memory is the one that keeps appearing in the context of the second law of thermodynamics, and leads to a consideration of the various means of implementing communication, memory and processing, and the possibility of considering processing as a type of communication. A categorisation of the three organs is proposed under the headings of topological, passive and active.

Cycles within cycles

There appear to be many points in common between the time and frequency domains of Fourier analysis, holographic wavefront encoding, keys for encryption/decryption, and Holonomic brain theory. So, too, with difference engines, Enigma bombe settings, and the layers of Ockelford's zygonic analysis, all seeking to isolate the time-invariant components from the progressive development components. The inhabitants of a universe with a broken symmetry might be hard pressed to recognise what has, to them, become a hidden symmetry (NS, 03-May-2014, p36). In many ways, the aim of the physicist in searching for invariant properties is the inverse process: finding a way of looking at the system that has all its symmetry restored.

Cycles on top of other cycles appear everywhere. The rotation of the planet imposes all sorts of diurnal circadian cycles in our physiology and psychology, and in most of the macroscopic living things on the planet. The tilt of the planet in its orbit similarly imposes all sorts of seasonal cycles. Beyond that, there are cycles of life (as the village elders advise the youngsters in the village) and, if you have access to the core samples, modern science sees geological cycles. Then, in the other direction, there are internal machine cycles of the body's respiration and heart beat.

Ptolemy needed to introduce 'cycles within cycles' as correction terms in an erroneous model. In a similar way, polynomial correction terms can be added to an initial linear model, such as that of the expansion of iron bars, or the electrical resistance of Nichrome wire, when heated (provided that successive terms are convergent):

L_T = L_0·( 1 + α·T + β·T² + γ·T³ + ... )
This is related to the polynomial representation of the Taylor expansion of an arbitrary function (again, provided that successive terms are convergent). Such function decomposition leads back to Fourier transforms, and cycles upon cycles. It is noteworthy how the search for the meaning of life (the cycles of getting up in the morning, when we will only have to go back to bed again in the evening) ends up getting merged with a tool for analysing complicated functions in terms of simpler elements. Moreover, if anything sticks out as being surprising, and not part of the expected cycle, in a piece of music for example, the human mind instinctively sets out to find the root of that imposed pattern.
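As a toy numerical sketch of the linear-model-plus-correction-terms idea (the coefficient values below are invented purely for illustration, not measured data for any real material):

```python
# Polynomial correction terms added to an initial linear model of
# thermal expansion: L_T = L_0 * (1 + alpha*T + beta*T^2 + gamma*T^3).
# ALPHA, BETA and GAMMA are hypothetical values, chosen only so that
# successive terms are convergent, as the text requires.
ALPHA, BETA, GAMMA = 1.2e-5, 3.0e-9, 1.0e-13

def expanded_length(l0, t):
    """Length at temperature t of a bar whose length is l0 at t = 0."""
    return l0 * (1 + ALPHA * t + BETA * t**2 + GAMMA * t**3)

# Each successive correction term contributes less than the one before:
length_at_100 = expanded_length(1.0, 100.0)
```

Here each higher-order term refines the linear model in the same spirit as Ptolemy's epicycles refined the circular one.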

Examples of discontinuities in the circular pattern are exhibited in the linear space of the earth's time zones, the periodic table, and the so-called circle of life; and in the exponential space of half-lives, the forced truth-table bits of the MAL2 processor, and a seven-octave circular piano keyboard. The last of these relies on the periodically repeating pattern within the repeating octaves, picked out by means similar to those of Lissajous figures, and Cantor's method of locating the rational numbers, albeit needing a symmetry-breaking reference point (such as 440Hz, or the meridian that passes through Greenwich) to be defined within this. These structures, then, become obstacles to later waves, thus forming further structures through interference patterns, for example.

A clock face is represented as a circle, but is in reality a helix, akin to worm-screw gearing, barbers' poles, and ions travelling around a tokamak. There is a discontinuity at each revolution, a jump back to the earlier groove. This would happen, too, on a circular piano keyboard. The international date line is the discontinuity when mapping time out on the planet's surface. Birthdays, anniversaries, and the number of Christmases or seasons in the sun are the discontinuities when mapping time out around the planet's orbit. So, too, with the real and imaginary parts of exp((a + i·b)·t).
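The helical character of exp((a + i·b)·t) can be checked numerically (a minimal sketch; the function name is mine): with a = 0, the point returns to the same position in the real/imaginary plane after each period of 2π/b, even though t itself has moved on, which is exactly the jump back to the earlier groove.

```python
import cmath

def helix_point(a, b, t):
    """Return (Re, Im) of exp((a + i*b)*t). Plotted against t, these
    points trace a helix: a circle in the Re/Im plane, advancing
    steadily along the t axis."""
    z = cmath.exp(complex(a, b) * t)
    return z.real, z.imag

# With a = 0, one full revolution later the point is back in the same
# groove of the helix, although t has advanced by 2*pi:
p_start = helix_point(0.0, 1.0, 0.0)
p_one_turn = helix_point(0.0, 1.0, 2 * cmath.pi)
```

A non-zero a makes the helix spiral outwards (a > 0) or inwards (a < 0), rather than repeating exactly.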

Convergence and divergence

A pencil balanced on its point starts by breaking the symmetry of the situation (divergence), but follows through with seeking the least-energy position (convergence), converting potential energy to kinetic energy, and ends with the kinetic energy being dissipated (divergently) as heat and sound. A microprocessor, or long-case clock, leaves out the initial bit, and is always falling convergently. (The pencil balanced on its point can, of course, be replaced by other things, such as the nucleus of a radioactive element that is teetering between decaying now, or not just yet, and takes it into the world of chaos theory.)

From Schrödinger (1944) we can map out:
  • Disorder-from-disorder: such as a stochastically random gas in a balloon
  • Order-from-order: any intricate artefact, such as a microprocessor chip or long-case clock
  • Order-from-disorder (irreversible-from-reversible): exquisitely precise behaviour can emerge from a seething mass of random movement through convergent behaviour and preferential sieves
  • Disorder-from-order: as it all ultimately reverts back again, and all systems eventually turn out to be chaotic (NS, 29-Sep-1990). "Chaos can emerge in the simplest of situations, even if only three particles interact with each other," notably through the three-body problem (NS, 04-Apr-2020, p19). All macro objects, including protons, are just clusters, with a characteristic half-life, and will eventually decay.

With the second law of thermodynamics, energy naturally, and spontaneously, flows from regions of surplus to neighbouring regions of dearth. This is a convergent process: a ball at the top of a hill will inevitably roll down, at some time or other. There can be a divergent element to the process: which way down the hill, which direction of fall for a pencil balanced on its point, which fragments of a previously intact porcelain tea-cup falling to the floor. This divergence is a red herring, and not really a demonstration of the second law of thermodynamics: a computer processing unit will always fall down a highly controlled convergent path in the course of program execution, to the extent that program verification tools can predict how it will execute every time. Similarly, then, milk diffusing in cups of tea, ink diffusing in a glass of water, and possessions jumbling in a bedroom, are equally red herrings.

What is diverging is a proportion of energy dissipated: the ball at the bottom of the hill has lost some potential energy from when it was at the top; similarly the falling pencil, the fragments of porcelain, the particles of milk or ink, and the microprocessor drawing energy out of the battery. Each of these can be reversed, but only by a system that is, in turn, divergently dissipating energy. So, the whole system (the ball being pushed up the hill again by a bull-dozer, the oscillating pendulum being kept in motion by the descending weight via the escapement mechanism, the computer being forced to execute backwards), when viewed from far enough away, simply looks like a convergent system divergently dissipating energy. With radioactive decay, the decay process is convergent and heat-dissipating, like a pencil having been balanced on its point, falling down the chosen fall-line; but which of those atoms? Which particular fall-line for the pencil?

Convergent processes are creators of hidden symmetry, while divergent processes are symmetry-breaking (needing extra bits to specify the initial conditions). The pencil balanced on its point is in a position of unstable equilibrium, and can fall in any direction with equal probability. Once it starts to fall, though, and is not restored by local perturbations, it converges ever more strongly on continuing to fall in the same direction. When viewed in reverse, in the cine projector, the pencil will converge on the unique upright position; the broken porcelain teacup (or scrambled egg) will assemble, like a jigsaw puzzle, on to the only correct solution; and the block of uranium and lead will converge towards being a block of uranium, albeit not necessarily starting with the most recently decayed atoms. Indeed, playing the cine film backwards in the projector, the system converges on a symmetric state: bits are not required to remember the initial conditions, such as which uranium nucleus decayed next, or which direction the pencil fell in, but there is still the question of when the event happened (and is now unhappening in the projector).

Memory

[Figure: T-diagram]

There are three functions that are common to all computers, imperative or declarative, that have ever been built, designed, or merely contemplated: processing, memory and communications. Each one can be characterised by measures of latency and throughput (where the throughput tends to be called the bandwidth, in the case of communications, and the latency tends to be called the execution time, in the case of processing). Communication involves the transfer of information over space, and memory involves the transfer of information over time (albeit only in the forward direction). It is not surprising, therefore, that the two are similar, and are analysed in the same way in Shannon information theory: represented on a "T" diagram, with input data coming in from the left of the diagram, output data leaving from the right, and noise being injected by the medium from the third arm. It is tempting to wonder whether processing, too, can be considered in the same way, as a type of communication, though also noting that processing is the odd one out, since the other two are already united in their reference to the four dimensions of space-time.

The three types of implementation of memory can be categorised as: communications-based (delay line), pure-memory-based (decay law), and processing-based (regenerative gating). For the first two, unintentional losses, and divergent collisions with molecules of the environment, lead to an exponential decay-law and corresponding time-constant; while, for the last case, the unintentional losses are compensated for, by energy input, and still converge on a stable state.

Delay-line memory is an example of how communication in space can be turned into memory simply by looping the communication channel back to the transmitter, and demonstrates the intimate relationship between the way we measure time, and the entropy increase that is involved in memory storage. Similarly, communication in space can be implemented by decay-law memory. For example, ink marks on a sheet of paper can be physically transferred in the postal system, and USB memory sticks can be carried to a new destination.
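A minimal sketch of the delay-line idea (an idealised model; a real mercury or wire delay line would also need amplification and re-timing at the loop-back point, and the class name is mine): looping a fixed-latency channel back to its own transmitter keeps a bit pattern in flight indefinitely.

```python
from collections import deque

class DelayLineMemory:
    """A communication channel of fixed latency, looped back on itself.
    Each tick, one bit arrives at the receiving end and is immediately
    retransmitted, so the stored pattern circulates indefinitely."""

    def __init__(self, bits):
        self.line = deque(bits)    # the bits currently in flight

    def tick(self):
        bit = self.line.popleft()  # bit emerges from the channel...
        self.line.append(bit)      # ...and is fed straight back in
        return bit

memory = DelayLineMemory([1, 0, 1, 1])
readout = [memory.tick() for _ in range(4)]  # one full circulation
```

After a complete circulation the pattern has been read out, yet is still stored: communication in space has become memory in time.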

The wiring, Z=NOR(X,Y), results in the loss of information (if Z is a 1, the values of X and Y are known precisely, but if Z is a 0, there is only a 1-in-3 chance of guessing the correct values of X and Y), so, the "Qbar=NOR(Q,S); Q=NOR(Qbar,R)" SR flip-flop circuit uses this to implement a standard memory cell. It also shows that the laying down of memory, the direction of the processing, and the information destruction of Landauer's principle, are all consistent.

[Figure: SR flip-flop]
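The cross-coupled NOR pair, and the information loss in a single NOR gate, can both be sketched directly (the settling loop below is a simplification of real gate delays):

```python
def NOR(x, y):
    return int(not (x or y))

# Information destruction: three of the four input pairs map to output 0,
# so a 0 at Z cannot be inverted back to a unique (X, Y).
preimages_of_zero = [(x, y) for x in (0, 1) for y in (0, 1)
                     if NOR(x, y) == 0]

def sr_latch(q, s, r):
    """Settle the cross-coupled pair Qbar = NOR(Q, S); Q = NOR(Qbar, R).
    A few passes are enough for the feedback loop to stabilise."""
    for _ in range(4):
        qbar = NOR(q, s)
        q = NOR(qbar, r)
    return q

q = sr_latch(0, s=1, r=0)   # Set
q = sr_latch(q, s=0, r=0)   # Hold: the 1 is regeneratively remembered
q = sr_latch(q, s=0, r=1)   # Reset
```

The hold state is the memory: with S = R = 0, the loop actively converges back on whichever value was last written.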

Implementing communication, memory and processing

Communication in time is memory, but in space is just communication, with a characteristic v=dx/dt. With a servo-motor, that v can be suddenly brought to zero, or reversed, to put the read-head at the chosen position. With processing, the power of the computer, or work done per second, is measured in instructions per second (like distance per second) times the force of the average instruction (the language level of the instruction set). Like the servo-motor, though conventionally run at top speed in the forward (convergent) direction, it can in principle suddenly be brought to zero, or reversed in the backward (divergent) direction. The convergent direction is used to reduce masses of raw data down to headline summaries on PowerPoint slides, or a list of integers into a sorted list (theoretically, a reversible process, since no integers are destroyed), or lumps of radioactive metal into decay products. (Nuclear reactions are easier to engineer to run backwards than chemical reactions, since it is not necessary to collect all the ingredients back again, where antiparticles can simply be emitted instead. However, it does still involve the convergent process of collecting in dissipated heat, and concentrating it back in the atomic nuclei.)

The wheel cycle is achieved by one spoke-like lever after another. A heat-engine's job (to get from A to B, or to raise a load to height h) is achieved by multiple cycles of the component wheels (cranks and cams included). A computer's job (to sculpt away the extraneous information) is achieved by multiple heat-engine jobs.

Biological tissue is composed of heat-engines, lifting loads from high-entropy points to low-entropy points. A biological organ is composed of multiple tissues, each building a separate pile. A biological organism is composed of multiple organs, each building a separate structure (along the lines of Deacon's teleodynamic systems).

Implemented by communications:
  • Communications: communication itself
  • Memory: since relativity and the second law of thermodynamics (TD2) ensure that spatial communication is also a transfer in time, a delay-line loop can implement memory
  • Processing: cannot be done, since passive filters, alone, cannot make an active one

Implemented by memory:
  • Communications: the information is first stored within a physical device (a sheet of paper, a memory chip or floppy disk) and then carried to its destination
  • Memory: only dynamic memory is pure memory (capacitive DRAM, EEPROM, Flash; inductive core or bubble); the others are but engineering emulations
  • Processing: cannot be done, since passive filters, alone, cannot make an active one

Implemented by processing:
  • Communications: the side-effect of getting information from A to B, provided that it is configured as a passive filter, not changing the data that is passing through (the semantic input is set to 1, or 'Id')
  • Memory: since forward and backward computation both have positive execution times, a level-restoring double NOR-gate loop, with parasitic capacitance, can implement memory
  • Processing: processing itself

Processing as a type of communication

The communication theorists' coat of arms, taking on the shape of a T, can be treated as a block in a circuit diagram, with a stream of 1s and 0s coming from the input, being merged in an exclusive-or gate with a stream of 1s and 0s coming from the noise input.

[Input] --->--- [Channel] --->--- [Output]
                    ^
                    |
                 [Noise]
  
[Figure: T-diagram]

Shannon's great insight, though, was to abstract away from this. The input can be represented just as a 2x1 matrix of probabilities of a 1 occurring, or a 0, and the noise input by a 2x2 matrix of the probabilities of a 1 being corrupted to a 0, or vice versa, or to go through unchanged. For this, the operation represented by the block at the node of the T is that of matrix multiplication.
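This matrix view can be sketched numerically (a minimal example; the 1% crossover probability of the binary symmetric channel below is an assumed figure, chosen only for illustration):

```python
def channel_output(p_input, noise):
    """p_input[i] = P(input symbol i); noise[j][i] = P(output j | input i).
    The block at the node of the T is just this matrix multiplication."""
    return [sum(noise[j][i] * p_input[i] for i in range(2))
            for j in range(2)]

p_in = [0.9, 0.1]               # a source that mostly emits 0s
bsc = [[0.99, 0.01],            # P(0 out | 0 in), P(0 out | 1 in)
       [0.01, 0.99]]            # P(1 out | 0 in), P(1 out | 1 in)
p_out = channel_output(p_in, bsc)
```

The output distribution is slightly blurred towards uniformity by the noise matrix, which is the abstraction Shannon needed: the analysis never mentions what the 1s and 0s mean.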

The Shannon "T" diagram can be extended to "TTT", where the middle "T" still represents the point at which noise is being injected by the medium, but the first "T" injects a transformation (G) on the input data, and the last "T" injects the inverse transformation (H) to reconstitute the original data (Moser and Chen, 2012).

[Figure: TTT diagram]

Traditionally, over a low-noise medium (on the second "T"), G_l can implement data compression, and H_l its inverse decompression; while, over a high-noise medium, G_h can add a redundant coding, and H_h its inverse decoding. So, perhaps processing can be modelled using a similar diagram, via the G or H function. Since the action of a computer is to reduce information content, somewhat akin to Michelangelo chiselling out the redundant information to reveal the inner sculpture, we could either model the microcode of the processor as G_l in the low-noise medium, or H_h in the high-noise medium. The former would have H_l as some sort of anti-processing function, able (hypothetically) to beat the Turing Halting Problem. Alternatively, the latter model would have G_h as some sort of obfuscating process. An example of this might be the universe obeying very simple laws, like those of the principle of least action (Coopersmith 2017), but presenting very complicated emergent behaviour to us on the first "T", with Tycho Brahe's observations at the second "T", and Kepler's tomographic processing of the data back down to simple laws at the last "T".
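Over a low-noise channel, the G_l/H_l pair can be sketched with any off-the-shelf lossless codec (zlib here, purely as an example of an invertible transformation; nothing in the argument depends on this particular codec):

```python
import zlib

# A repetitive message, so there is genuine redundancy for G_l to remove.
message = b"cycles within cycles within cycles within cycles"

g_out = zlib.compress(message)    # G_l: squeeze out the redundancy
h_out = zlib.decompress(g_out)    # H_l: the exact inverse, reconstituting
                                  #      the original, bit for bit
```

The crucial asymmetry in the text remains: an exact inverse H_l exists for compression, but no such inverse exists for the information-destroying processing of a conventional CPU.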

One question that this then poses is whether it is necessary to consider the inverse transformation, even if it is not used in general practice. This would only be possible if the CPU is implemented as a reversible computer, using Fredkin gates rather than information-destroying NAND and NOR.

Both Shannon and Turing abstracted away from the details of the implementation. A channel carries information, regardless of what that information might be (and there are no violins inside the jukebox); a computation proceeds through the steps of the algorithm, regardless of the hardware that is being used. The locomotive could be transporting information, in its train of carriages, from Paddington to Temple Meads; working as a UTM, though, that information never arrives there, and is merely shuffled up and down the Turing tape. So, perhaps processing is not a type of communication, after all.

If the information in the carriages is used en route, it becomes part of the state machine. Analogous to Kirchhoff's current law, if more information is flowing in to a point than is flowing out, the surplus must either be going in to storage or else leaking out as dissipation. (If more information is flowing out of a point than is flowing in, the surplus must either be coming out of storage (and erased from there) or else leaking in from the outside, either way involving the in-flow of energy.)

At the next MC level on from this, are we talking about semantics, meaning, empathy, beliefs, consciousness? Perhaps simply arranging for the steam locomotive on the track to author a story. Stories acquire new information each time they are retold (Alan Turing sipping the froth of his glass of beer, Elizabeth Bennet saying "save your breath to cool your porridge", Dawkins on the use of "virgin" in place of "maiden"), just as a pencil balanced on its point acquires new initial-condition, symmetry-breaking information as it starts to fall. Jane Austen dramatisations are presented as nice and rosy stories, rather than as aggressively biting observations of people, and orchestrations of Bohemian Rhapsody as sweet gentle tunes.

Non-commutativity will arise when there are side-effects involved. (This is not a necessary condition, though, since symmetry can lead to non-commutativity in the sequential quarter-turn rotations of an object about its x, y and z axes.)
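The rotation example can be checked directly (a minimal sketch using quarter-turns on integer coordinates, so no floating point is involved; the function names are mine):

```python
def quarter_turn_x(v):
    """Rotate (x, y, z) by 90 degrees about the x axis."""
    x, y, z = v
    return (x, -z, y)

def quarter_turn_y(v):
    """Rotate (x, y, z) by 90 degrees about the y axis."""
    x, y, z = v
    return (z, y, -x)

v = (1, 2, 3)
xy = quarter_turn_y(quarter_turn_x(v))  # rotate about x first, then y
yx = quarter_turn_x(quarter_turn_y(v))  # rotate about y first, then x
# The two orders leave the object in different orientations:
# rotations in three dimensions do not commute.
```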

In by-gone days, I wrote my files to floppy disk at night, communicating them to the future, to read back the next day (perhaps on another computer). When I read my files from floppy disk the next day, to write back that night (having edited or otherwise modified them), was I communicating in some way? I would say "I am editing" or "I am compiling" or "I am processing". "To edit" is to merge two streams of data: the old version of the syntactic information from the disk, and the incremental semantic changes within my head. "To compile" is to merge the syntactic source code information (already on the disk) with semantic rewrite rules embodied by the compiler. "To process" is to merge the syntactic input file with semantic rewrite rules embodied by the instruction set.

Processing is necessary for presenting information (such as tables, graphs, lists, and diagrams); syntactic compression, too, for transmission or storage. The informed user sends information to the uninformed user in a different mind-set: one user has the raw data, and has to work out the results (maybe in his head); the other is given the results, ready and all worked out. Processing is necessary to establish relevance. Michelangelo still needed to chip away the unwanted stone, to reveal the already existing statue. You want to learn about the Cretaceous period? Here is a mountain of limestone layers. You want to learn how to use your HP Vectra PC? Here are the circuit diagrams and object code. You want to learn about parquet flooring? Here is an un-indexed pile of back-issues of DIY magazines. You want to learn about the Norman Conquest? Here are the current positions and velocities of all the particles in the universe. This is related to the question of whether chaotic outcomes, such as the value of the next pixel in a Mandelbrot set, are pre-computable like the outcome of a coin being tossed.

                            Communications   Memory                     Processing                       Contextualisation   Empathy
  Speech and I/O            passive          topological (delay line)   -                                -                   -
  Writing and storage       active           passive                    topological (mapped on the page) -                   -
  Processing and CPU        -                active (regenerative)      passive                          topological         -
  Semantics and next MC     -                -                          active (Platonic space)          passive             topological
  Empathy and MC after that -                -                          -                                active              passive

Topological, passive and active

Speech: an invention between the lever and the wheel; for communicating information (communicating it in space). Writing (and later printing): an invention between the wheel and heat-engine; for communicating information, and of holding it in memory, too (communicating it in time, albeit only forwards). The telephone (and later email): an invention between the heat-engine and computer; for communicating information, and memory and processing (with deductive logic in the telephone exchange). The Semantics System: an invention between the computer and the next MC; for communicating, memorising, processing information, and something else: contextualisation, perhaps. The Empathy System: an invention between the next MC and the MC after that.

Human speech is passive communication: the message appearing over time (Table above). Human speech is topological memory: folk history passed by delay-line word-of-mouth. Human writing is active communication: written communications are more concrete than spoken ones; and written contracts more than oral ones. Human writing is passive memory: ink on paper has a time constant. Human writing is topological processing: the successive states of a computation mapped out across the page (Table below). Human processing is active memory: regeneratively repeating something over and over while it is still being needed. Human processing is passive processing: by the same token as the naming of passive communication and passive memory. Human processing is topological contextualisation: whatever the next MC does. The human equivalent of the next MC is active processing: mathematicians discover truths in Platonic space, perhaps. The human equivalent of the next MC is passive contextualisation: whatever the next MC does. The human equivalent of the next MC is topological empathy: whatever the MC after that does.

In speech, it is the human mind that provides the memory; passing down a story by word-of-mouth. In writing, it is the human mind that provides the processing. In computing, it is the human mind that provides the semantics, appreciated by the programmer, not the computer. The value '6', in 'CELL', is ungrounded without the external context: is it six dollars, six balls in a cricket over, six sides for a polygon, the sixth time round the loop, an RTT instruction, or something else?

                             a                               b                           r
     (initial values)        33932                           68034
  1: r := a mod b                                                                        33932, 170, 102, 68, 34, 0
  2: a := b                  68034, 33932, 170, 102, 68, 34
  3: b := r                                                  33932, 170, 102, 68, 34, 0
  4: IF b > 0 THEN GOTO 1

This rolls out in time as follows. The column on the left contains the contents of the program counter (which might or might not be memory-mapped), cycling six times over the labels L1, L2, L3 and L4; each cell can either show no change in its value, or a single new value written in; and the four instructions are listed horizontally, along with a, b and r (assuming a stored-program architecture), just as an array of variables would be depicted, indexed by the program counter, but with values that are never changed (since there is no self-modifying code).

  PC   a       b       r       (program, unchanging, in locations L0..L5:
                                L0: NOP;  L1: r := a mod b;  L2: a := b;
                                L3: b := r;  L4: IF b > 0 THEN GOTO L1;  L5: HALT)
  L1   33932   68034
  L2                   33932
  L3   68034
  L4           33932
  L1
  L2                   170
  L3   33932
  L4           170
  L1
  L2                   102
  L3   170
  L4           102
  L1
  L2                   68
  L3   102
  L4           68
  L1
  L2                   34
  L3   68
  L4           34
  L1
  L2                   0
  L3   34
  L4           0
  L5
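The trace above can be reproduced directly (a sketch of the same four-instruction program in Python; the function name is mine):

```python
def gcd_trace(a, b):
    """Run the four-instruction Euclid program from the table above,
    recording every value taken by r. Returns (gcd, list of r values)."""
    rs = []
    while True:
        r = a % b           # L1: r := a mod b
        rs.append(r)
        a = b               # L2: a := b
        b = r               # L3: b := r
        if not b > 0:       # L4: IF b > 0 THEN GOTO L1
            return a, rs    # L5: HALT (the answer is left in a)

result, trace = gcd_trace(33932, 68034)
```

The successive remainders match the values written into r in the trace, and the final content of a, 34, is the greatest common divisor.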

On to the next chapter or back to the table of contents.

© Malcolm Shute, Valley d'Aigues Research, 2006-2024