

Principles of limitation

The corner-stones of the theory of operation of the previous machine classes have come down to principles of limitation: the Turing halting problem for computers, Heisenberg's uncertainty principle for mechanisms. The prospect of constructing the next machine class, by building machines upon machines and contemplating cycles within cycles, bodes well that a similar principle might be found for it. This chapter starts by considering latency and throughput, and then the dissipation of matter (the flux of molecules in and out of an object), the dissipation of energy (appearing as the second law of thermodynamics), and the dissipation of information (appearing as decoherence and the measurement problem).

It is possible to generalise the application of half-life as a statistical parameter, and to assign meanings to negative and imaginary half-life. Being a statistical bulk parameter, it leads to the laws of cooling, with implications of forced reversibility.

The principles of uncertainty and incompleteness can be generalised. Human consciousness seems to have a need for the notion of Now in the same way that an imperative-language computer program needs a program counter. This generalisation leads to the possibility of it being used as a tool of speculation.

Latency and throughput

All the MCs so far, not least the computer, have three main organs: processing, storage and input/output.

A chemical reaction is a matter handler, taking in matter (the reagents), processing it (the reaction itself), storing it in reservoirs, but also in the delay-line capacitance of the pipes (and barges on canals), and outputting the result (the reaction products).

A heat-engine or transducer is an energy handler, taking in energy via the fuel inlet, processing it (converting chemical energy to kinetic) in the fire box and in the cylinder, storing it in the fly-wheel and boiler, and outputting the result via the drive shaft.

A computer or logic gate is a syntactic information handler, taking in syntactic information through the resistance buffering, processing it in a cross-coupled transistor pair, storing it in the parasitic capacitance of all the internal connections, and outputting the result in the amplifier chain. Equivalently, at the software level: input, output and storage of source and object code, with processing by rewrite rules (and pedagogical pruning of the message for the assembled audience), while keeping the semantics unchanged.

A beliefs machine will be a semantics handler, and an emotions machine an empathy handler, taking it in, processing it, storing it, and outputting the result: original and final stories, in which the semantics have been changed, while keeping the empathy unchanged.

T-diagram

With the proposed fourth law of thermodynamics, some of the flowing energy goes into temporary storage, and hence takes time before it is eventually released. Whenever there is less of anything (energy, matter, whatever) coming out of a system (throughput) than there is going in, the difference must be delayed in internal storage (latency) and must be growing in magnitude. As matter accumulates, the pipeline becomes heavier; as energy accumulates, the energy store not only has more energy to release again, but it becomes hotter, scaled by the heat capacity of the system; as information accumulates, the memory bank becomes more occupied, but little else changes (in any case, information can be copied as it passes out of the pipeline, so still retained, in a way that matter and energy cannot).
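
As a minimal sketch of this accumulation (Python; the inflow, outflow capacity and number of steps are invented illustrative figures), the following models a pipeline whose throughput is lower than its input, so that the difference piles up in internal storage and the latency of the backlog grows step by step:

    # Toy pipeline: whatever does not leave as throughput accumulates as internal storage.
    inflow_per_step = 10.0     # units entering the system each time step (invented figure)
    outflow_capacity = 7.0     # maximum units the system can emit each time step (invented figure)
    stored = 0.0               # internal storage: the growing backlog

    for step in range(1, 11):
        stored += inflow_per_step                  # everything that comes in
        emitted = min(outflow_capacity, stored)    # throughput is capped
        stored -= emitted
        # Latency estimate: time for the current backlog to drain at full capacity.
        latency = stored / outflow_capacity
        print(f"step {step:2d}: emitted {emitted:4.1f}, stored {stored:5.1f}, "
              f"latency ~ {latency:4.1f} steps")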

Dissipation of molecules

All structure is temporary, and will eventually break up. A chair is in a flux of exchanging surface molecules with the air, the floor, and the sitter's bottom. Structured objects are just a convenient modularisation in the mind of the human observer, to cut down the amount of computation required (handling the chair, mentally, as a single particle, rather than tracking all 10^24, or more, of its constituents individually). By this argument, even the proton has a half-life, albeit possibly longer than the lifetime of the universe. To be considered as a structure, though, there must be some persistence mechanism at work, like a potential energy well, pulling the components back together, until one day chaotic behaviour, or a freak event, sends one of them flying over the rim of the well.

Each structure serves as a stable platform on which further structures can establish themselves (NS, 27-Apr-2019, p53), either applying restoring forces, or adding further structure to the already stable platform. Stable patterns building themselves on stable platforms is to bottom-up design as divide and conquer is to top-down design (NS, 10-May-2008, p52), but with the added notion of emergent behaviour. For the human observer, this is partly abstraction, to allow us to handle the dimension of increasing complexity (NS, 14-Feb-2009, p36). However, it also genuinely describes systems with different laws, a top-down causation (NS, 17-Aug-2013, p28). This is even apparent at the classical level. For reasons of abstraction, we give labels to classifiable objects (chairs, rivers, human beings) wherever there is a convergence establishing and maintaining those groupings. A river is an identifiable object, whose positions and movements over several decades can be predicted, given a knowledge of the geology of the region, but a detailed knowledge of the position and velocity of each and every one of the 10^38 (or so) water molecules over the decades would not improve the model. Although the generic physical and chemical properties of water molecules do need to be taken into account, we are used to the river itself being a stand-alone object that cannot be reduced to a description of the positions and movements within the lower level. There truly is an emergent behaviour going in the bottom-up direction that thwarts any attempt at reductionism in the top-down direction. The process of reductionism fails whenever the component particles are in constant turn-over, able to be substituted interchangeably, or merged into and out of the background (through evaporation and condensation, for example). It can be noted that the human body, too, consists of a continuous turn-over, or flux, of atoms and molecules (NS, 12-Dec-2020, p36).

Dissipation of energy

As a cue ball travels across a snooker table, and strikes a glancing blow on a red ball (where, for example, the cue ball could represent a molecule of hot gas, and the red ball could represent an atom of a metal piston), the collision transfers some of its momentum to the red ball. But if the cue ball undergoes this transfer when striking the red ball, it must also be undergoing countless mini-collisions with the molecules of the baize and the surrounding air as it traverses the table (and similarly the red ball, once it too is in motion). Moreover, at the moment of the collision, there is a flurry of internal collisions between the neighbouring atoms within each ball, given that neither is a perfectly rigid body. This is followed by a divergent cascade of collision events between these newly excited molecules and their next nearest neighbours. Every one of these collisions is, in theory, perfectly reversible. However, in any attempt to reverse the whole process, we would naturally focus on reversing the main collision, of the red and white balls, and getting the original energy back from that. It would be beyond our computational ability to arrange for each of the mini-collisions to be convergently reversed. So, instead, we let them be substituted by a roughly equal number of new divergent forward collisions, thereby dissipating yet more of the balls' energies. As a result, the collision will not be perfectly reversed, and we will not get back all the energy that we originally put in.

When two molecules collide, and undergo an endothermic chemical reaction, the newly-formed chemical bonds take up the kinetic energy, which makes that region of the substance appear cooler, so heat energy from the rest of the substance dissipates divergently, as normal, into that cooler region. So it is still divergence at work, not convergence.

Dissipation of information

While the module is modelled with bulk parameters, assuming a statistical approach, a degree of uncertainty, due to the statistical averaging process, is inevitably involved. The second law of thermodynamics arises in the human mind as a consequence of the modular modelling process. It remains to be shown that, just as the binomial distribution can be shown to merge with the Poisson and normal distributions, so it can also merge with the power law of chaos theory, and with the violation of Bell's theorem.

If each event could be tracked individually, the averaging would be replaced by deterministic certainty. Determinism requires the principle of causality to hold (that every effect has a prior cause). This is the principle of unitarity, and requires there to be a conservation law for information, and leads, amongst other things, to the black-hole firewall paradox. The measurement problem is another consequence: the quantum system is described exactly by the wave-equation, until the boundary point at which we consider the atoms to belong to the probing measurement device, and for non-quantum, classical behaviour to start.

The process of decoherence can either be looked on as a steady spreading of the extent of the quantum system, to include more and more atoms into the wave-equations, or else as the dissipative leaking of energy and information out to atoms that are considered to be outside the quantum system. It amounts to the same thing, but the former view can be considered the realm of the reactive spreading, with imaginary components in the exponential function, and the latter to be the realm of resistive dissipation with real components in the exponential function. The latter view dismisses the dissipation as losses, that are no longer tracked in the wave-equation, but are treated statistically as a bulk parameter. It is the realm where the second law of thermodynamics starts to have application; in the reactive view, it had no application. It can also be thought of as the change from science as the viewpoint, to engineering: from detailed tracking of each component, to considering regions as having bulked black-box behaviour that can be used as components of further systems. Decoherence and energy dissipation appear to be the same mechanism, according to this snooker-ball view. Decoherence is just another mechanism for cooling, like conduction, convection and radiation, with a corresponding cooling law, and the idea of driving a Crooks' engine from the mechanism.

Whenever entanglement is increasing (noting that decoherence is entanglement with the individual constituents of the environment), entropy is increasing and, by the corollary of the second law of thermodynamics, energy necessarily flows. The constituents are constantly milling around, redistributing positions and velocities, with no new configuration significantly different from any other, and each one, including the initial starting configuration, now long destroyed, unlikely ever to occur again at random. Decoherence is a divergent process, like heat dissipation, caused by information dissipating through entanglement events throughout the environment. Jostling and collisions persist all the way down towards absolute zero (NS, 18-Mar-2017, p10).

What is really meant in quantum mechanics by the terms 'observation' and 'measurement' is 'confined to an eigenstate', and is similar to the way that any sandy beach will passively interact with the incoming waves, and cause them to break, with or without any conscious mind being present; or a tossed six-sided die will have its probabilities redistributed by encountering a table surface. The coherent part of the interaction is time reversible, and the incoherent part is irreversible; it is the latter that is considered to constitute an observation or measurement, and which results in the collapse of the wave-function. The extra term that objective collapse theorists add to the Schrödinger equation, to account for the random component of the collapse event, comes more naturally from the decoherence/back-action term (NS, 28-Mar-2020, p34), thereby involving the whole system and its immediate environment in the equation (the measurement probe device, of course, being made of quantum particles, too).

Dissipation in BM/EM

Perhaps all program creation can be viewed as a type of sort process. Indeed, an engineering student in 1975 could maintain a stock of commonly used punched cards, to edit program decks over the weekend, for submission before a deadline on Monday morning (though this never really worked in practice). Writing a computer program, therefore, can be imagined as a type of sort operation on an indefinitely large deck of punch cards, with the wanted cards sorted before the "STOP" and "END" partition, and the remainder left unsorted after it, like junk DNA.

CPU Board

In The Bottle Experiment, one (or perhaps many) Z80 processor(s) would be wired up on a small circuit board with a memory chip, and an input from the outside world that would inject noise, at a very low level, onto the data bus. This noise would make blank instructions coming from the empty memory chip look like non-blank instructions. Gradually, the memory would fill up with noise, but with stable patterns establishing themselves (by definition, stable patterns persist, while unstable ones die away quickly). So, Tierra-like, it could be left running day and night, for months on end, to see what patterns evolved in the memory. As to what input to connect to the data bus, one thought was to feed it with articles from selected groups of Usenet.

It was uncertain as to whether anything interesting would indeed evolve, and to what (self-assigned) purpose. There are many layers of supporting information missing between the low-level AI's workings and the surface-layer patterns in tweets and the comp.ai.philosophy newsgroup of Usenet. That was one of the major lessons learned on the Cyc project of the 1990s. The aim was to connect an AI up to an encyclopedia, and see what emerged. One early stumbling block was finding that a large part of the knowledge about the world, common-sense knowledge, was not normally found in an encyclopedia (such as, "If the president is in the Oval Office, where in the world is his left leg?").

By this token, therefore, human consciousness is self supporting: it focuses on what is important, but what is important is what it focuses on; or better, West's Law, 'anything you're paying attention to isn't as important as you think it is.' Patterns emerge at a certain level of granularity, and not at others, by natural means.

Many problems in engineering seem to come down to judging the level of granularity, and the subjective input required to choose the cut-off point. A good example is the design of a phase-locked loop. This is a simple electronic circuit that latches on to the incoming signal, despite any noise that is present on that signal. It includes a low-pass filter, to allow it to smooth over any periods of excessive noise. But who decides the cut-off frequency of that low-pass filter? And, therefore, who decides which part of the input is signal, and which part is noise? One man's noise is another man's Ph.D. Problems of judging between coarse-grain structure and fine-grain randomness, between signal and noise, between call-by-value and call-by-reference (especially in modular document maintenance), therefore end up as manifestations of the Turing halting problem (and are not resolvable in a generic, waving-of-a-magic-wand sort of way).
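
A minimal sketch of that judgement call (Python, with NumPy assumed to be available; the tone frequency, noise level and candidate cut-offs are all invented): a first-order low-pass filter is applied to a noisy 5 Hz tone, and the result depends entirely on the designer's choice of cut-off frequency, which is precisely the subjective decision about which part of the input is signal and which part is noise:

    import numpy as np

    fs = 1000.0                                        # sample rate, Hz (invented)
    t = np.arange(0, 1, 1 / fs)
    signal = np.sin(2 * np.pi * 5 * t)                 # the 5 Hz tone we choose to call "signal"
    noisy = signal + 0.5 * np.random.randn(t.size)     # plus broadband "noise"

    def first_order_lowpass(x, cutoff_hz, fs):
        # Simple first-order IIR low-pass; the cut-off is the designer's judgement.
        rc = 1.0 / (2 * np.pi * cutoff_hz)
        alpha = (1 / fs) / (rc + 1 / fs)
        y = np.empty_like(x)
        y[0] = x[0]
        for i in range(1, len(x)):
            y[i] = y[i - 1] + alpha * (x[i] - y[i - 1])
        return y

    for cutoff in (2.0, 10.0, 100.0):                  # three different opinions on where noise starts
        smoothed = first_order_lowpass(noisy, cutoff, fs)
        err = np.sqrt(np.mean((smoothed - signal) ** 2))
        print(f"cut-off {cutoff:6.1f} Hz: rms error against the chosen 'signal' = {err:.3f}")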

Half-life

The half-life of an event is the period of time over which, statistically, the chances of the event happening are 50%. Traditionally, it is applied to the decay event of a radioactive atomic nucleus. However, it can be equally applied to the chances of two molecules undergoing a chemical reaction, or simply one vibrating molecule passing on some of its vibrational energy to a neighbour (thermal conduction). In all cases, it involves the tendency, under the second law of thermodynamics, for all things to fall into a position of least energy, and not be able to fall out again; and as de Bono observes, the tendency of things not to unhappen is the basis of memory, and hence ties it in to Landauer's principle.
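
As a small Monte Carlo sketch of half-life as a statistical bulk parameter (Python; the per-step decay probability and population size are invented figures), each survivor is given the same small chance of falling into its well per time step, and the time by which half the population has done so matches t_half = ln 2 / lambda:

    import math
    import random

    p_decay = 0.01               # chance per time step that any one survivor decays (invented)
    population = 100_000
    survivors = population
    steps = 0

    while survivors > population // 2:       # run until half the population has decayed
        survivors = sum(1 for _ in range(survivors) if random.random() > p_decay)
        steps += 1

    lam = -math.log(1 - p_decay)             # decay constant per time step
    print(f"simulated half-life : {steps} steps")
    print(f"theoretical ln2/lam : {math.log(2) / lam:.1f} steps")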

An antique porcelain teacup, or a Roman statue, has an inevitable eventual state of being smashed, but has a half-life on its survival before then, during which it becomes an ever more valuable, and rare, survivor. The antique porcelain teacup also has a half-life of remaining in the shopkeeper's window, before it is eventually sold at its exorbitant ticket price. The half-life is not necessarily a simple constant, but a function of other parameters: such as the quality of the workmanship, or the ratio of the ticket price to the market estimate.

Porcelain teacups are tending monotonically to the broken state, as is everything else, including: human lives (where the half-life is a function summed up in swathes of actuarial tables); functioning computers; a given law of physics being refuted; buying lottery tickets (or tossing assemblies of coins or dice) until a winning combination comes up; conversion of the universe's matter to iron.

With a black-hole, an external observer can never see material crossing the event horizon, since the signal becomes asymptotically red-shifted, and weak. If this material includes an intrepid astronaut, the external observer cannot know if the astronaut fired his rockets at the very last moment, just before crossing the point of no return. So, just as Bekenstein and Hawking used black-holes to make a link to the laws of thermodynamics, so they can also be used to link to Popper's model of scientific method: we can come up with a theory that the astronaut eventually crossed the event horizon, and this theory can be refuted (by noticing the astronaut re-emerging, perhaps decades or centuries later), but no-one can definitively prove it.

The advancement of human knowledge is a monotonic path, provided there are not too many major set-backs like the burning of the library of Alexandria, or the fall of the Roman Empire. The number of attendees joining an on-line meeting close to the allotted hour shows how it is connected, too, to queuing theory.

Negative and imaginary half-life

Working backwards in time from the current number of radioactive nuclei involves a doubling for each half-life step back in time, until some limit value is reached (the known starting population) and is the principle behind radioactive element dating. This highlights how most forward processes (with a positive half-life) require there to be a flux: a starting level of carbon-14 that is maintained while the organism is alive; a maintained level of chemical reagents or biological nutrients; a routine process of rewinding the grandfather clock, refueling the car, replenishing the rounds of drinks during student evening discussions. It is this sense that things are continuously running out, or running down, that gives us the sense of the flow of time. Even the act of replenishing those things merely moves the problem to something else being run down.
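
A minimal worked example of that back-calculation (Python; the measured fraction is an invented figure, the 5730-year half-life of carbon-14 is the standard value): counting how many halvings separate the measured level from the maintained starting level gives the age directly:

    import math

    half_life_c14 = 5730.0     # years, standard half-life of carbon-14
    fraction_left = 0.25       # measured C-14 relative to the living-organism level (invented)

    halvings = math.log2(1 / fraction_left)    # each halving is one half-life step back in time
    age = halvings * half_life_c14
    print(f"{fraction_left:.0%} remaining -> {halvings:.1f} half-lives -> about {age:.0f} years old")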

The perfect pendulum has a half-life that is imaginary. The resistive (real) component of the descending weight part of the clock is using its energy store to maintain the timeless, reactive (imaginary) component of the pendulum part in its original low entropy state. Similarly, the metabolism of a living cell is able to maintain the contents within the cell membrane, behaving like the grandfather clock, simply as a machine that just happens to find itself existing.
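
To make the real/imaginary distinction concrete, a small sketch (Python; the rate constant is an invented figure) evaluates N(t) = N0.e^(-lambda.t) twice: once with a real lambda (resistive decay, a genuine half-life) and once with a purely imaginary lambda (reactive, lossless oscillation, such as the ideal pendulum, whose amplitude never decays):

    import cmath

    N0 = 1.0
    for label, lam in (("real (resistive)    ", 0.1), ("imaginary (reactive) ", 0.1j)):
        samples = [N0 * cmath.exp(-lam * t) for t in range(0, 60, 10)]
        magnitudes = ", ".join(f"{abs(z):.2f}" for z in samples)
        print(f"{label} lambda: |N(t)| = {magnitudes}")
    # The real case shrinks towards zero (a finite half-life); the imaginary case
    # keeps |N(t)| = 1 for ever (no half-life, only oscillation).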

Laws of cooling

The laws of cooling (Fourier's, Newton's and Stefan's, respectively for conduction, convection and radiation) capture much of the zeroth, second and third laws of thermodynamics: the net flow of energy is zero if there is no temperature difference between the parts, the rate at which energy flows from the colder part to the hotter part is a negative quantity, and hence there can never be a place that can cool a region down to absolute zero. Moreover, the rate at which energy flows from the hotter region to the cooler one is proportional to the temperature difference between the two regions. Rate of energy flow is power, and is automatically positive in the direction of flow from the hotter region to the cooler one. Similarly, Crooks' fluctuation theorem: if the dynamics of the system satisfies microscopic reversibility, then the forward time trajectory is exponentially more likely than the reverse, given that it produces entropy. In space, the energy flow from the hotter region to the cooler one manifests as a force.
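
A minimal numerical sketch of Newton's law of cooling (Python; the temperatures, cooling coefficient and time step are invented figures): integrating dT/dt = -k(T - T_env) shows the flow always running from the hotter region to the cooler one, at a rate proportional to the temperature difference, and never overshooting past equality:

    T = 90.0        # temperature of the hot object, deg C (invented)
    T_env = 20.0    # temperature of the environment, deg C (invented)
    k = 0.05        # cooling coefficient, per minute (invented)
    dt = 1.0        # time step, minutes

    for minute in range(0, 61, 10):
        # The power flowing out is proportional to the remaining temperature difference.
        print(f"t = {minute:3d} min: T = {T:5.1f} C, flow proportional to {T - T_env:5.1f}")
        for _ in range(10):
            T += -k * (T - T_env) * dt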

One source of asymmetry is time dilation in general relativity, as opposed to the symmetrical view taken of it in special relativity and Newtonian physics. The laws of cooling are another, with a built-in double negative to impede energy flowing from cool bodies towards hotter ones. Of the four axes in space-time, we tend to think of time as being the odd one out, in that we are not free to move backwards as well as forwards; but maybe it is time that is the normal case, and the three spatial axes are the odd ones out. There are many parameters in physics that are unable to take on negative values: the magnitude of a vector; the dot-product of a vector with itself, as in ½m.(v.v); the amount of energy or mass; the frequency of a wave; the pressure, volume, temperature and number of moles of substance; the amount of substance being monitored in a clock (water, sand, the cord supporting a weight); entropy change; and time. These are constrained to being positive quantities, but position and momentum in the three spatial axes can take on negative values (or positive values but pointing in the opposite direction), relative to some arbitrary origin.

When a parameter has no valid meaning defined for it for negative values, it suggests that the parameter should be viewed logarithmically, ranging from 10^-p to 10^+p. Indeed, each report of a new world record in temperatures achieved in the laboratory (highest or lowest) is reported using such 10^+p K or 10^-p K notation. One possible implication is that our linear scale for measuring time is arbitrary, too, and merely a human invention, and that time should be measured on a logarithmic scale, as we are happy to do when measuring pH. Negative values of linear time would then be just as imaginary as a point that is north of the north pole. If this were the case, though, our physics experiments and Noether's theorem should have shown it up. However, in all our real-world values, t is just a minuscule deviation either side of the 13.8-billion-year base, t0, in the expansion ln(t0 + t) = ln(t0) + ln(1 + t/t0), which might explain why we think of the time dimension as looking just about linear.
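
In LaTeX form, the expansion being appealed to is just the linearisation of the logarithm about the present age of the universe, t0 (the 13.8 Gyr figure is from the text; the rest is standard algebra):

    \[
      \ln(t_0 + t) \;=\; \ln t_0 + \ln\!\left(1 + \frac{t}{t_0}\right)
      \;\approx\; \ln t_0 + \frac{t}{t_0},
      \qquad |t| \ll t_0 \approx 13.8\ \text{Gyr},
    \]

so over any humanly accessible interval a logarithmic clock would be indistinguishable from a linear one.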

There is also a built in bias in our definitions of those parameters in physics. We think of acceleration as being due to an applied force, but this means that where there is an acceleration (due to time dilation for example) it automatically appears to us to be due to a force (gravity in this case). Similarly, a de facto expansion of space-time appears to us as a velocity, and its acceleration as being rooted in a dark energy, and as noted above, regions of increasing entropy must therefore contain an energy flow.

Forced reversibility

Since the second law of thermodynamics features so heavily in the above discussion, it is instructive to explore the implications of what would happen if it were possible to have a house inside which entropy and time run backwards. People could place things just inside the entrance (broken teacups, decaying bunches of flowers) and pick them up the next day fully restored. From our perspective, the house would need a power supply. Meanwhile, the neuron machines of the occupant of the house would be thinking forwards from the moment of taking in the restored objects, then watching them decay, and on to the moment of putting them out at the door. Looking out of the window, it would appear to him that it was for the rest of the universe that time was running backwards, and necessarily connected to a power supply. Indeed, it would appear to be his house that is supplying at least some of that energy, via the power-supply connection that we outsiders think is supplying his house. Plus, he would see his energy source as being the heat coming in through his roof. However, to paraphrase Viktor Toth, on Quora, our universe is constrained by a past low-entropy boundary condition, while the house would be constrained by a future (as seen from the outside) low-entropy boundary condition, and the two are hard to reconcile unless you introduce things like event horizons.

The breaking teacup, rotting bunch of flowers, and clockwork Turing machine executing BubbleSort on a list of integers, are all convergent processes, but generating heat (whose dissipation is a divergent process); reversing the former can be framed as a different convergent process, but the reverse of the latter cannot be framed as a divergent process. To the occupant of the house, the power source is the heating coming in through the roof, and the dissipation is out through the mains cable, so convergent. Perhaps we could get round this simply by tracing that low entropy origin back round, via the energy input through the roof, to the CMB or Big Bang of the same external universe as we use. After all, we do not perceive our heat dissipation as powering the rest of the universe, and for a passenger in a wind-up clockwork car, the mass of the car would be decreasing as the energy of the spring is used up, seemingly powering the rest of the world to whizz by, but the human mind does not default to this perspective, but tries to imagine a third-person, objective view of the car viewed from outside.

Inside a living cell, the cell membrane stands for the walls and roof of the house, and the DNA and protein computing engines stand for the occupant of the house. Any damage to these is repaired by that machinery, keeping the entropy low, at the expense of having to take in low-entropy nutrients through the cell membrane, and to expel high-entropy waste products. To those proteins, it feels like the clocks are constantly being set back. Like the resetting of a water clock, or a graduated candle, the unwanted DNA computation is constantly being re-initialised. This sounds like it could be the reversal process that is being sought in the steam locomotive Turing machine analogy. Except, for that to work, it would need to be a reversed computation period, not simply a re-initialised computation event. We could ask what would happen if we occasionally stopped the computer, mid execution, and forced it to run backwards for a while, before resuming its forward run from that new point. The forward-running computer is on a convergent process (there is only one correct list of those integers arranged in ascending order), so the backward process is divergent (there are many, many ways to unsort, or jumble, the list). In art, this manifests as the creation of tension and resolution. In Western music, it appears as motifs of notes arranged in an ascending phrase, followed by motifs of notes arranged in a correspondingly descending phrase. It seems to mimic the tone of the human voice expressing a question, and then offering a response. It also seems to echo the way a circle of dancers needs to do an initial routine of steps, followed by its reversal, to bring everyone back to their initial positions, before finally moving on to the next stage of the dance. Just as Jane Austen's Pride and Prejudice, or Mozart's Turkish March, or a dance round a maypole, are abstract concepts, compared to the details of their content, so perhaps a forward- and backward-running computer program might carve out a more abstract object in some design space.
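
A small sketch (Python) of that convergent/divergent asymmetry: BubbleSort run forwards is convergent (there is only one sorted outcome), and it can be run backwards exactly only if every swap was recorded on the way; throw that record away, and the reverse step is divergent, with factorially many candidate unsortings:

    import math
    import random

    def bubble_sort_recording(values):
        # Sort a copy in ascending order, recording every swap so the run can be reversed exactly.
        a = list(values)
        swaps = []
        for i in range(len(a)):
            for j in range(len(a) - 1 - i):
                if a[j] > a[j + 1]:
                    a[j], a[j + 1] = a[j + 1], a[j]
                    swaps.append(j)
        return a, swaps

    data = random.sample(range(100), 8)
    sorted_data, swaps = bubble_sort_recording(data)

    # Exact reversal: replay the recorded swaps backwards (the information was kept).
    restored = list(sorted_data)
    for j in reversed(swaps):
        restored[j], restored[j + 1] = restored[j + 1], restored[j]
    assert restored == data

    # Without the record, "reversing" is divergent: any of n! arrangements is a candidate.
    print(f"recorded swaps: {len(swaps)}; "
          f"possible unsortings without the record: {math.factorial(len(data))}")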

Principles of uncertainty and incompleteness

Wave-particle handler

Fourier analysis gives Δk.Δx≥1 as an inherent limitation, where k is the wavenumber (the spatial frequency) equal to 2π/λ: the sharper and more precise a pulse is in time (or laid out in space) the wider the band of frequencies needed to define it; conversely, the more discerning a spectrum analyser is, the longer the sample needs to be to categorise the wave. Then, with just one result from quantum mechanics, E=hf, Heisenberg's uncertainty principle drops out: ΔE.Δt≥ℏ/2. Within quantum mechanics, this result conveys the discrepancy between measuring one parameter of the conjugate pair (the energy of the system, say) before the other (the time), compared hypothetically to doing it the other way round.
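
A numerical sketch of the Δk.Δx trade-off (Python with NumPy; the pulse widths and grid are arbitrary choices): for Gaussian pulses of various spatial widths, the r.m.s. width in x multiplied by the r.m.s. width in k stays close to the minimum of 1/2, of which Δk.Δx≥1 is the order-of-magnitude statement:

    import numpy as np

    x = np.linspace(-50, 50, 4096)
    dx = x[1] - x[0]

    for width in (0.5, 1.0, 4.0):
        psi = np.exp(-x**2 / (2 * width**2))            # Gaussian pulse of the chosen width

        # r.m.s. width in x, weighted by |psi|^2
        w = np.abs(psi) ** 2
        sigma_x = np.sqrt(np.sum(w * x**2) / np.sum(w))

        # spectrum, and r.m.s. width in angular wavenumber k
        psi_k = np.fft.fftshift(np.fft.fft(psi))
        k = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(x.size, d=dx))
        wk = np.abs(psi_k) ** 2
        sigma_k = np.sqrt(np.sum(wk * k**2) / np.sum(wk))

        print(f"width {width}: sigma_x * sigma_k = {sigma_x * sigma_k:.3f}")

The narrower the pulse in x, the wider its spread in k, and vice versa; the product never falls below the bound.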

We think of particles as being point-like and interacting locally, while waves have their information smeared out non-locally and holographically (at the very least throughout their wavelength). As we shrink the scale down from snooker balls to quantum particles, the distinction starts to blur, and any point-like particle must have a vague uncertainty to it. Even though, before and after a collision, the components follow the usual laws of motion, the bifurcation itself manifests as a discontinuity at the arbitrary moment, t0. However, this discontinuity cannot exist in practice. Moving the window of interest so that it straddles t0, there must be a smooth and continuous transition in all of the infinite set of derivatives (x(t), dx/dt, d^2x/dt^2, d^3x/dt^3, d^4x/dt^4, ad infinitum). The failure of the rigid-body approximation can account for this, through the speed-of-sound limitation within the material of the snooker ball (which cannot exceed 36 km/s (NS, 17-Oct-2020, p10)), or the flexing and hysteresis of the mechanical linkage of the components of the car. In a particle collider, the t0 event can be blurred through Heisenberg's uncertainty principle and quantum tunneling.

Information handler

Gödel's incompleteness theorem, or equivalent (NS, 14-Aug-2010, p34), leads to the Turing Halting Problem for computation, and the universal Turing machine was concocted as a means of illustrating what undecidability might look like in the Entscheidungsproblem.

The wave-particle duality has echoes of the distinction between continuous and discrete (quantised), and Heisenberg's uncertainty principle and Gödel's incompleteness theorem have a seductively similar feel to them. The single-shot pulse displays the lowest-level appearance of modularity (one side of the leading edge is part of the pulse, the other is not).

Heisenberg's uncertainty principle can be expressed in terms of information theory (NS, 23-Jun-2012, p8) with the momentum of the particle conveyed in one message stream, and its position in another, and noting that being able to decode both message streams would yield so much information that it would be tantamount to violating the second law of thermodynamics. Just as with Gödel's incompleteness theorem, quantum physics is unable to describe itself (NS, 23-Mar-2019, p28) for example in the measurement problem and an extension to the Wigner's friend thought experiment. Omnes (1990) notes that if the universe is to be treated as an information system, we must first establish its basis in logic; Heisenberg's uncertainty principle would then drop out from Gödel's incompleteness theorem (the logic that leads to a measurement of the momentum of a particle, contains no statements to describe its position, so any statements about its position can neither be proved nor disproved). The parameters in Bell's inequality are not normal commutative numbers (NS, 03-Nov-2007, p36) but perhaps should be handled as octonion and quaternion numbers (NS, 09-Nov-2002, p30) or as Dirac's non-commuting quantities (q-numbers) not all of which can be simultaneously number-valued (such as the eigenstate for an electron's position, and the one for its momentum). Fractals might also be used to explain how the opposing views of quantum mechanics and relativity might both be correct (NS, 28-Mar-2009, p37) and how some Gödel-like questions about the universe (such as, "what if the experiment had measured the momentum of the particle first instead of its position") might have no answer because they do not lie on the same fractal coastline of some sort of scale-relativity universe (NS, 10-Mar-2007, p30).

Beliefs or emotions handler

Superposition versus unknown position for matter becomes inconsistency versus incompleteness for deductive logic information systems. Believing both things versus not knowing what to believe.

Waves carry, and eventually disperse, energy. They also carry, and disperse, information (not least in the process of decoherence). Do they also carry, and disperse, some other content corresponding to BM or EM?

There are pairwise limitations in many subjects. Even for theories in philosophy, Berkeley notes that they cannot be both constant and consistent, and that they cannot be both completely informative and completely certain. Other examples include risk versus cost, and temperature gradient versus energy flow. This hints at a possible connection between Turing's halting problem and Heisenberg's uncertainty principle. It suggests how a more general form of the Turing halting problem might be proposed: such as the uncertainty of the execution time, balanced against the uncertainty in the computation information temperature. The computation information temperature could be that of the simulated annealing system, or the mutation jump distance of genetic algorithms. In the case of von Neumann architectures, the temperature is brought down close to zero, hence the enormous uncertainty in execution time caused by the halting problem, which is nevertheless still finite when asynchronous logic bit-flips are taken into account.

The way that the wave-function of a photon or an electron consists of a carrier wave (e^(ikx)) multiplied by an envelope (e^(-a.x^2)) raises one way of viewing how reality is structured, with multiple levels of modulation. In human communication, we have the RF carrier wave modulated by an AF tone, and the AF tone modulated by the irregular crystal structure of Morse or ASCII code (analysable in terms of Shannon entropy), which in turn is modulated by the structure of the text that is being sent (such as that of one of Shakespeare's plays). This might then be broken down into multiple layers of structure, meaning and artistic value. In all these examples, though, the climax (of a symphony, or of a play, for example) can only be appreciated by first experiencing the rest of the work. In the other direction, St. Augustine noted the relationship between hearing the current note of a phrase, within a tune, and the memory of the melody up to that point, and the anticipation of the patterns to come. This is related to the concept of Now, and the need to live the moments out one by one.

The need for Now

Relativity and quantum mechanics handle the concept of time in two different ways, but still neither has a satisfactory way of explaining what constitutes our feeling of the moment that we call Now. Art seems to be important to humans, and hints at having something to say on the issue (a finger that points at the word being read, the note being played, and the line of a computer program being executed). Indeed, science is very related to art, as Da Vinci and Galileo took for granted, and very much rooted in the way that the human mind does its thinking through symbols, monologue, and framing of stories (even a scientific paper in a journal is written in the introduction-body-conclusion form). Consider the distinction between the book Pride and Prejudice as a whole, and the current placement of the bookmark in my personal copy; or between the BubbleSort algorithm as a whole, and its current position in the sorting process.

A computer program, implemented in software, can be viewed as an emulation of real (albeit perhaps exponentially complex) hardware. As such, it has a black-box behaviour: we present it with inputs, and expect it to reply with outputs. We know there will be a delay between the inputs and the outputs: as well as the speed of light limitation between the two sets of ports, we do also allow for an execution time, but we do not intuitively allow for Turing's halting problem. Perhaps the execution time could be modelled as a half-life, with some algorithms halting earlier than expected, and others having infinitely longer lives than the half-life of a proton.
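
A hedged sketch of that picture (Python; the toy task and its success probability are invented): a Las-Vegas-style randomised search halts after a geometrically distributed number of steps, so its execution time has a well-defined half-life (the median), even though any individual run may take very much longer:

    import random
    import statistics

    def random_search(success_probability=0.02):
        # Keep guessing until a guess succeeds; return the number of steps taken.
        steps = 1
        while random.random() > success_probability:
            steps += 1
        return steps

    runtimes = [random_search() for _ in range(10_000)]
    half_life = statistics.median(runtimes)     # the time by which half the runs have halted
    print(f"half-life (median) of the execution time: {half_life:.0f} steps")
    print(f"longest observed run: {max(runtimes)} steps")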

The tutor's computer program to mark student coursework, the life of Elizabeth Bennet, a biography of Jane Austen, are each nameable objects, apparently packageable up, like a hologram, into a physical object (a CD-ROM, a deck of punched cards, a book, a DVD, a reel of cine film). The major lesson of Turing's halting problem is that the computer program to mark student coursework cannot be treated as a stand-alone object; it is necessary to state the specific data that are to be supplied, and to consider the internal workings of its execution (passing under the read-head of the program counter). By implication, then, neither can a story be treated in the abstract, as a static whole, in a book, DVD or stack of cine frames; it is necessary to consider the specific data that it will be immersed in, and to consider the dynamics of how the story is to be told (passing under the read-head of the page-turner, DVD player or cine projector). When those stories are not just biographies, but human lives, the read-head is the moment that we call Now, and is perhaps related to how we can only think consciously in a single sequential thread.

Just as the program counter of the imperative language instruction processor unit can be distributed in a lambda calculus execution unit (and even simply in out-of-order execution) the moment that we call Now has multiple, local instantiations. The rotation of our planet forces each one of us to run a "get up, get dressed, perform bodily functions, get undressed, go back to bed" algorithm; the orbit and tilt of our planet forces each one of us to run a "sort out winter wardrobe, sort out summer wardrobe" algorithm. We tend to be aware of how far we are along each algorithm, and be able to discuss it with those around us.

The block view of eternalism, as distinct from the instantaneous view of presentism (NS, 03-Jun-2017, p44), effectively considers time to be just like another spatial dimension, w, and hence as one of four dimensions (w,x,y,z) in a block. In effect, the block view imagines all instants in time to be entangled (involving convolution with the delta distribution, for example) while the instantaneous view imagines taking a measurement of the clock itself, thereby collapsing out the state of each frame of the cine film. We can imagine it being like a reel of celluloid movie film that has been cut up into its individual frames, and piled up in sequence. In this 3D-model, the frames represent just two spatial dimensions, x and z, with the position in the pile, w, used to represent time (with the third spatial dimension, y, implied by the use of perspective within the frame). Though it is really the paths of the constituent fundamental particles that need to be traced, as they are not only atomic but dimensionless, it is convenient to think in terms of the movements of macro objects. The movement of a sugar cube can be traced as it first enters into the scene in a sugar bowl, then is lifted and dropped into a cup of tea, and similarly for a human, as a sort of "worm", with the baby at one end, and the corpse at the other (NS, 02-Nov-2013, p34) whose thoughts would be manifest as positional relationships in the static structure (NS, 22-Nov-2008, p32) in an analogous way to structures within an oil painting.

Convergent processes, including program execution, dissipate information, as captured by Landauer's principle: that memory and computation must necessarily involve an increase in entropy, and hence are within the realm of the second law of thermodynamics, and the feeling of the flow of time. Symmetry-breaking divergent processes create information, but only ever make a fleeting appearance in our universe (the pencil, balanced on its point, ended up falling this way, or that way) before convergent processes take over again (once the pencil starts to fall, there is more reason for it to continue in the same direction). At the bifurcation decision points, new bits of information are required; running the collision backwards still results in the collision, but now convergent and unconditional. Emerging from a period where there is no Taylor expansion involves breaking symmetry, and the gaining of one bit of information, relative to the otherwise undisturbed path, to record which branch of the bifurcation was taken.
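
A back-of-the-envelope sketch of the Landauer bound (Python; the temperature and the number of bits are arbitrary example figures): erasing each bit dissipates at least kB.T.ln 2 of heat, which is what ties memory and computation to the second law:

    from math import log

    k_B = 1.380649e-23      # Boltzmann constant, J/K
    T = 300.0               # temperature, K (room temperature, chosen as an example)
    bits_erased = 1e9       # erasing a gigabit of memory (arbitrary example)

    energy_per_bit = k_B * T * log(2)
    print(f"Landauer limit per bit at {T:.0f} K: {energy_per_bit:.2e} J")
    print(f"minimum heat to erase {bits_erased:.0e} bits: {bits_erased * energy_per_bit:.2e} J")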

Useful as a tool of speculation

A matter handler (such as a chemical reaction vessel) is limited by Heisenberg's Uncertainty Principle. Neither the uncertainty of the momentum nor that of the distance can be zero. By implication, we can also predict that neither the momentum nor the distance can be zero, either.

Similarly, an energy handler (such as a heat engine) is limited by Heisenberg's Uncertainty Principle. Neither the uncertainty of the amount of energy nor that of the amount of time can be zero. By implication, neither the amount of energy nor the amount of time can be zero, either. This then implies the spontaneous presence of energy in the quantum vacuum (albeit 120 orders of magnitude out, in its prediction), as a result of Heisenberg's Uncertainty Principle, and hence of the second law of thermodynamics.

An entropy handler, too, is limited by Heisenberg's Uncertainty Principle. Neither the uncertainty of the entropy, nor that of the temperature, nor that of the amount of time can be zero. By implication, neither the entropy, nor the temperature, nor the amount of time can be zero, either. This then implies the third law of thermodynamics as a result of Heisenberg's Uncertainty Principle, and hence of the second law of thermodynamics.

An information handler (such as a computer) is limited by a generalisation of Turing's Halting Problem. Neither the uncertainty of computability nor that of the execution time can be zero. By implication, neither the computability nor the execution time can be zero, either. Turing machines are limited more by the uncertainty on the execution time, while evolutionary systems and perhaps quantum computers are able to find their way past the Turing Halting Problem by sacrificing determinism about the computability.

A deductive logic handler is limited by Gödel's Incompleteness Theorem. Neither the uncertainty of the completeness, nor that of the consistency, nor that of the scope can be zero. By implication, neither the completeness, nor the consistency, nor the scope can be zero, either.

It has been suggested that the long-sought theory of everything might simply come from suitably adapting the laws of thermodynamics (NS, 13-Oct-2012, p32). Indeed, the laws of thermodynamics are more like meta laws, talking about how to construct other physical laws, rather than about physical systems (NS, 17-Apr-2021, p34). A possible process for generating a Theory of Everything, or at least of unifying the various disciplines, one after another, might involve the following steps:

  1. Find a law that states a fundamental limit in the given subject
  2. Show how, if it did not hold, the system could be used to build a perpetual motion machine.

Heisenberg’s Uncertainty Principle, coupled with Noether, also suggests another guideline:

  1. Find how to frame the limitation law as a symmetry
  2. Fit it into a suitable set of Hamiltonian equations
  3. Infer the corresponding conservation law.

It can be shown that Heisenberg’s Uncertainty Principle can be reframed in information theory terms (NS, 23-Jun-2012, p8), with the momentum of the particle as one message stream, and its position as another, and then questions can be asked about decoding the two message streams (Maxwell's demon is presumably connected to this). It shows that if Heisenberg’s Uncertainty Principle were relaxed, then the second law of thermodynamics would be violated, and hence that Heisenberg’s Uncertainty Principle follows from the second law of thermodynamics. Consequently, if you did manage to find out an electron's position as well as its momentum, you would have so much information that you would be able to make a perpetual motion machine from it.

The ideal of a disembodied consciousness has parallels with thermodynamics, and the ideal case of a closed system. A perfectly closed system cannot exist, since we cannot look inside it to observe it (information cannot leak out, and the energy of our probe cannot enter in). Moreover, a perfectly reversible Carnot engine, even if it could exist, could only work if it took infinitely long to complete its cycle. Even so, the concept of the closed system is an important ideal, on which the theories of thermodynamics can be built. This also hints at parallels with Turing decidability, in the realm of computing engines. The concept of disembodied consciousness could still end up as the corner-stone of the theory behind consciousness.

Since Heisenberg's Uncertainty Principle has been demonstrated to result from the second law of thermodynamics, doing something similar for general relativity would then mean that a common link would have been found between it and quantum mechanics.

Special relativity follows from the Principle of Relativity (constant motion of the system cannot be determined by observations completely inside the system), and the Relativity of Simultaneity (simultaneous events in one context are not simultaneous in another).

General relativity follows from generalising the Principle of Relativity still further (inertial mass and gravitational mass are indistinguishable, and exactly equal; therefore, even though acceleration can be felt from within the system, since it does not constitute constant motion, it cannot be distinguished from a gravitational field just by observations completely inside the system).

The two component terms of the Turing halting problem ought to be conjugate, with one based on the other's integral. This also implies that the product of the two terms should have the units of action, probably involving a scaling by ln(2) (to convert bits to nats) and by Boltzmann's constant to bring in the correct units. This would probably require an adjustment to allow for the digital world to be modelled as if it had analogue behaviour. One intriguing idea is to take Floyd-Hoare's inductive assertions, or similar techniques, perhaps smoothed out over partial cycles, and to see if they can be used to cancel out internal movements of data during a cycle of an algorithm, in the same way that Carnot did for internal motions within a mechanical engine, and to see how this might tie in with Bennett's reversible computing (Sci. Amer., Jul-1985, p38).

Noether's theorem confirms that the two parameters in Heisenberg's uncertainty principle are connected (the conservation law of one follows as a consequence of the symmetry exhibited by the other) and hence that the two parameters are just two sides of the same coin, so are described by one shared set of information, not two. Heisenberg, Noether, Fourier and Bell all point in the same direction, that it is not just that we do not have access to all the information, but that the information simply does not exist in the first place (for example of a particle's position and of its momentum) and that our observations can only ever be probabilistic (NS, 14-Mar-2015, p28). A particle cannot be pin-pointed, at sub-atomic scales, in phase-space (x,y,z,ẋ,ẏ,ż) because the particle has only a blurred location in that space, giving us limited ability for dead-reckoning to extrapolate or to interpolate a particle's position. This happens whenever we try to find partial information about an entangled system, and perhaps indicates that quantum weirdness simply emerges from more logical central principles (NS, 11-Apr-2015, p34).

Power is the rate of change of energy in time, and, by the same token, force is the rate of change of energy in distance; but force is also rate of change of momentum in time.

The left-hand lattice has been arranged so that by starting at any given node, travelling southwest involves taking the d/dt, travelling southeast involves taking the d/dx (and vice versa with integrals for travelling northeast and northwest). Travelling horizontally, right to left, involves multiplying by dx/dt.

        
Left-hand lattice:

            Z
           / \
          L   mx
         / \ / \
        E   p   m
       / \ / \ / \
      P   F  m/t m/x

Right-hand lattice:

            f
           / \
          v   θ
         / \ / \
       vx   x   t
       / \ / \ / \
     V/t   A  xt  t²

The right-hand lattice is the mirror image of the left-hand one. P=power, E=energy, F=force, p=momentum, m=mass, L=angular momentum, θ=angle, mx=mass*distance (the conjugate pair for velocity), A=area (the pair for rate of change of mass), and Z=∫L.dt=∫m.x.dx (the pair for frequency).
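
A minimal sketch (Python) checking the lattices dimensionally, on the assumption (suggested by the bracketed remarks about mx, A and Z above) that each node of the left-hand lattice and its mirror-image node in the right-hand lattice form a conjugate pair whose product has the dimensions of action, M.L^2.T^-1:

    # Dimensions expressed as (mass, length, time) exponents.
    DIM = {
        # left-hand lattice
        "Z": (1, 2, 0), "L": (1, 2, -1), "mx": (1, 1, 0),
        "E": (1, 2, -2), "p": (1, 1, -1), "m": (1, 0, 0),
        "P": (1, 2, -3), "F": (1, 1, -2), "m/t": (1, 0, -1), "m/x": (1, -1, 0),
        # right-hand lattice
        "f": (0, 0, -1), "v": (0, 1, -1), "theta": (0, 0, 0),
        "vx": (0, 2, -1), "x": (0, 1, 0), "t": (0, 0, 1),
        "V/t": (0, 3, -1), "A": (0, 2, 0), "xt": (0, 1, 1), "t2": (0, 0, 2),
    }
    ACTION = (1, 2, -1)      # dimensions of action

    left  = [["Z"], ["L", "mx"], ["E", "p", "m"], ["P", "F", "m/t", "m/x"]]
    right = [["f"], ["v", "theta"], ["vx", "x", "t"], ["V/t", "A", "xt", "t2"]]

    for lrow, rrow in zip(left, right):
        for a, b in zip(lrow, reversed(rrow)):          # pair each node with its mirror image
            product = tuple(i + j for i, j in zip(DIM[a], DIM[b]))
            assert product == ACTION, (a, b, product)
            print(f"{a} * {b} has the dimensions of action")

Every mirrored pair (Z with f, mx with v, p with x, F with xt, and so on) does indeed multiply to the dimensions of action, which is consistent with the Heisenberg-style pairing being proposed.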

Having such a conjugate pair, this would then imply that neither of the terms can be reduced to zero, since that would be a value with zero uncertainty; but then we knew that already for execution time, and for the computation information temperature it probably simply implies a generalisation of the third law of thermodynamics. After that, we might expect the symmetry in one term to imply a conservation law in the other (Neuenschwander 2011). We would have to work out how this maps on to the notions of execution time and computation information temperature. Then, the way that one term is derived from the Fourier transform of the other would imply a wave interpretation of the system; we would have to work out what would be the implications of this.

Going back to Maxwell's equations and Kirchhoff's laws, the duality between the realm of voltage sources, impedances and electric fields, versus that of current sources, admittances and magnetic fields, suggests, perhaps, that the roles of symmetry and conserved quantities can likewise be interchanged, leading to a duality between symmetry and the conserved quantity. The overall conserved quantity appears, when viewed externally, to exhibit a symmetry, even though internally it contains a dynamic system.

Lastly, one idea that is of particular interest to the author would be to take an implementation of a distributed and asynchronous program counter (Shute 1983) and to consider what parallels might be drawn with our slippery notion of Now.

On to the next chapter or back to the table of contents.

© Malcolm Shute, Valley d'Aigues Research, 2006-2024