A chemical reaction is a matter handler, taking in matter (the reagents), processing it (the reaction itself), storing it in reservoirs, but also in the delay-line capacitance of the pipes (and barges on canals), and outputting the result (the reaction products).
A heat-engine or transducer is an energy handler, taking in energy via the fuel inlet, processing it (converting chemical energy to kinetic) perhaps as a two-stage process in the fire box and then the cylinder, storing it in the fly-wheel and boiler, and outputting the result via the drive shaft.
A computer or logic gate is a syntactic information handler, taking in syntactic information through the resistance buffering, processing it in a cross-coupled transistor pair, storing it in the parasitic capacitance of all the internal connections, and outputting the result via the amplifier chain. Input, output and storage handle the source and object codes while keeping the semantics unchanged; processing uses rewrite-rules (and pedagogical pruning of the message for the assembled audience).
So, by extension, we could imagine a semantics handler: taking in, processing, storing, and outputting the result. The input and output would be original and final allegorical or analogous stories, in which the semantics have been changed, while keeping the empathy unchanged.
All structure is temporary, and will eventually break up. A chair is in a flux of exchanging its surface molecules. Structured objects are just a convenient modularisation in the mind of the human observer. By this argument, even the proton has a half-life, albeit possibly longer than the lifetime of the universe. To be considered as a structure, though, there must be some persistence mechanism at work, like a potential energy well, pulling the components back together, until one day chaotic behaviour, or a freak event, sends one of them flying over the rim of the well. It is in this way that digital systems are chaotic systems, albeit ones whose components are only rarely allowed to go over the rim.
Each structure serves as a stable platform on which further structures can establish themselves (NS, 27-Apr-2019, p53), either applying restoring forces, or adding further structure to the already stable platform. Stable patterns bootstrapping up, built on previous stable platforms, is to bottom-up design as divide and conquer is to top-down design (NS, 10-May-2008, p52), but with the added notion of emergent behaviour. For the human observer, this is partly abstraction, to allow us to handle the dimension of increasing complexity (NS, 14-Feb-2009, p36). However, it also genuinely describes systems with different laws, a top-down causation (NS, 17-Aug-2013, p28). This is even apparent at the classical level. For reasons of abstraction, we give labels to classifiable objects (chairs, rivers, human beings) wherever there is a convergence, establishing and maintaining those groupings. A river is an identifiable object, whose positions and movements over several decades can be predicted, given a knowledge of the geology and geography of the region, but a detailed knowledge of the position and velocity of each and every one of the 10³⁸ (or so) water molecules over those decades would not improve the model. Although the generic physical and chemical properties of water molecules do need to be taken into account, we are used to the river itself being a stand-alone object that does not need to be reduced to a description of the positions and movements within the lower level. There truly is an emergent behaviour going in the bottom-up direction that thwarts any attempt at reductionism in the top-down direction. The process of reductionism fails whenever the component particles are in constant turn-over, substituted interchangeably or merged into and out of the background (through evaporation and condensation, for example). It can be noted that the human body, too, consists of a continuous turn-over, or flux, of atoms and molecules (NS, 12-Dec-2020, p36).
As a cue ball travels across a snooker table and strikes a glancing blow on a red ball (where, for example, the cue ball could represent a molecule of hot gas, and the red ball could represent an atom of a metal piston), the collision transfers some of its energy and momentum to the red ball. But if the cue ball undergoes such a transfer when striking the red ball, it must also be undergoing countless mini-collisions with the molecules of the baize and the surrounding air as it traverses the table (and similarly the red ball, once it too is in motion). Moreover, at the moment of the collision, there is a flurry of internal collisions between the neighbouring atoms within each ball, given that neither is a perfectly rigid body. This is followed by a divergent cascade of collision events between these newly excited molecules and their own next-nearest neighbours. Every one of these collisions is, in theory, perfectly reversible. However, in any attempt to reverse the whole process, we would naturally focus on reversing the main collision, that of the red and white ball, and getting the original energy back from that. It would be beyond our computational ability to arrange for each of the mini-collisions to be convergently reversed. So, instead, due to our own pragmatic laziness, we let them be substituted by a roughly equal number of new divergent forward collisions, thereby dissipating yet more of the balls' energies. As a result, the collision will not be perfectly reversed, and we will not get back all the energy that we originally put in.
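A minimal numerical sketch of that last point (Python; the masses, speeds and loss factors are invented illustrative values): if only the main collision is reversed, while the untracked mini-collisions are left to run forward as fresh losses, the original kinetic energy is not recovered.

```python
import random

def elastic_1d(m1, v1, m2, v2):
    """Velocities after a 1-D elastic collision (perfectly reversible in itself)."""
    u1 = ((m1 - m2) * v1 + 2 * m2 * v2) / (m1 + m2)
    u2 = ((m2 - m1) * v2 + 2 * m1 * v1) / (m1 + m2)
    return u1, u2

random.seed(1)
m_cue, m_red = 0.17, 0.17            # kg (roughly snooker-ball masses)
v_cue, v_red = 2.0, 0.0              # m/s before the main collision
ke_initial = 0.5 * m_cue * v_cue ** 2

# Forward: the main collision, then untracked mini-collisions with baize and air,
# modelled crudely as small random losses that we do not keep track of.
v_cue, v_red = elastic_1d(m_cue, v_cue, m_red, v_red)
v_cue *= 1 - random.uniform(0, 0.02)
v_red *= 1 - random.uniform(0, 0.02)

# "Reversal": we reverse only the main collision, and let fresh (forward)
# mini-collisions dissipate a little more energy instead of converging back.
v_cue, v_red = elastic_1d(m_cue, -v_cue, m_red, -v_red)
v_cue *= 1 - random.uniform(0, 0.02)
v_red *= 1 - random.uniform(0, 0.02)

ke_final = 0.5 * m_cue * v_cue ** 2 + 0.5 * m_red * v_red ** 2
print(f"kinetic energy recovered: {ke_final:.4f} J of the original {ke_initial:.4f} J")
```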
Even in an endothermic chemical reaction, when the two molecules collide, the newly-formed chemical bonds take up the kinetic energy, which makes that region of the substance appear cooler, so heat energy from the rest of the substance dissipates divergently, as normal, into that cooler region. So it is still divergence at work, not convergence (localised on a small region, not focused on it).
The second law of thermodynamics arises in the human mind as a consequence of the modular modelling process. When a module is modelled with bulk parameters, using a statistical approach, a degree of uncertainty due to the statistical averaging is inevitably involved. It makes a difference, to the observer, when the events are free to range anonymously over the objects, as in queuing theory, for example.
If each event could be tracked individually, the averaging would be replaced by deterministic certainty. Determinism requires the principle of causality to hold (that every effect has a prior cause). This is the principle of unitarity, which requires there to be a conservation law for information, and leads, amongst other things, to the black-hole firewall paradox. The measurement problem is another consequence: the quantum system is described exactly by the wave-equation, right up to the boundary point at which we consider the atoms to belong to the probing measurement device, and non-quantum, classical behaviour to start.
The process of decoherence can either be looked on as a steady spreading of the extent of the quantum system, to include more and more atoms in the wave-equations, or else as the dissipative leaking of energy and information out to atoms that are considered to be outside the quantum system. It amounts to the same thing, but the former view can be considered the realm of reactive spreading, with imaginary components in the exponential function, and the latter the realm of resistive dissipation, with real components in the exponential function. The latter view dismisses the dissipation as losses that are no longer tracked in the wave-equation, but are treated statistically as a bulk parameter. It is the realm where the second law of thermodynamics starts to have application; in the reactive view, it had no application. It can also be thought of as a change of viewpoint from science to engineering: from detailed tracking of each component, to considering regions as having bulk, black-box behaviour that can be used as components of further systems. Decoherence and energy dissipation appear to be the same mechanism, according to this snooker-ball view. Decoherence is just another mechanism for cooling, like conduction, convection and radiation, with a corresponding cooling law, and the idea of driving a Crooks' engine from the mechanism.
Whenever entanglement is increasing (noting that decoherence is entanglement with the individual constituents of the environment), entropy is increasing and, by the corollary of the second law of thermodynamics, energy necessarily flows. The constituents constantly mill around, redistributing positions and velocities, with no new configuration significantly different from any other, and each one, including the initial starting configuration, now long destroyed, unlikely ever to occur again at random. Decoherence is a divergent process, like heat dissipation, caused by information dissipating through entanglement events throughout the environment. Jostling and collisions and absolute zero (NS, 18-Mar-2017, p10).
What is really meant in quantum mechanics by the terms 'observation' and 'measurement' is "confined to an eigenstate", and is similar to the way that any sandy beach will passively interact with the in-coming waves, and cause them to break, with or without any conscious mind being present; or a tossed six-sided die will have its probabilities redistributed by encountering a table surface. The coherent part of the interaction is time reversible, and the incoherent part is irreversible; it is the latter that is considered to constitute an observation or measurement, and which results in the collapse of the wave-function. The extra term that objective collapse theorists add to the Schrödinger equation, to account for the random component of the collapse event, comes more naturally from the decoherence/back-action term (NS, 28-Mar-2020, p34), thereby involving the whole system and its immediate environment in the equation (the measurement probe device, of course, being made of quantum particles, too).
Eyes sense the amplitude of the incoming waves, but ignore the phase information; similarly ears, with the amplitude and phase information of sound waves; similarly, the process of wave-function collapse, using the Born rule to note the amplitude information, but discard the phase information.
It is interesting that the words "maintaining", "persisting", and "converging" keep coming up together, and could be the entry-point for some type of conservation law. For a semantics handler, it would be the dissipation of meaning or semantics that needs to be controlled, not allowing the notion of self to be diffused into all the surrounding objects that it encounters. Perhaps it is this that distinguishes conscious humans from other organisms. Cats, worms and sunflowers seem to be aware of the passage of time and the sensation of temperature, and hence implicitly to be aware of the second law of thermodynamics. However, none of them seem to worry about the meaning of it all; they just get on with the day-to-day process of sustaining life, worrying only about the present, and where the next nutritious input is going to come from (and how the next predator is to be avoided), with no concerns over what life is all about. Even our species has slotted into this bottom-up approach, with only the occasional philosopher nobleman able to afford the luxury of wondering about such matters, but still without coming back with any definitive answers. Just like the patchwork document might wonder what it has been created for, and how it should be shaping up to meet those needs (the problem, of course, being that its author has not worked that one out either). Moreover, none of them indulge in composing music. (However, it is interesting that in his ninth symphony, Beethoven felt he needed the words of language to make the points explicit, and that music was not sufficient on its own.)
Perhaps all program creation can be viewed as a type of sort process. Indeed, an engineering student in 1975 could consider maintaining a stock of commonly used punched cards, for editing program decks over the weekend, for submission before a deadline on Monday morning (though this never really worked in practice). Writing a computer program, therefore, can be imagined as a type of sort operation on an indefinitely large deck of punch cards, with the wanted cards sorted before the "STOP" and "END" partition, and the remainder left unsorted after it, like junk DNA.
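A toy sketch of that view (Python; the card identifiers are invented for illustration): "writing" the program amounts to partitioning an indefinitely large deck so that the wanted cards appear, in order, before the STOP/END partition, with everything else left unsorted after it.

```python
# A notional stock of punched cards; the names are hypothetical.
deck = ["PRINT_TOTALS", "READ_INPUT", "OLD_PAYROLL_V2", "LOOP_HEADER",
        "SCRAP_NOTES", "INIT_VARS", "FORMAT_OUTPUT", "UNUSED_SUBROUTINE"]

# The program we actually want, in execution order.
wanted = ["INIT_VARS", "READ_INPUT", "LOOP_HEADER", "PRINT_TOTALS", "FORMAT_OUTPUT"]

# "Writing" the program as a sort: wanted cards in order, then the partition,
# then the rest of the deck left unsorted after it, like junk DNA.
program = wanted + ["STOP", "END"] + [c for c in deck if c not in wanted]
print("\n".join(program))
```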
Many problems in engineering seem to be ones of judging the level of granularity, and the subjective input required to choose the cutoff point. A good example is the design of a Phase-Locked Loop. This is a simple electronic circuit that latches on to an incoming oscillating signal, despite any noise that is present. It includes a low-pass filter, to allow it to smooth over any periods of excessive noise. But who decides the cutoff frequency of that low-pass filter? And, therefore, who decides which part of the input is signal, and which part is noise? One man's noise is another man's Ph.D. Problems of judging between coarse-grain structure and fine-grain randomness, between signal and noise, between call-by-value and call-by-reference (including in modular document maintenance), therefore end up as manifestations of the Turing Halting Problem (and are not resolvable in a generic, magic-wand-waving sort of way). However, this emphasises that this manifestation of the Turing halting problem is a human, top-down designer artifact; natural evolution is usually less concerned by decisions to use call-by-reference or call-by-value.
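The subjectivity of that cutoff can be made concrete with a first-order low-pass filter of the kind used in such a loop (a minimal Python sketch; the sample rate, the 5 Hz tone and the noise level are all invented). Two engineers who choose different cutoff frequencies are literally disagreeing about which part of the same input is signal and which part is noise.

```python
import math, random

def low_pass(samples, cutoff_hz, sample_rate_hz):
    """First-order (single-pole) IIR low-pass filter."""
    dt = 1.0 / sample_rate_hz
    rc = 1.0 / (2 * math.pi * cutoff_hz)
    alpha = dt / (rc + dt)
    out, y = [], 0.0
    for x in samples:
        y += alpha * (x - y)      # exponential smoothing towards the input
        out.append(y)
    return out

random.seed(0)
fs = 1000.0                                            # samples per second
t = [n / fs for n in range(1000)]
tone = [math.sin(2 * math.pi * 5 * s) for s in t]      # the "wanted" 5 Hz signal
noisy = [s + random.gauss(0, 0.5) for s in tone]       # the same signal, plus noise

for cutoff in (2.0, 20.0, 200.0):                      # three opinions of where "noise" starts
    y = low_pass(noisy, cutoff, fs)
    err = math.sqrt(sum((a - b) ** 2 for a, b in zip(y, tone)) / len(tone))
    print(f"cutoff {cutoff:6.1f} Hz -> rms deviation from the 5 Hz tone: {err:.3f}")
```

Too low a cutoff smooths away the tone itself; too high a cutoff passes the noise through; neither choice is dictated by the mathematics alone.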
Latency, throughput and Kirchhoff have led to discussions of the dissipation of molecules, the dissipation of energy, the dissipation of information, and dissipation in a BM or EM.
Einstein's 1905 paper on Special Relativity showed that energy is mass, and mass is energy. However, matter is distinct from both of these. Indeed, energy is not a thing that one can point to; it is only a property, or measurable parameter, of a thing. Similarly, mass is just a property (a measurable parameter) of matter. Viewed from inside the object, we see components moving around with kinetic energy; viewed from outside the object (orbiting it in some notional Starship Enterprise, for example) we just perceive its total mass. A good example of this is the proton (and likewise the neutron). Only 1% of its rest mass is accounted for by totalling the rest masses of its component particles; the other 99% is how we perceive their energy.
Discussions on wave-particle handlers, information handlers, semantics handlers, positive-negative asymmetry, and ideas towards a theory of quantum gravity.
We think of particles as being pointlike and interacting locally, while waves have their information smeared out non-locally and holographically (at the very least throughout their wavelength). For quantum behaviour, there can be non-local phenomena (both spatially, and temporally, including retrospectively). As we shrink the scale down from snooker balls to quantum particles, the wave-particle distinction starts to blur, and any point-like particle must have a vague uncertainty to it.
The wave-particle duality has echoes of the distinction between continuous and discrete (quantised), and Heisenberg's uncertainty principle and Gödel's incompleteness theorem have a seductively similar feel to them. The single-shot pulse displays the lowest-level appearance of modularity (one side of the leading edge is part of the pulse, the other is not).
Fourier analysis gives Δk·Δx ≥ 1 as an inherent limitation, where k is the wavenumber (the spatial frequency), equal to 2π/λ: the sharper and more precise a pulse is in time, or laid out in space, the wider the band of frequencies needed to define its sharp edges; conversely, the more discerning a spectrum analyser is, the longer the sample needs to be to categorise the wave. Then, with just one result from quantum mechanics, E = hf, Heisenberg's uncertainty principle drops out: ΔE·Δt ≥ ℏ/2. Within quantum mechanics, this result conveys the discrepancy between measuring one parameter of the conjugate pair (the energy of the system, say) before the other (the time), compared hypothetically to having done it the other way round.
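A numerical sketch of that trade-off (Python with numpy; the pulse widths are chosen arbitrarily): as a Gaussian pulse is made sharper in space, its spectrum broadens, and the product of the two rms widths stays pinned near a constant (about ½ with these particular width definitions; the exact constant depends on how the widths are defined, but the inverse trade-off is the point).

```python
import numpy as np

def rms_width(axis, density):
    """Root-mean-square width of a distribution along the given axis."""
    p = density / density.sum()
    mean = (axis * p).sum()
    return np.sqrt(((axis - mean) ** 2 * p).sum())

N, L = 2 ** 14, 200.0                      # number of samples, spatial extent
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N) # wavenumber axis for the FFT

for sigma in (0.5, 1.0, 2.0, 5.0):         # progressively less sharp pulses
    pulse = np.exp(-x ** 2 / (2 * sigma ** 2))
    spectrum = np.abs(np.fft.fft(pulse))
    dx = rms_width(x, pulse ** 2)
    dk = rms_width(np.fft.fftshift(k), np.fft.fftshift(spectrum) ** 2)
    print(f"sigma={sigma:4.1f}  dx={dx:.3f}  dk={dk:.3f}  dx*dk={dx*dk:.3f}")
```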
Gödel's incompleteness theorem, or equivalent (NS, 14-Aug-2010, p34), leads to the Turing Halting Problem for computation. The universal Turing machine was constructed as a means of illustrating what undecidability might look like in the Entscheidungsproblem.
Heisenberg's uncertainty principle can be expressed in terms of information theory (NS, 23-Jun-2012, p8) with the momentum of the particle conveyed in one message stream, and its position in another, and noting that being able to decode both message streams would yield so much information that it would be tantamount to violating the second law of thermodynamics.
Just as with Gödel's incompleteness theorem, quantum physics is unable to describe itself (NS, 23-Mar-2019, p28) for example in the measurement problem and an extension to the Wigner's friend thought experiment.
Omnes (1990) notes that if the universe is to be treated as an information system, we must first establish its basis in logic; Heisenberg's uncertainty principle would then drop out from Gödel's incompleteness theorem (the logic that leads to a measurement of the momentum of a particle, contains no statements to describe its position, so any statements about its position can neither be proved nor disproved).
The parameters in Bell's inequality are not normal commutative numbers (NS, 03-Nov-2007, p36) but should be handled as operators, or as octonion and quaternion numbers (NS, 09-Nov-2002, p30) or as Dirac's non-commuting quantities (q-numbers) not all of which can be simultaneously number-valued (such as the eigenstate for an electron's position, and the one for its momentum).
Fractals might also be used to explain how the opposing views of quantum mechanics and relativity might both be correct (NS, 28-Mar-2009, p37) and how some Gödel-like questions about the universe (such as, "what if the experiment had measured the momentum of the particle first instead of its position") might have no answer because they do not lie on the same fractal coastline of some sort of scale-relativity universe (NS, 10-Mar-2007, p30).
Superposition versus unknown position for matter becomes inconsistency versus incompleteness for deductive logic information systems. Believing both things versus not knowing what to believe: being in two minds, or having no opinion, on the given subject. Like a wave-particle that is in a superposition of two (or more) states, and has none of these states defined between the end points of its trajectory.
Waves carry, and eventually disperse, energy. They also carry, and disperse, information (not least in the process of decoherence). Do they also carry, and disperse, some other content corresponding to BM or EM, such as semantics?
signal ->- noise ->- playhead ->- on to other receivers
signal ->- noise -><- read/write head of FSM -><- on to other receivers
signal ->- noise -><- read/write head of FSM1 -><- read/write head of FSM2 -><- read/write head of FSM3 -><- on to other receivers
Can this be adapted to represent the blackboard model?
The crank-shaft of an internal combustion engine is there to carry power; the cam-shaft is more like the mechanics of a clock or analytical engine, and is there to carry information (this needs to be merged with the discussion in chapter 0). It has, though, been expressly designed to carry meaningful information (semantics): not ad hoc information, but grounded information that is self-consistent with the running of the machine. A machine requires there to be a hot bath (a region of abundant energy), a cold bath (a region of dearth), and a channel through the barrier that separates them, with a mechanism analogous to a water-wheel placed in the channel. In an exothermic chemical reaction, the barrier is an energy threshold, and the channel might be a catalyst. The apparatus of a man-made machine, such as a cylinder and piston arrangement, can therefore be viewed, too, as a type of catalyst (NS, 24-May-2014, p30). The convergent constraints imposed can be quantified as information, measured by Shannon entropy, and are what distinguish work from energy (NS, 12-Aug-2017, p40). In this way, information, too, is what makes the difference between living and non-living matter (NS, 02-Feb-2019, p28), with feedback loops establishing themselves between the software domain of Shannon entropy and the hardware world of molecules and thermodynamics.
There are pairwise limitations, similar to those in the Heisenberg uncertainty principle, in many subjects. Even for theories in philosophy, Berkeley notes that they cannot be both constant and consistent, and that they cannot be both completely informative and completely certain. Other examples include risk versus cost, and temperature gradient versus energy flow. This hints at a possible connection between Turing's halting problem and Heisenberg's uncertainty principle. It suggests how a more general form of the Turing halting problem might be proposed: such as the uncertainty of the execution time, balanced against the uncertainty in the computation-information temperature. The computation-information temperature could be that of a simulated annealing system, or the mutation jump distance of genetic algorithms. In the case of von Neumann architectures, the temperature, though still finite when asynchronous logic bit-flips are taken into account, is brought down close to zero, hence the enormous uncertainty in execution time caused by the halting problem.
Moreover, there are many parameters in physics that are unable to take on negative values: the magnitude of a vector; the dot-product of a vector with itself, as in ½m·(v·v); the amount of energy or mass; the frequency of a wave; the pressure, volume, temperature, number of moles of substance, amount of substance being monitored in a clock (water, sand, cord supporting a weight); overall entropy change; and time. In the case of special relativity, it is the √(c²−v²) denominator of the Lorentz term that cannot approach zero (though negative values are not obviously forbidden). These are constrained to being positive quantities, but position and momentum in the three spatial axes can take on negative values (or positive values but pointing in the opposite direction), relative to some arbitrary origin.
This is one source of non-commutativity, or at least practical asymmetry: adding fuel to a near-empty fuel tank just before making a journey is arithmetically the same as adding the same amount of fuel immediately after the journey, except for the fact that the journey could not be continued with negative levels of fuel in the tank. Non-commutativity also arises when there are side-effects involved. Asymmetry in some operators can also lead to non-commutativity in the sequential quadrant rotations of an object about its x, y and z axes. There is also the asymmetry of time dilation in general relativity, as opposed to the symmetrical view taken of it in special relativity and Newtonian physics. Another example is given by the second law of thermodynamics. The laws of cooling end up with a built-in double negative to impede energy flowing from cool bodies towards hotter ones.
Parameters that cannot take on negative values cannot approach zero, either. When a parameter has no valid meaning defined for negative values, it suggests that the parameter should be viewed logarithmically, ranging from 10⁻ᵖ to 10⁺ᵖ. Indeed, each report of a new world record in temperatures achieved in the laboratory (highest or lowest) is reported using such 10⁺ᵖ K or 10⁻ᵖ K notation. When a given parameter cannot take on a negative value, any means that we take must be geometric means, not arithmetic ones (that is, they are arithmetic means of the logarithmic parameter). One possible implication is that our linear scale for measuring time is arbitrary, too, and merely a human invention, and should be measured on a logarithmic scale, just as we are happy to do when measuring pH. Indeed, cosmologists tend to be interested in the positive time before 10⁻⁴³ s, and that after 10⁺⁶⁰ s. Negative values of linear time would then be just as imaginary as a point that is north of the north pole. If this were the case, though, our physics experiments and Noether's theorem should have shown it up. However, in all our real-world values, t is just a minuscule deviation either side of the 13.8-billion-year base in the Taylor expansion ln(10⁶⁰)+ln(1+t), which perhaps explains why we think of the time dimension as looking just about linear.
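As a small illustration (Python; the sample temperatures are invented values spanning many orders of magnitude), the geometric mean of a strictly positive parameter is just the arithmetic mean taken on the logarithmic scale, and it is the average that behaves sensibly for such quantities.

```python
import math

# Illustrative laboratory temperature records, in kelvin (invented values).
temperatures = [5e-10, 3e-7, 2.7, 3e2, 1.5e7, 2e12]

arithmetic = sum(temperatures) / len(temperatures)
geometric = math.exp(sum(math.log(t) for t in temperatures) / len(temperatures))

print(f"arithmetic mean: {arithmetic:.3e} K  (dominated by the single largest value)")
print(f"geometric mean:  {geometric:.3e} K  (arithmetic mean of ln T, then exponentiated)")
```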
There is also a built in bias in our definitions of those parameters in physics. We think of acceleration as being due to an applied force, but this means that where there is an acceleration (due to time dilation for example) it automatically appears to us to be due to a force (gravity in this case). Similarly, a de facto expansion of space-time appears to us as a velocity, and its acceleration as being rooted in a dark energy, and as noted above, regions of increasing entropy must therefore imply an energy flow.
Having chosen to define temperature on a linear scale, the Carnot efficiency of a machine drops out as (T_hot − T_cool)/(T_hot − T_abszero). This can be read as the ratio of the energy that we managed to extract, divided by the total energy that we might have aspired to extracting. It is notable that both quantum physics and general relativity agree that T_abszero is unattainable: the particles inside the helium balloon cannot have an internal velocity that drops to zero, since that would involve all of them flying along together at precisely the same external velocity, like a squadron of 6×10²³ fighter jets. Quantum physics does not allow the uncertainty of their momenta to drop exactly to zero, and general relativity does not allow mass-bearing particles to travel along exactly parallel lines in each others' proximity. It is tempting, therefore, to wonder if, in a future quantum theory of gravity, these two observations could turn out to be just two sides of the same phenomenon. Indeed, Penrose notes how the uncertainty of the position of a massive particle translates to an uncertainty on the curvature of space-time (NS, 09-Mar-2002, p26), and that it is perhaps this non-linearity that makes superposition in our large-scale quantum devices difficult to sustain.
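Reading that expression numerically (a minimal Python sketch; the bath temperatures are arbitrary example values):

```python
def carnot_efficiency(t_hot, t_cool, t_abszero=0.0):
    """(T_hot - T_cool) / (T_hot - T_abszero): energy extracted over energy aspired to."""
    return (t_hot - t_cool) / (t_hot - t_abszero)

# Example baths (kelvin): a boiler against a cool river, and the same boiler
# against an (unattainable) bath at absolute zero.
print(carnot_efficiency(t_hot=500.0, t_cool=300.0))   # 0.4
print(carnot_efficiency(t_hot=500.0, t_cool=0.0))     # 1.0, only if T_cool could reach 0 K
```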
Moreover, the expression for inertial mass in relativity, E/c = √((m·c)² + p²), also has parallels with the uncertainty principle in quantum mechanics (either the rest mass or the momentum can be zero, but not both; or, at least, if they are both zero, the object has no effect in the universe, and cannot be considered to exist). This usually manifests from the opposite direction: if a particle has rest mass, it cannot be accelerated to the asymptotic limit of the speed of light in a vacuum, but if that particle has zero rest mass, it can only travel at the vacuum speed of light (else it would have no detectable effect on the universe, and hence not exist).
For quantum mechanics, the product of the uncertainties of the two conjugate parameters, Δp and Δq, cannot be less than a given value, though either one can be squeezed arbitrarily close to zero (NS, 30-Apr-2011, p28). As a consequence, neither absolute parameter, p or q, can be exactly zero, either; this can be expressed as p·q − q·p = −iℏ. For relativity, by contrast, the constraint applies directly to the values of the parameters themselves, m and p.
Multiplication is akin to the Boolean And (as exploited in the C programming language).
For the classical view, E = PV = nRT means that molecular mass cannot be negative. For Special Relativity, E² = (mc²)² + (pc)² means that inertial mass cannot be negative. For Newtonian gravity (and later for General Relativity), F = −G·m₁·m₂/r² (or a more generalised form), and so any negative gravitational masses in our vicinity would have been repelled away long ago. For quantum mechanics, there can be no negative masses, since the vacuum would then be unstable, and able to decay to particles with negative mass.
Principles of uncertainty and incompleteness have led to discussions of wave-particle handlers, information handlers, semantics handlers, positive-negative asymmetry, and ideas towards a theory of quantum gravity.
The way that the wave-function of a photon or an electron consists of a carrier wave (e^(ikx)) multiplied by an envelope (e^(−a·x²)) raises one way of viewing how reality is structured, with multiple levels of modulation. In human communication, we have the RF carrier wave modulated by an AF tone, and the AF tone modulated by the irregular crystal structure of Morse or ASCII code (analysable in terms of Shannon entropy), which in turn is modulated by the structure of the text that is being sent (such as that of one of Shakespeare's plays). This might then be broken down into multiple layers of structure, meaning and artistic value. In all these examples, though, the climax (of a symphony, or of a play, for example) can only be appreciated by first experiencing the rest of the work. In the other direction, St. Augustine noted the relationship between hearing the current note of a phrase within a tune, the memory of the melody up to that point, and the anticipation of the patterns to come. This is related to the concept of Now, and the need to live the moments out one by one. (Life is a mystery to be lived, not a problem to be resolved.)
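A sketch of the lowest level of that layering (Python with numpy; the wavenumber k and envelope sharpness a are chosen arbitrarily): a wave packet built as a carrier e^(ikx) multiplied by a Gaussian envelope e^(−a·x²), with the envelope carrying the "where" and the carrier the "what frequency".

```python
import numpy as np

x = np.linspace(-10.0, 10.0, 2001)
k, a = 5.0, 0.1                       # carrier wavenumber and envelope sharpness (arbitrary)

carrier = np.exp(1j * k * x)          # e^(ikx): the underlying oscillation
envelope = np.exp(-a * x ** 2)        # e^(-a x^2): the modulation that localises it
packet = carrier * envelope

print("peak |psi| at x =", x[np.argmax(np.abs(packet))])   # the envelope fixes the location
print("local wavelength =", 2 * np.pi / k)                 # the carrier fixes the oscillation
```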
Dancing round the Maypole, Q&A phrases, and the part played by the repeat and DC signs. Scene, and unexpected decisions.
Relativity and quantum mechanics handle the concept of time in two different ways, but still neither has a satisfactory way of explaining what constitutes our feeling of the moment that we call Now. Art seems to be important to humans, and hints at having something to say on the issue (a finger that points at the word being read, the note being played, an area in the painting that is being focused on, and the line of a computer program being executed). Indeed, science is very related to art, as Da Vinci and Galileo took for granted, and very much rooted in the way that the human mind does its thinking through symbols, monologue, and framing of stories (even a scientific paper in a journal is written in the introduction-body-conclusion form). Our dialogues with other human beings, even those sent off to explore the Moon, all take place less than a light-second away; so, we are used to Now being wafer-thin. (However, this would break down if we attempted to embark on a dialogue with a being located in the Andromeda galaxy.) The moment of Now can be considered to be the resolution of the simultaneous equations of "You are here" for the participating parties.
The effect of parallax gives us our sense of depth (whether by long base-line movement, binocular difference, or differential focus across the radius of a lens). When one of the dimensions is the time axis, the effect is experienced differently. When Alice and Bob synchronise their watches, those watches become parts of a single system. Alice travels to her experiment, performs the experiment, and sends the results (by one means or another) back to Bob. Bob perceives the differences in the time-line taken by the two watches, and that they subtend a different angle to Bob's gaze. He might notice how much less Alice has aged (or conversely how much more she has matured) in arriving at the mutual meeting point for the results (in this case comparing biological clocks rather than wrist-watches).
The distinction between the book "Pride and Prejudice" as a whole, and the current placement of the bookmark in my personal copy; or the BubbleSort algorithm versus its current position in the sorting process. A computer program, implemented in software, can be viewed as an emulation of real (albeit perhaps exponentially complex) hardware. As such, it has a black-box behaviour: we present it with inputs, and expect it to reply with outputs. We know there will be a delay between the inputs and the outputs: as well as the speed of light limitation between the two sets of ports, we do also allow for an execution time, but we do not intuitively allow for Turing's halting problem. Perhaps the execution time could be modelled as a half-life, with some algorithms halting earlier than expected, and others having infinitely longer lives than the half-life of a proton.
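The book/bookmark distinction can be made concrete by writing BubbleSort as a generator (a minimal Python sketch; the input list is arbitrary): the generator function is the whole "book", while the suspended generator object is the bookmark, the current position in the telling.

```python
def bubble_sort_steps(items):
    """BubbleSort, yielding after every comparison: the algorithm plus its bookmark."""
    data = list(items)
    for end in range(len(data) - 1, 0, -1):
        for i in range(end):
            if data[i] > data[i + 1]:
                data[i], data[i + 1] = data[i + 1], data[i]
            yield i, list(data)            # the current "page" being read

run = bubble_sort_steps([5, 1, 4, 2, 8])   # the whole book, not yet opened
for position, snapshot in run:             # advancing the bookmark, one moment at a time
    print(position, snapshot)
```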
The tutor's computer program to mark student coursework, the life of Elizabeth Bennet, a biography of Jane Austen, are each nameable objects, apparently packageable up, like a hologram, into a physical object (a CD-ROM, a deck of punched cards, a book, a DVD, a reel of cine film). The major lesson of Turing's halting problem is that the computer program to mark student coursework cannot be treated as a stand-alone object; it is necessary to state the specific data that are to be supplied, and to consider the internal workings of its execution (passing under the read-head of the program counter). By implication, then, neither can a story be treated in the abstract, as a static whole, in a book, DVD or stack of cine frames; it is necessary to consider the specific data that it will be immersed in, and to consider the dynamics of how the story is to be told (passing under the read-head of the page-turner, DVD player or cine projector). When those stories are not just biographies, but human lives, the read-head is the moment that we call Now, and is perhaps related to how we can only think consciously in a single sequential thread.
Errors in a table for a difference engine manifest as an ever-growing triangle-shape of further errors, extending off to an infinite number of the later columns, and building diagonally both to future rows and retrospectively to past rows.
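A sketch of that triangle (Python; the tabulated polynomial and the injected error are arbitrary): a single wrong entry in the value column contaminates an ever-wider band of neighbouring rows in each successive difference column.

```python
def difference_table(values, order):
    """Forward-difference columns, as a difference engine would form them."""
    cols = [list(values)]
    for _ in range(order):
        prev = cols[-1]
        cols.append([prev[i + 1] - prev[i] for i in range(len(prev) - 1)])
    return cols

clean = [n ** 3 for n in range(10)]     # an error-free table of cubes
faulty = list(clean)
faulty[5] += 1                          # a single mis-punched entry

for order, (good, bad) in enumerate(zip(difference_table(clean, 4),
                                        difference_table(faulty, 4))):
    wrong_rows = [i for i, (g, b) in enumerate(zip(good, bad)) if g != b]
    print(f"difference column {order}: rows in error {wrong_rows}")
```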
Reading the landscape, when looking out of the window on a long journey, with the potential to stop off and visit any of the points of interest on the way, is akin to reading a book and choosing to follow up on each of the footnotes; or to reading up on the subject of a Wikipedia article, and clicking on those hyperlinks that the reader considers interesting. Perhaps there are parallels with a conversation with a newly introduced acquaintance. The Apollo-11 landing site does not have a position in a traditional earth-based (longitude, latitude, altitude) coordinate system, and Alice and Bob trace out different time-lines when they enact one of the twin paradoxes of relativity.
Just as the program counter of an imperative-language instruction processor can be distributed in a lambda calculus execution unit (and even in conventional out-of-order execution), the moment that we call Now has multiple, local instantiations. The rotation of our planet forces each one of us to run a "get up, get dressed, perform bodily functions, get undressed, go back to bed" algorithm; the orbit and tilt of our planet forces each one of us to run a "sort out winter equipment, sort out summer equipment" algorithm. We tend to be aware of how far we are along the time-line of each algorithm, and to be able to meet up and discuss it with those around us.
The block view of eternalism, as distinct from the instantaneous view of presentism (NS, 03-Jun-2017, p44), effectively considers time to be just like another spatial dimension, w, and hence as one of four dimensions (w,x,y,z) in a block. In effect, the block view imagines all instants in time to be entangled (involving convolution with the delta distribution, for example) while the instantaneous view imagines taking a measurement of the clock itself, thereby collapsing out the state of each frame of the cine film. We can imagine it being like a reel of celluloid movie film that has been cut up into its individual frames, and piled up in sequence. In this 3D-model, the frames represent just two spatial dimensions, x and z, with the position in the pile, w, used to represent time (with the third spatial dimension, y, implied by the use of perspective within the frame, the residual effect of parallax). Though it is really the paths of the constituent fundamental particles that need to be traced, as they are not only atomic but dimensionless, it is convenient to think in terms of the movements of macro objects. The movement of a sugar cube can be traced as it first enters into the scene in a sugar bowl, then is lifted and dropped into a cup of tea, and similarly for a human, as a sort of "worm", with the baby at one end, and the corpse at the other (NS, 02-Nov-2013, p34) whose thoughts would be manifest as positional relationships in the static structure (NS, 22-Nov-2008, p32) in an analogous way to structures within an oil painting.
Convergent processes, including program execution, dissipate information, as captured by Landauer's principle, that memory and computation must necessarily involve an increase in entropy, and hence are within the realm of the second law of thermodynamics, and the feeling of the flow of time. Symmetry-breaking divergent processes create information, but only ever make a fleeting appearance in our universe (the pencil, balanced on its point, ended up falling this way, or that way) before convergent processes take over again (once the pencil starts to fall, there is ever more reason for it to continue in the same direction). At the bifurcation decision points, new bits of information are required; running the collision backwards still results in the collision, but now convergent and unconditional. Emerging from a period where there is no Taylor expansion involves breaking symmetry, and the gaining of one bit of information, relative to the otherwise undisturbed path, to record which branch of the bifurcation was taken.
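Landauer's bound can be put in numbers (a minimal Python sketch; room temperature is assumed, and the bound strictly concerns the erasure of a bit of information into the environment): each bit erased must dissipate at least kT·ln 2 of energy.

```python
import math

k_B = 1.380649e-23            # Boltzmann constant, J/K
T = 300.0                     # assumed room temperature, K

per_bit = k_B * T * math.log(2)     # Landauer limit for erasing one bit
per_gigabyte = per_bit * 8e9        # erasing one gigabyte (8e9 bits)

print(f"minimum energy to erase one bit at {T:.0f} K: {per_bit:.2e} J")
print(f"minimum energy to erase one gigabyte:         {per_gigabyte:.2e} J")
```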
The need for Now has led to discussion of the implications for a block universe.
The corner-stones of the theory of operation of the previous machine classes have been principles of limitation: the Turing halting problem for computers, Heisenberg's uncertainty principle for mechanics. That each machine class is constructed by building machines upon machines, and by contemplating cycles within cycles, bodes well that something similar might be done for the next machine class. This chapter started by considering latency, throughput and Kirchhoff, including the dissipation of matter (the flux of molecules in and out of an object), the dissipation of energy (appearing as the second law of thermodynamics), and the dissipation of information (appearing as decoherence and the measurement problem). The principles of uncertainty and incompleteness can be generalised. Human consciousness seems to have a need for the notion of Now, in the same way that an imperative-language computer program needs a program counter.
The overall value of having considered all this is reflected on in the next chapter, in the context of the original table of contents.