

Quantum gravity, time and underlying objective reality

This page originally started out as an annotated index of articles in past issues of the New Scientist magazine (NS) in and around the subject of the search for a unifying theory of quantum gravity. Since then, it has developed into telling a more coherent story, with additional references to books in popular science, phrases from replies on the Quora website, and review comments. It is a living document, forever in a transient state of work-in-progress, and it is in this way that it remains grounded in experimental and observational input. It is, however, something of a patchwork, with each fragment summarising an experimental result or conjecture reported in the popular science press, roughly stitched together with connecting text that draws out common themes and produces a "whole that is greater than the sum of the parts" (albeit sometimes confounded by inconsistencies and incompletenesses). So, though it inevitably comes to no final conclusions, it is still able to point out possible new directions for investigation, testable by experiment.

Although not a formal academic source of experimental input, the New Scientist magazine is a particularly appropriate one. As a weekly popular science magazine, it provides an overview across a wide spectrum of recent developments in science and engineering; in any case, the references can be followed up, later, in the formal academic literature. Moreover, it does not much matter if some (or much) of this patching together of what journalists have reported turns out to be wrong; one lesson that the Newton-Raphson method teaches us is that, though a bad initial guess is obviously worse than a good initial guess, it is still infinitely better than having made no guess at all, and it breaks the initial symmetry.

Contents:

Approaches to a theory of quantum gravity

Maxwell demonstrated that electric and magnetic forces are manifestations of the same phenomenon. The weak nuclear force was subsequently shown to be unifiable with this electromagnetic force (NS, 30-Mar-2002, p28) and, later still, the strong nuclear force was shown to be unifiable with the resulting electroweak force. The next step, therefore, is to show how the gravitational force can also be unified with quantum chromodynamics (NS, 01-Feb-2020, p35). There is, however, no guarantee that any such description exists, or that it exists in a form that our primate brains can grasp (NS, 03-Mar-2018, p30).

Gravity is a tensor rank-2 force, leading to a "like charges attract" rule; so, since mass cannot take on negative values, the force causes matter to clump together additively. This is as opposed to the electromagnetic force, which is a tensor rank-1 force: unlike charges attract, and appear neutralised when viewed from a distance. The gravitational field is non-linear, since it interacts with itself, making it very difficult to quantise in the same way as the other fields. Matters are made more difficult by gravity being such a weak force (NS, 22-Sep-2007), with experiments only just sensitive enough to measure the attraction of a 90 mg gold ball (NS, 20-Mar-2021, p21). There is the prospect of one day measuring the gravitational field of a powerful laser beam (NS, 06-Oct-2018, p20). Experiments are planned to test the Schwinger limit (about 10^30 W/cm²) using petawatt lasers (NS, 26-Jan-2019, p40). (As an aside, there is even a suggestion of how this could be used, in the distant future, as a means of communication (NS, 02-Feb-2019, p10).)
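
As a rough consistency check (my own back-of-envelope calculation, not from the article) that the quoted Schwinger limit really is of order 10^30 W/cm², the critical field E_S = m_e^2.c^3/(e.ℏ) and the corresponding beam intensity can be computed directly:

import math

# Back-of-envelope check (assuming standard CODATA constants) of the Schwinger limit
# quoted above; it should come out at around 10^29 to 10^30 W/cm^2.
hbar = 1.054571817e-34    # J.s
c    = 2.99792458e8       # m/s
m_e  = 9.1093837015e-31   # kg
e    = 1.602176634e-19    # C
eps0 = 8.8541878128e-12   # F/m

E_S = m_e**2 * c**3 / (e * hbar)   # Schwinger critical field, ~1.3e18 V/m
I   = 0.5 * eps0 * c * E_S**2      # corresponding intensity, in W/m^2
print(f"Schwinger field: {E_S:.2e} V/m, intensity: {I * 1e-4:.2e} W/cm^2")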

This chapter contains a number of sections, with a number of subsections distributed amongst them. In reality, the subsections do not exclusively belong to the section in which they have been placed, but are merely ordered that way for the convenience of the flow of the overall description.

Strings and branes

With string theory, the extra dimensions are assumed to be stunted, or curled up, into less than a Planck length. With Braneworld, they are assumed to be fully-fledged, amongst which our 3+1 dimensions form just a membrane (NS, 29-Sep-2001, p26). M-theory is an attempt at unifying the varieties of string theory, of which there are many (NS, 19-Apr-2014, p47; NS, 28-Sep-2013, p34), perhaps (though looking less likely, now) complete with the super-symmetry of SUSY (NS, 14-Nov-2009, p36).

The first step when encountering a new phenomenon is to start taking measurements; the second is to look for underlying relationships between the parameters; and the third is to propose physical mechanisms that might generate those relationships. Despite a century of intense study, quantum mechanics interpretations appear to be still stuck at the second stage, with phenomena characterised by probability functions, extremely successfully, but without any convincing explanation as to the origin of the values (NS, 28-Jul-2012, p28).

Wave-functions are either ontological, a manifestation of some underlying mechanism, or they are epistemic, merely human tools or abstractions that guide us to the correct answers (NS, 01-Apr-2017, p41). There is, indeed, a bewildering zoo of possible interpretations (NS, 28-Aug-2021, p40; NS, 22-Jan-2011, p30) many of which are mutually conflicting (NS, 14-Nov-2015, p14) and some experiments have already ruled out many classes of the epistemic view (NS, 07-Feb-2015, p14). The Quantum Bayesianism (QBism) view is that the wave-function is merely a summary, constructed by the human observer, of all the observer's knowledge of the system, and hence is just in the observer's mind and not a property of the quantum particle itself (NS, 10-May-2014, p32).

Wave-function collapse interpretations

The most obvious weak points in the Copenhagen interpretation are our lack of explanation for why the Born rule works (NS, 05-Nov-2016, p8) and the need to use renormalisation to remove the infinities. There is a temptation to wonder if the non-intuitive results of quantum mechanics could be telling us that something is wrong with our understanding, like an experimental version of the Reductio ad Absurdum proof in mathematics, and that one (or more) of our axioms is wrong (such as whether particles exist). Fields exist, and it is just our approximation of these (discounting the less likely paths through a Feynman path integral) that allows us to do calculations on convenient modules that we call "particles".

Fields (and hence, particles) are considered to be in states, and it is states that can be entangled (an entangled state being represented by a superposition of tensor products of the two kets). Wave-function collapse is the conversion of an entangled superposition of several eigenstates to a single eigenstate: one of the probabilities increases to one, and all the others drop to zero. Objective collapse interpretations treat wave-function collapse as a physical process, one that occurs on its own, with no observer required, but that is non-deterministic. Spontaneous wave-function collapse is an even stronger claim (NS, 28-Aug-2021, p36; NS, 16-Jul-2016, p30), occurring unprovoked, perhaps with some characteristic half-life. Spontaneous collapse was first proposed by Pearle, Ghirardi, Rimini and Weber in the 1970s, as a bridge between quantum behaviour and relativity, and was made compatible with general relativity by Bedingham and Tumulka (NS, 14-Apr-2018, p34). A similar aim is sought using Continuous Spontaneous Localisation (CSL) (NS, 29-Oct-2011, p8).
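
A minimal numerical sketch (my own illustration, not drawn from the text) of the ideas above: a Bell state built from tensor products of kets, the Born-rule probabilities for measuring the first qubit, and the 'collapse' to a single renormalised branch:

import numpy as np

# Two qubits in the Bell state (|00> + |11>)/sqrt(2): an entangled superposition that
# cannot be written as a single product of one ket for each qubit.
ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])
psi = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)

# Born rule: the probability of finding qubit A in |0> or |1> is the squared norm of
# the corresponding half of the state vector.
p0 = np.linalg.norm(psi[:2]) ** 2
p1 = np.linalg.norm(psi[2:]) ** 2

# "Collapse": one outcome's probability goes to one, the other's to zero, and the
# post-measurement state is the surviving branch, renormalised.
outcome = np.random.choice([0, 1], p=[p0, p1])
branch = psi[:2] if outcome == 0 else psi[2:]
post = np.zeros(4)
if outcome == 0:
    post[:2] = branch / np.linalg.norm(branch)
else:
    post[2:] = branch / np.linalg.norm(branch)
print(f"P(0)={p0:.2f}, P(1)={p1:.2f}, outcome={outcome}, post-measurement state={post}")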

The use of the word 'observation', in the context of wave-function collapse, is an unfortunately chosen historical metaphor, since it seems to imply that conscious beings might need to be involved (NS, 02-May-2015, p33); which is not only disconcerting, but problematic, since we still do not have a definition of consciousness, even now, after millennia of investigation (NS, 04-May-1996, p20).

Many worlds interpretations

An alternative to wave-function collapse is to assume that the Schrödinger equation splits the universe in two. In some ways this just moves it to being a symmetry-breaking "which branch is which" problem. Smolin argues that these proposals are just devices for handling our lack of knowledge about the universe (NS, 17-Jan-2015, p24). Many of the versions would have implications for free-will (NS, 27-Sep-2014, p32).

Smolin uses Bell's theorem to construct a model (NS, 24-Aug-2019, p34) that perhaps involves branching to multiverses, and merging back again (NS, 24-Aug-2019, p7). Wiseman proposes a "Many Interacting Worlds" model (NS, 08-Nov-2014, p6) in which the behaviour of the quantum mechanical system is the blurred average behaviour from several universes (of the order of 41) that interact fairly strongly with each other. Over the years, many completely different types of parallel worlds, multiverse, and many worlds theories have been proposed (NS, 21-Jan-2017, p28). Tegmark (2015) presents a four-level classification (NS, 26-Nov-2011, p42) with the third level being the one that replaces the need for wave-function collapse.

  1. The causal patches that lie beyond each other's de Sitter horizons
  2. The 10^500 different types of bubble de Sitter space-times in the KKLT string-theory landscape (NS, 20-Jul-2019, p34; NS, 02-May-2009, p35). It is because of this large number that string theory is notoriously unfalsifiable (NS, 14-Jul-2007, p30)
  3. Many worlds, with quantum wave-function collapse replaced by branches in Hilbert space. Bousso and Susskind suggest that this is just an extension of level-1 (NS, 04-Jun-2011, p8)
  4. Different mathematical structures for the universe (NS, 15-Sep-2007, p38; NS, 21-Nov-1992, p36), each founded on a particular set of symmetries. Symmetry is one property that manifests internally, without the need for a third party vantage point outside. There is a bounded periodic table of all the symmetries that are possible in mathematics (NS, 14-Jun-2008, p38) with 10 classes, perhaps related to the 10 dimensions of some versions of string theory (NS, 17-Oct-2020, p40). (This might also be related to the debate as to whether mathematics is invented or discovered (NS, 16-May-2020, p44; NS, 02-Sep-2017, p30) and perhaps implicated in the way that the sub-atomic particles can be mapped on to a 248-vertex E8 pattern in 8D space (NS, 17-Nov-2007, p8).)

Hidden variables interpretations

Various attempts have been made to explain how the probabilistic behaviour could arise from an underlying deterministic behaviour (NS, 23-Jun-2007, p30). One is a refinement of the pilot-wave idea (NS, 22-Mar-2008, p28) and new ways of viewing the Bohm interpretation (NS, 27-Feb-2016, p8), and how it might even be supported by an experiment on classical waves in an oil bath (NS, 08-Apr-2017, p28). However, Bell's theorem seems to argue against this. It tells us that, if the results of quantum mechanics experiments are valid, our problems and objections are either with the 'at a distance' (relativity), the 'action' (objective reality) or the 'spooky' (ghost in the machine, free-will, consciousness) implications of entanglement (NS, 26-Feb-2011, p36; NS, 03-Aug-2013, p32). For the third of these, it might not necessarily be the observer's free-will that is in question, but an inherent limit on our ability to close all the loopholes in our freedom to perform independent experiments (NS, 18-Jun-2005, p32). Experiments have subsequently been performed that significantly tighten up on these loopholes (NS, 17-Nov-2018, p28; NS, 05-Sep-2015, p8), including ruling out the possibility that the measurement of the state of one of the particles could tamper with the mechanism of the random number generator (NS, 11-Feb-2017, p7). Experiments have also been run to test whether the Bell Inequality is affected by the presence of conscious human minds in the decision as to which parameter to measure in each entangled state of a pair of particles (NS, 27-May-2017, p7).
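
As a hedged illustration (my own, not taken from the article) of what these Bell-test experiments check, the CHSH combination of correlations measured on an entangled singlet pair reaches 2√2 in quantum mechanics, beyond the bound of 2 obeyed by any local hidden-variables model:

import numpy as np

# Spin measurement operator along an angle theta in the x-z plane.
sz = np.array([[1.0, 0.0], [0.0, -1.0]])
sx = np.array([[0.0, 1.0], [1.0, 0.0]])

def spin(theta):
    return np.cos(theta) * sz + np.sin(theta) * sx

# Singlet state (|01> - |10>)/sqrt(2).
psi = (np.kron([1.0, 0.0], [0.0, 1.0]) - np.kron([0.0, 1.0], [1.0, 0.0])) / np.sqrt(2)

def E(a, b):
    # Quantum correlation <psi| spin(a) x spin(b) |psi>.
    return psi @ np.kron(spin(a), spin(b)) @ psi

a, a2, b, b2 = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(f"CHSH value |S| = {abs(S):.3f} (local hidden-variables bound: 2, quantum maximum: 2.828)")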

Thermodynamic gravity

The long-sought Theory of Everything might simply be already embodied in the second law of thermodynamics (NS, 13-Oct-2012, p32). Sean Carroll argues (NS, 14-Sep-2019, p34) as follows: Due to quantum uncertainty, we do not know exactly what answer we will get if we measure the field at some location, but entanglement means that the answer that we get at one point will be correlated with what we would measure at any other point. Entanglement entropy increases the more entangled the quantum degrees of freedom of the fields in a given region are with the rest of the world. With Susskind and Maldacena's ER=EPR (NS, 07-Nov-2015, p30) the amount of entropy must be proportional to the area bounding that region; and whenever entanglement is increasing, energy necessarily flows, thereby yielding Einstein's original equations for general relativity.

Copy-pasting from an answer on Quora: the stretching and thinning at any point in a volume (like in a strand of bubble-gum) imply a "flow of momentum" in different directions in space, because stress has dimensional units of change in momentum per unit time per unit area. A three-dimensional volume requires a 3x3 stress tensor, and a four-dimensional volume requires a 4x4 stress-energy tensor (kept manageable by its symmetry). The "flow of energy" (not part of the stress tensor but part of the stress-energy tensor) is analogous, but involves a direction not in space but in time. The fundamental relation of general relativity, called the Einstein field equations, simply says "Einstein tensor = constant × stress-energy tensor". As momentum and energy flow, spacetime curves, and as spacetime curves, momentum and energy flow.
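
Written out explicitly (in the standard textbook form, which is not quoted in the article), that relation reads:

G_{\mu\nu} = \frac{8 \pi G}{c^{4}} \, T_{\mu\nu}

where G_{\mu\nu} is the Einstein tensor (built from the curvature of space-time), T_{\mu\nu} is the stress-energy tensor, and a cosmological constant term \Lambda g_{\mu\nu} can be added to the left-hand side.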

Externally, we see a collection of mass and energy as just mass; internally, we are able to see the energies of the component parts. Mass is a static (or at least localisable) property, while energy has to have a flow, from hot regions to cooler regions (taking finite time to do so). This is apparent in an external combustion engine, and similarly in an internal combustion engine, leading, by extension, to the notion that the inlet fuel line is also 'hot', in some abstract way (a coded way of saying that it represents a low entropy input). By this means, the notion can be extended to lean-burn and metabolism. The mean of any energy movements looks like a current, and the standard deviation like a temperature. The energy current is able to turn a metaphorical water-wheel (a spatial one like a physical paddle-wheel, or a temporal one like a fly-wheel), with the input energy, E, flowing up to the wheel and being divided between the outflow and the axle of the wheel; this leads to the notion of machines being a type of catalyst. Where there is an energy current, there is an entropy increase.

It is affected time that we perceive as gravity (rather than gravity causing time to be affected). Time, and hence gravity, are an emergent behaviour of entanglement (NS, 18-Mar-2017, p28) with gravity just a consequence of interactions between entangled bits of quantum information, and hence just a consequence of the second law of thermodynamics (NS, 23-Jan-2010, p6). Conversely, where there is an entropy increase, including through the increase in entanglement through decoherence, there must be an energy flow (a convection current). This leads to the possibility of building a quantum motor (NS, 18-Jun-2022, p41): Gavin Crooks notes that (isolated) entanglement can act as a resource akin to low entropy, and hence we can drive an engine with entanglement.

Quantum trajectory theory

The collapse of the wave-function is a gradual evolution (NS, 28-Mar-2020, p34), and the measurement operator should be considered more as a process, one that takes a finite time (NS, 10-May-2003, p28). By performing only weak measurements, it can be interrupted or even reversed (NS, 12-May-2007, p32). Similarly, quantum jumps between different energy states take time, and hence are reversible given the right reversing stimulus during the jump (NS, 08-Jun-2019, p8).

If the laws of the universe are just effective laws, and have evolved to their current state (NS, 30-Jun-2007, p30) the answer to a question like, 'Why these laws and not any others?' is like asking, 'Why these species and not any others?' (NS, 23-Sep-2006, p30). Quantum nature is perhaps just a manifestation of economy of information (NS, 15-Nov-2014, p28); if a given effect follows from a given cause 80% of the time, it would be inefficient for the universe to encode the given cause with 100% coverage.

Memory does indeed appear to be key to an object's identity (NS, 27-Oct-2018, p31). Due to quantum confinement, protons and neutrons, though composite particles, do not manifest their internal structure (NS, 04-Dec-1993, p28). Even the carbon atoms in Buckminsterfullerene molecules (NS, 17-Mar-2007, p36; NS, 15-May-2004, p30) have no discernible landmarks, and similarly for water molecules in a river (even though they do have internal structure). Leggett-Garg inequalities provide a test for whether any given macro-sized object is behaving in a classical or quantum mechanical way (NS, 06-Nov-2021, p38). Molecules of 430, or even 2000, atoms have been shown to exhibit wave-particle behaviours in the double-slit experiment (NS, 28-Aug-2021, p49). But molecules of RNA or DNA, with their inherent memory, could (in principle) be configured to record which slit of the experiment they had passed through, requiring a superposition of states of methylation and non-methylation, or telomere damage and non-damage, or A-G and C-T base-pair swapping. Either this will prevent us from ever seeing anything other than particle-like behaviour when passing viruses or bacteria through the double-slit experiment (NS, 09-Mar-2002, p26), or it will cause a rethink of the meaning of the double-slit experiment.

Firewall paradox

There is a problem of the disappearance of information when matter enters the black-hole, and its subsequent reappearance as Hawking radiation (NS, 04-Feb-2017, p16; NS, 27-Jul-2013, p10) and it still being available to forensic, crime-scene investigators at the event-horizon (NS, 19-Sep-2015, p11). Moreover, the increase of entanglement entropy happens as the total entropy of the black-hole reduces. So, the black-hole information paradox arises because the event-horizon would be too small to hold all the information, leading either to implications of there being a so-called firewall at the event-horizon (in contravention of relativity, which holds that there should be no detectable landmarks at that part of the curvature of space-time) or else an increased speed of light inside the black-hole (NS, 06-Apr-2013, p38). However, at the event horizon, the discrete packets get blurred out over space and time, and the states of the emerging Hawking radiation might even be entangled in time with earlier versions of itself, thereby blurring the information further over time (NS, 25-Sep-2021, p34). Perhaps the in-falling information is not completely destroyed in the wave equations (NS, 26-Mar-2022, p10). From our perspective, black-holes never get formed; for external observers, a black hole is 'an event horizon in formation', with nothing meaningful inside (not only untested experimentally, but outside our known laws of physics).

Maybe the entropy of the Hawking radiation can be presented as a Generalised uncertainty principle (GUP) (NS, 10-Feb-2018, p15). The many-worlds interpretation (Tegmark's 3rd level) could resolve the firewall paradox since it is in the many-worlds that the information content is conserved, not in the individual branches (NS, 06-Jan-2018, p14).

Holographic mapping

Another attempt to resolve the firewall paradox is to propose that the whole universe is a hologram (NS, 16-Jul-2011, p6; NS, 17-Jan-2009, p24; NS, 27-Apr-2002, p22) and that reality is somehow lived out, distributed round a two-dimensional surface. Such a holographic nature of the universe might be detectable via any consequent violation of Lorentz symmetry, if space-time turns out to have a preferred 'weave' in specific directions (NS, 16-Aug-2003, p22). The lambda-CDM cosmic principle rests on there being an even distribution of matter, on large enough scales, in all directions, but this assumption might be questioned by recent observations (NS, 13-Mar-2021, p14).

Along with Causal Dynamical Triangulation (CDT), in which the granules of space-time are tetrahedrally, nearest-neighbour connected in an (n-1)-dimensional topology (NS, 14-Jun-2014, p34), Quantum Einstein Gravity, Quantum Graphity and Internal Relativity (NS, 03-May-2008, p28) each allow a connectivity that can vary from infinite (thereby doing away with the need for cosmic inflation as an explanation), through 3+1 (thereby modelling normal space-time), down to 2 on small scales (perhaps hinting at a holographic principle).

Maldacena's Anti-de-Sitter/conformal field theory (AdS/CFT) correspondence, which permits a hologram-like conversion between quantum field theory without gravity in a four-dimensional space-time, and quantum gravity with a negative vacuum energy in five dimensions (NS, 12-Oct-2013, p36) would only work if the curvature of space-time were negative (NS, 30-May-2009, p34) and would imply a universe without any global symmetries, and their corresponding conservation laws. As noted earlier, though, Cao and Carroll have shown that it might be possible to make this work for a universe that has a flat geometry, too, if there is a conservation law for the amount of entanglement in a given volume of space-time (NS, 23-Dec-2017, p10). The AdS/CFT duality could then offer a way to unify the space-time view of relativity with the particle-field view of quantum mechanics (NS, 04-Feb-2017, p28). Information could then have an interpretation in terms of space-time (NS, 07-Nov-2015, p30) with space-time arising as a consequence of entanglement in the quantum field.

Quantum thermodynamics

For relativity, the link is to the area of the event horizon of a black hole; for quantum mechanics, the link is to entanglement entropy (NS, 07-Dec-2019, p34). Classical thermodynamics is derived from the study of motions of many particles; quantum thermodynamics is derived from the study of entanglement and superposition of energy states in just a few particles (NS, 07-Apr-2018, p32). The arrow of time might then be due to the states of the particles of the universe dissipating information, and their states becoming ever more entangled (NS, 04-Feb-2017, p31). Causality ceases to apply when there is entanglement (NS, 03-Aug-2013, p32). Measuring one of a separated but entangled pair of particles can be used to reduce the uncertainty on one of the measures in Heisenberg's uncertainty principle, arbitrarily close to zero (NS, 30-Apr-2011, p28). This process is called squeezing. In quantum mechanics, the principle of unitarity is the principle of conservation of information (and determinism). A more fundamental principle, underlying quantum mechanics, might involve abandoning the principles of causality and/or conservation of information (NS, 16-Jun-2018, p28), leading to the concept of quantum causality (NS, 18-Jan-2020, p34).

Atomic clocks can only be improved up to a certain limit, beyond which point they become a different tool, such as a detector of gravity-waves, or a test of quantum entanglement in a gravity field (NS, 18-Jun-2022, p44). One clock, composed of a 1mm stack of 30-atom pancakes of strontium, promises to start functioning as such a tool (NS, 26-Feb-2022, p21; NS, 18-Jun-2022, p44).

Material and light spiralling in towards the event horizon of a black-hole can even be used as a tool for looking back at short recordings of the surrounding universe (NS, 01-Aug-2020, p30). Simply recording a movie-film of the behaviour of the accretion disk is the next step for the team of the Event Horizon Telescope (NS, 21-May-2022, p9), now that they have produced still images of Sgr A* and M87* (NS, 25-Jun-2022, p46), albeit with other teams calling these images into question (NS, 28-May-2022, p14).

Wootters proposes a physical manifestation of what we presently handle in mathematical models of quantum mechanics using the square-root of minus one, to account for the increase of information at the symmetry-breaking of wave-function collapse (NS, 25-Jan-2014, p32).

"Causal patch measure" is a way of deriving statistical measures across a multiverse that is equivalent to, and produces comparable results to, the holographic principle (NS, 06-Mar-2010, p28). Meanwhile, the resolution of the search for the 3D Ising model, via a bootstrapping approach, could solve many problems concerned with phase transitions, many-body strongly coupled systems, superconductivity, and quantum mechanisms of the strong nuclear force and the AdS/CFT duality (NS, 18-Feb-2017, p28).

Relational interpretation

The relational interpretation (NS, 13-Mar-2021, p36) starts from the point that the universe is constituted of relations, not of absolute things (such as particles or waves). Our measuring devices do not measure position or time, for example, but separation and duration. Relativity starts from the premise that the laws of physics are the same in all reference frames, and then provides transformations that show how things change from one reference frame to another. A similar starting point is needed for quantum mechanics, too, showing, for example, how from the perspective of an electron in a double-slit experiment, it is the two slits that are in a state of superposition, not the electron. Just as a particle can be in a superposition of being in two locations, so a location might be able to be in a superposition of containing two particles, such as in the quantum pigeon-hole principle (NS, 02-Aug-2014, p8).

This does not immediately work for the Wigner's Friend thought-experiment, since Wigner and his friend are not looking across the same Heisenberg Cut. Indeed, the key is that they are not looking at each other at all. It is only when they start to communicate that the reality that they have in common starts to take shape. When Alice sends Bob a hundred electrons, all in a state of spin-up, Bob might be measuring spin-left and obtaining random correlations, until, gradually, the frame between him and Alice shifts (NS, 05-Feb-2022, p38) at each measurement. "'To gain information about a quantum system, you have to pay energy.' Every time Bob chooses the correct axis, he loses a bit of energy; when he chooses wrong and erases Alice's information, he gains some. Because the curvature of space-time depends on the energy present, Bob ends up changing the orientation a tiny bit." The coordinate system becomes non-Euclidean, and non-commutative (a measurement involving moving five units up followed by another to move two units left does not end in the same position as applying the measurements in the other order). With algorithmic information theory (NS, 11-Nov-2017, p28), with its quantum relativity, participatory-universe view of interactive collaboration, both Wigner and his friend can be right, in their respective contexts.

Since "GR says that the causal structure can vary (since there is no such thing as 'simultaneous'), and QM says that anything that can vary can be in a superposition" (NS, 31-Mar-2007, p30) there is the prospect of building a Quantum Gravity Computer, in which the output does not necessarily need to follow input.

Locality and diagonal matrix form

Newton disliked the implication of action at a distance in his theory of gravity. Einstein found a way of eliminating it using local effects of a global geometry. It reappears, though, under Heisenberg and Bell, as a spooky action at a distance. It disappears under holographic interpretations. It is interesting that the strong nuclear interaction grows stronger as the distance is increased; a strong nuclear force Marconi would have been able to signal over vast distances, had it not been for the strong nuclear Faraday screening of confinement.

Quantum mechanics is a non-local theory (local realism is not supported) in which we are restricted to making only local measurements. When a change is made to the context of the question that is being asked of the quantum system, it changes the answer that will be received back; non-locality is just one possible manifestation of contextuality and the ability to distinguish between reruns of the experiment (NS, 16-Mar-2019, p38).

The non-locality aspect of entanglement can be resolved by retro-causality (NS, 17-Feb-2018, p28) since even if a future choice affects an unknowable measurement now (such as a particle's position or momentum) it cannot be used to communicate from the future to the past. Quantum post-selection leads to retro-causality (NS, 20-Nov-2010, p34; NS, 30-Jun-2007, p18; NS, 30-Sep-2006, p36), and has been demonstrated over the 3500 km distance of a transmission to a satellite and back (NS, 04-Nov-2017, p12). Retro-causality perhaps implies a Wheeler-like participatory universe, and is also tied up with the reasoning behind It from Bit. Since the information content of the universe is limited by its size, in the early universe, maybe it was more amenable to this quantum post-selection, with the universe fine-tuning itself for the conscious beings that would eventually evolve to observe it. Thus, quantum post-selection can lead to a variant of the anthropic principle, and to a sort of quantum Darwinism (NS, 28-Aug-2021, p37), acting so that the fittest (most durable) wave-function collapses predominate, thereby allowing an objective reality to emerge that can be measured by independent observers. "Quantum Darwinism says that the preferred [observable] states are those that disseminate copies of themselves in the environment so as to more easily allow a set of independent observers to reach a consensus about the result of the measurement." Maybe quantum post-selection could, through extended synthesis mechanisms (NS, 26-Sep-2020, p45) applied to quantum Darwinism, lead to some notion of future humanity being our creator.

All the universe has some effect on all the rest, albeit with a speed-of-light time delay. Gell-Mann and Hartle (1990) describe how matrices become successively dominated, at each particle-particle interaction, by the terms on the leading diagonal. Thus, it is the repeated particle-particle interactions that lead to localised behaviour. As a result of the virtual particles of the quantum vacuum, an accelerating or rotating body should experience the glow of the quantum vacuum (NS, 03-Nov-2001, p30) perhaps contributing to an explanation for the apparent correspondence between inertial mass, gravitational mass, and the other three fundamental forces (NS, 03-Feb-2001, p22).
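
A toy sketch (my own illustration, not Gell-Mann and Hartle's actual calculation) of that diagonal dominance: repeatedly mixing a two-state density matrix with randomly phase-kicked copies of itself, as an environment of scattering particles would do, drives the off-diagonal coherences towards zero while leaving the leading diagonal untouched:

import numpy as np

# A two-state system starts in an equal superposition, so its density matrix has all
# four entries equal to 0.5. Each interaction with an "environment particle" imprints
# a random relative phase; averaging over such events damps the off-diagonal terms.
rng = np.random.default_rng(0)

psi = np.array([1.0, 1.0]) / np.sqrt(2)
rho = np.outer(psi, psi).astype(complex)

for _ in range(50):
    phi = rng.uniform(0.0, 2.0 * np.pi)
    kick = np.diag([1.0, np.exp(1j * phi)])
    rho = 0.5 * rho + 0.5 * (kick @ rho @ kick.conj().T)

print(np.round(rho, 3))   # the diagonal stays at 0.5; the off-diagonal terms decay to ~0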

Recasting general relativity in terms of torsion instead of curvature leads to Einstein's teleparallel gravity, capable of explaining away phenomena that we presently attribute to cosmic inflation, dark energy, dark matter and the Hubble tension (NS, 14-May-2022, p46; NS, 29-Jan-2022, p10; NS, 16-Oct-2021, p46), and is one of the research aims of the James Webb Space Telescope (NS, 09-Jul-2022, p43). Considering the curvature in momentum-energy space, instead of in space-time, relative locality drops out more readily (NS, 06-Aug-2011, p34).

Using amplituhedra as a sort of multidimensional version of Feynman diagrams (NS, 29-Jul-2017, p28), the results of calculations in quantum chromodynamics are equated to the multidimensional polyhedron's volume. Not only does the method generate tractable and correct results, but it suggests that 'locality' is an emergent property. Unfortunately, at present, the tool only works for super-symmetric quantum mechanics.

Loop quantum gravity

With Loop Quantum Gravity (LQG) the subatomic particles are considered to be the manifestation of vibrations, at various modes, in the granules of space-time. It originated from work on Wilson loops and spin-networks (NS, 22-Jan-2005, p33). Experiments have been proposed that might determine whether space-time is quantised, and at what granularity (NS, 07-Mar-2015, p12; NS, 15-Aug-2009, p26). Rovelli argues that the granularity must be at the level of the Planck dimensions (NS, 06-Jul-2011, p14). Other experiments are proposed, to look for astronomical evidence of black-holes that collapse to the quantum loop size, and then rebound as a white hole emitting at a characteristic frequency (NS, 15-Dec-2018, p3; NS, 02-Jan-2016, p32).

One major incompatibility between general relativity and quantum mechanics is indeed in their respective views on the nature of time: relativity views time as a local, nearest-neighbour phenomenon, while string theory views it as an externally present framework. Loop quantum gravity offers one possible resolution to this, by proposing that the string length, in string theory, is somewhat longer than the granule size in LQG, thereby explaining why string theory sees space-time coordinates as an external background mesh (Smolin (2001)).

LQG replaces the notion of a top-down, external, all-encompassing framework of space-time coordinates with a bottom-up, nearest-neighbour, local interface between atomic granules of space-time. Like a cellular automaton (NS, 21-Jun-2003, p32; NS, 06-Jul-2002, p46) it involves just nearest-neighbour communications, working on simple, local rules whose amassed behaviour (summed over huge assemblages) approximates to our familiar laws of physics, perhaps via the mechanism of a holographic projection (NS, 11-Mar-2017, p28).

Perhaps the unification of quantised, natural number algebra with continuous geometry could be a key (NS, 28-Apr-2018, p30). Markopoulou suggests that all of the fundamental particles might consist of qubit-like braids of space-time (NS, 12-Aug-2006, p28), which might then explain why the universe appears quantised (NS, 10-Nov-2001, p40), and hence the significance of knot-invariant properties (NS, 18-Oct-2008, p32). Entanglement, too, might be explained by topological properties of the states of the fundamental particles (NS, 08-Jan-2011, p10). The long-sought proof of the Riemann hypothesis, with the non-trivial zeros of the zeta function (intimately tied to the prime numbers) all lying on the vertical line 0.5+i.t (NS, 22-Mar-2008, p40), might even be found first in the energy states of a suitably chosen quantum system, such as a large atom or molecule (NS, 11-Nov-2000, p32), with a possible connection, too, with proving the Schanuel conjecture (NS, 21-Jul-2007, p38).

Constructor theory

Constructor theory is ultimately more general than any quantum physics interpretation, being a theory of a universal assembler of things, in the same way that a universal Turing machine is to computation (NS, 17-Apr-2021, p34). It is a theory that addresses what can and cannot be done physically. While Newtonian physics addresses the observation of what does happen, constructor theory addresses what can happen (the actual) and what cannot happen (the counterfactual). A message only contains information if an alternative (counterfactual) message had been possible (NS, 24-May-2014, p30). Many of our tools already argue in this static, whole-possibility-space way, including the for-all array operators in APL, cluster states, symplectic integration (NS, 19-Mar-1994, p32), and the aim in physics of seeking invariant properties. Counterfactuals also capture the way that, despite all interactions in physics being reversible, stirring a beaker of fluids in the opposite direction neither cools it down nor unmixes the constituent fluids.

The laws of physics can be viewed as arising from constructors that work on information (NS, 24-May-2014, p30). They could also give an explanation of what constitutes knowledge (NS, 01-Apr-2017, p30), and the role of a knowledge creator (such as a conscious mind). Since the laws of physics of the universe must reside inside the universe, not as some abstract concept outside, they can only be as precise as can be calculated from the total information content of the universe. Allowing our theories to work with an infinite number of numbers, quotable to an infinite number of decimal places, shows that mathematics is an approximation of the universe rather than the other way round (NS, 17-Aug-2013, p32).

Gell-Mann and Hartle (1990) talk of "information gathering and utilising systems" (IGUS) with possible implications on free-will (p454; NS, 01-May-2014, p34) and how it is not the observer (or the act of observation) that creates reality (p453).

Viewing the universe as a quantum computer

Wheeler's "It from Bit" (Zurek (1990)) proposes that all the matter of the universe is made of information (NS, 17-Feb-2001, p26). It is then tempting to consider the universe to be a vast quantum computer in much the same way that people in the previous industrial revolutions considered the universe to be like a giant system of wheels, a giant heat engine, or a giant conventional computer. Even though never the complete answer, this is still a useful approach because advances in the new technology usefully feed back into our understanding of how the universe works.

Distributed processing, optical computing, quantum computing and DNA computing are all possible avenues to get us back on to the curve of Moore's Law (NS, 14-Mar-2020, p40). It is possible to consider Moore's law for conventional computers continuing up to the Planck limit (NS, 02-Sep-2000, p26). Some suggest that quantum computing could beat the Turing halting problem (NS, 06-Apr-2002, p24; NS, 19-Jul-2014, p34), though it can be noted that if the travelling salesman problem is representative, that problem decidedly counts as computable (merely intractable). Whether any replacement of the Turing halting problem counts as different, or a mere rewording of the original, might just be a matter of taste. Chaitin describes a hierarchy of Omega numbers that would remain forever non-computable (NS, 10-Mar-2001, p28), though this is later questioned by a demonstration of the computation of its first 64 bits (NS, 06-Apr-2002, p27). Meanwhile, the polynomial hierarchy (PH) can be beaten by bounded-error quantum polynomial (BQP) algorithms (NS, 09-Jun-2018, p7).

Since the idea of space-time implies that all four axes are to be treated similarly, many have contemplated the possibility of time-travel (NS, 20-Sep-2003, p28), with some arguing that relativity is ambivalent on the possibility (NS, 20-May-2006, p34). Many paradoxes have been highlighted, along with many attempts at their resolution (NS, 28-Mar-1992, p23). Hawking proposes a chronology protection conjecture (NS, 08-Oct-2011, p50), and some argue that the second law of thermodynamics could mean that time-travel, though perhaps possible, must cost the time machine at least as much energy to run as is needed to achieve the entropy change that the time-travel brings about (though there is a paradox in this, too, since the time-traveller going back in time with the intent of making a change would be doing so having already learned that there was no change to be made). Deutsch (1998) links quantum computing, the multiverse interpretation and a generalised treatment of the Turing halting problem, using Cantor's diagonalisation argument, to argue the consequences for the prospects of time travel.

Tools

Random-matrix theory might be used as a tool (NS, 10-Apr-2010, p28), as might the same sort of deep-learning program that was originally developed for playing Go (NS, 24-Aug-2019, p38; NS, 28-Oct-2017, p36; NS, 18-Feb-2017, p12). Google's DeepMind has been used to help develop conjectures and theorems in mathematics, specifically in knot theory (NS, 11-Dec-2021, p16), and to predict protein folding, with the potential to be applied to other problems (NS, 11-Dec-2021, p28).

Quantum interaction is a field of study that uses the techniques of quantum mechanics for their applicability in other fields (NS, 03-Sep-2011, p34; NS, 11-Dec-2010, p10).

Causal inference (NS, 25-Apr-2020, p32) is a tool for statistics that introduces a 'do' operator to distinguish between cause, effect and correlation.

General relativity applied to an empty, flat region of space-time, gives rise to an infinite series of super-translation symmetries (NS, 20-Oct-2018, p45).

Fluctuation relations theory allows for the localised peaks and troughs from the mean to be factored into the treatment (NS, 12-Jun-2021, p46).

One tool for describing systems with multiple materials is the fractional order derivative (for example, between first order differentiation and second order differentiation, there are fractional orders like the 1.6th order differentiation operator) which can be used when modelling complex real-world problems that have multiple phases of materials, and behaviours that exhibit memory (NS, 13-Nov-2021, p44).
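
As a hedged illustration (my own, not from the article) of what a fractional-order derivative looks like in practice, the Grünwald-Letnikov construction approximates it as a weighted sum over the function's past values (which is where the memory-like behaviour comes from); here the 1.6th derivative of f(t) = t^2 is checked against its known closed form:

import math

# Grünwald-Letnikov approximation of the order-alpha derivative of f at time t,
# using n steps of the function's history back to t = 0.
def gl_fractional_derivative(f, alpha, t, n=2000):
    h = t / n
    total, coeff = 0.0, 1.0                 # coeff holds (-1)^k * binomial(alpha, k)
    for k in range(n + 1):
        total += coeff * f(t - k * h)
        coeff *= -(alpha - k) / (k + 1)     # recurrence for the next weight
    return total / h**alpha

alpha, t = 1.6, 1.0
approx = gl_fractional_derivative(lambda x: x**2, alpha, t)
exact = math.gamma(3) / math.gamma(3 - alpha) * t**(2 - alpha)   # known closed form for t^2
print(f"Grünwald-Letnikov: {approx:.4f}, closed form: {exact:.4f}")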

The first split second

The first split second after the big bang is one situation where a theory that unifies general relativity and quantum mechanics is particularly sought (NS, 24-Sep-2016, p28). It is also the one in which many of the hierarchy problems and fine balances manifest themselves (NS, 16-Feb-2019, p36):

  • The mass of the Higgs particle is almost, but not quite, cancelled out by those of the other particles, by a factor of 125:10^19 (NS, 19-Jul-2008, p36), and apparently is not explainable by SUSY
  • Dark energy (NS, 17-Feb-2007, p28) might have been explained by vacuum energy, had there not been a massive 120 orders of magnitude discrepancy (NS, 01-Nov-2003, p34)
  • Dark matter dominates matter in the ratio of 4:1
  • Dark matter interacts so weakly with matter
  • Matter and energy exist in a ratio of around 1:10^9, and hence with marginally more matter than antimatter (NS, 29-Feb-2020, p44; NS, 23-May-2015, p28; NS, 12-Apr-2008, p26), in the ratio 1000000001:1000000000
  • The mass of the neutrino is so much less than that of the other particles. This might also imply the existence of a heavy right-handed flavour of neutrino (NS, 03-Apr-2021, p22; NS, 11-Apr-2020, p34).

Adding yet an extra space and an extra time dimension, but constrained by the gauge symmetries that led to Heisenberg's uncertainty principle, would lead to holographic principles that explain the connection between electron orbits round an atom with the expansion of the universe, and between quantum chromodynamics and the lack of evidence of anyons (NS, 13-Oct-2007, p36). Adding some sort of axiflavon particle could explain the unexpected conservation of CP symmetry in strong force interactions, the disparate masses of the six quarks, cosmic inflation, the origins of dark matter, and the surprisingly light mass of the Higgs particle (NS, 18-Aug-2018, p28).

It has often been suggested that the coupling constant of gravity is so much smaller than those of the other three fundamental forces because the force 'leaks' away to the extra dimensions of string theory (NS, 27-May-2017, p28), and that this might also explain why the "anti-gravity" of dark energy is so much weaker than the vacuum energy (NS, 14-Mar-2009, p38). However, the experimental evidence is now against there being any such leakage (NS, 24-Nov-2018, p10). Meanwhile, there would also be many implications if the graviton turned out not to be massless (NS, 11-Jul-2020, p30).

Emergence of something rather than nothing

We are fortunate in the way that inductive logic works in our universe: by Noether's theorem (NS, 25-Apr-2015, p33; NS, 27-Jul-2013, p50) the chemical experiment and the clinical trial that we do today, in the controlled environment of the laboratory, are valid indications of how they will work tomorrow in the shopping mall. At anything less than galactic super-cluster scales, space is Euclidean, and so is time (running a physics experiment one minute earlier than planned, or one minute later, or one metre over to the left or right, or orientated NWbN or NWbW, has no effect on the results that are obtained from the experiment).

During periods of cosmic inflation, and also in regions that experience cosmic accelerating expansion, though, energy is not conserved. Meanwhile, the zero-sum argument proposes that the energy of the contents of the universe is balanced by a corresponding negative value elsewhere, and hence why there is something rather than nothing (NS, 03-Sep-2016, p32).

As to why the something manifests itself as matter, there are many experiments attempting to observe instances of neutrinoless double-beta decay (NS, 13-Feb-2016, p30) and the possibility of CP-violation in baryon reactions (NS, 11-Feb-2017, p14). This would also have implications as to why time has a forward-flowing bias (NS, 22-Nov-2008, p32). Perhaps neutrino flavour changing contributes sufficiently to CP violation (NS, 25-Apr-2020, p14).

Emergence (or not) of cosmic inflation and dark energy

Cosmic inflation was originally proposed as a solution to why the universe is now so uniform (NS, 03-Mar-2007, p33), though it is not a solution that is universally accepted (NS, 17-Aug-2019, p42; NS, 07-Jan-2008, p30) and has yet to be confirmed by direct observation (NS, 19-Jan-2019, p38). It is believed to have started at 10^-35 s, when the universe had a size of 10^-27 m, and ended at 10^-32 s, when it had a size of 10^3 m.

The present accelerating cosmic expansion is based on astronomical observations, perhaps even implying an eternally inflating universe with multiple bubbles (NS, 26-Jun-2021, p26). It, too, is not universally accepted, though (NS, 11-Apr-2009, p6).

Emergence of a low-entropy state

The initial, symmetry-breaking (NS, 22-Oct-1994), low-entropy starting state of the universe is, at first, a puzzle (NS, 08-Oct-2011, p39; NS, 15-Oct-2005, p30). One answer is that the reason that big bang nucleosynthesis created hydrogen and helium rather than iron is that it is still partway through that process of creating iron; we are still within the big bang event. Meanwhile, when the cosmic microwave background (CMB) radiation came into being, the universe was extremely uniform, which sounds like a very high entropy state at the end of one type of process, but now represents a low entropy state for gravity, the now dominant long-distance force, which works towards clumping matter together. There must have been a point at which the universe switched, for example at the emergence of the Higgs field, or of cosmic inflation, or from the unified fundamental forces being dominant to gravity separating out and becoming dominant (NS, 12-Nov-2005, p27).

Emergence of the Higgs field

The universe is balanced in a state of unstable equilibrium between false vacuum and true vacuum (NS, 29-Oct-2016, p32), with the possibility of this being confirmed experimentally (NS, 04-Dec-2021, p12). Up to 10^-36 s after the big bang, the strong and electroweak forces were indistinguishable (NS, 25-Nov-2017, p9), but then the strong nuclear force became distinct. At temperatures still over 10^15 K, which persisted until 10^-9 s, the electroweak force was still unified, and the charged fermions massless. Normally, a field has the lowest energy when it is free of excitations, but with the Higgs field, when it is in its lowest energy state, some excitations are still present, leading to the so-called Mexican hat shaped curve (NS, 14-Nov-2015, p36). So even when starting from a vacuum with no excitations, it decays into this lower energy state, with some excitations present, characterised by the vacuum expectation value. When the symmetry was broken, and electromagnetism and the weak interaction became distinct, the particles mediating the weak interaction became massive (making the interaction very short range, which we perceive as being weak), and charged fermions now interact with the vacuum expectation value, which we perceive as mass (NS, 03-Jul-2021, p44). It has also been noted that the symmetry-breaking event that turned on the Higgs field (NS, 03-Nov-2018, p14) could also explain cosmic inflation, if the Mexican Hat curve has outlying alps (NS, 10-Jun-2017, p30).
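
For reference (in the standard textbook form, rather than anything given in the article), the 'Mexican hat' potential for the Higgs field φ, and the non-zero vacuum expectation value v at its minimum, can be written as:

V(\phi) = -\mu^{2}\,|\phi|^{2} + \lambda\,|\phi|^{4}, \qquad \langle|\phi|\rangle = \frac{v}{\sqrt{2}}, \quad v = \sqrt{\frac{\mu^{2}}{\lambda}} \approx 246\ \mathrm{GeV}

so the zero-excitation point φ = 0 sits on top of the hat (a local maximum), and the field settles into the circular valley of minima, whose radius sets the vacuum expectation value.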

Emergence (or not) of dark matter

Dark matter was originally proposed to explain the distribution of orbital motions of the stars within galaxies (NS, 16-Nov-2019, p34). It might consist of primordial black-holes, some of which might have left crater evidence on the moon (NS, 02-Oct-2021, p46). Alternatively, it might be made of particles that do not interact with any of the forces other than gravity, or do so only weakly (NS, 15-Aug-2020, p24). Candidates include neutrinos, sterile neutrinos (NS, 13-Nov-2021, p28), WIMPs, or axions that then get coaxed into a Bose-Einstein Condensate (BEC) state; or, maybe, particles in some mirror universe (NS, 08-Jun-2019, p34). Another possibility is that it is associated with a fifth fundamental force, a dark force (NS, 17-Apr-2021, p14; NS, 03-Apr-2021, p18; NS, 16-May-2020, p30; NS, 26-Nov-2019, p11; NS, 24-Sep-2016, p28), and that furthermore it is this, rather than the Higgs field, that gives neutrinos their mass, in which case this mass would vary with the amount of dark matter in the region (NS, 24-Mar-2018, p16). Experimental evidence hints at a possible fifth force, either via a Z' carrier that acts asymmetrically between bottom-to-strange decays with the emission of a pair of muons or a pair of electrons, or else via a unifying leptoquark (NS, 15-Jan-2022, p38; NS, 18-Jun-2022, p42).

There is a discrepancy between the clumpiness predicted and observed in the distribution of dark matter (NS, 08-Aug-2020, p34). Perhaps dark matter does not exist at all, and it is our models of gravity that are wrong at scales beyond our normal experiments, with Modified Newtonian Dynamics (MOND) applying instead (NS, 02-Apr-2016, p30). However, patterns in the CMB seem to support the dark matter explanation (NS, 06-Mar-2021, p26), as do the frequency of galaxy mergers and the slowing of the rotation of the Milky Way (NS, 19-Jun-2021, p16). The blending between the two mechanisms could involve switching from the BEC state to an ordinary dark-matter state, depending on the strength of the gravity field; the entanglement in the states of the particles generated from the vacuum energy would then be modelled as a type of elasticity (NS, 02-Apr-2016, p30) that deforms like Einsteinian gravity at short ranges, and stretches like MOND at longer ranges.

The split between a quantum and a classical macroscopic universe could be thought of as a 'quantum death' of the universe (NS, 29-Mar-2014, p32), prior to which there was no speed-of-light limit on information transfer, such as via entanglement. Perhaps, even today, at sub-Planck scales, faster-than-light communications between non-quantum particles still takes place (NS, 29-Jun-2002, p30). If only a minority of the particles generated in the big bang subsequently became quantum in nature, dark matter might be made up of the remaining particles; being non-quantum, and still capable of superluminal-speed communication, these particles might be expected to interact only weakly with the quantum particles of ordinary matter.

Several of the balances at the start of this section might collectively be explained by axions (NS, 12-Dec-2020, p18; NS, 25-Jul-2020, p46; NS, 27-Jul-2019, p7; NS, 14-Nov-2015, p36), whose hypothesised existence in the present epoch of the universe might eventually be detectable as dark matter around the event horizon of a black hole (NS, 01-Jun-2019, p46) and might be found by searching for a particular type of Bose-Einstein Condensate star (NS, 12-Dec-2015, p11). Indeed, some astronomical objects, presently classified as black-holes, and some day analysable from their emissions (NS, 06-Oct-2007, p36) might turn out to be BEC stars (NS, 15-Jul-2017, p28).

Temperature, mass, and information content of the universe

The table below combines the data from two articles (NS, 03-Aug-2002, p28; NS, 05-Jul-2008, p28) plus a few other data points (NS, 25-Nov-2017, p9; NS, 03-Mar-2018, p9), but not taking into account a different way of estimating the lifetime of the universe based on the mass of the Higgs boson (NS, 24-Mar-2018, p8). Using a bit of reverse engineering, this seems to indicate an X_n/X_b = 1 - (x_n/x_e)^4 law, where: X_i = log(t_i.T_i^2) and x_i = log(t_i/t_b). (The former requires a scaling of its argument by √(ℏ^3.c^5/G), which is about 2x10^-25.) This equation can be rearranged to give values of T_n as a function of t_n, and hence to give the function T(t):

T = exp( ln(T_b.√t_b) . ( 1 - ( ln(t/t_b) / ln(t_e/t_b) )^4 ) - ln(√t) )

where the suffixes (b, e, n) stand for 'beginning', 'end' and Now, and where:
t_b = √(G.ℏ/c^5) = 5.39x10^-44 s
T_b = √(ℏ.c^5/(G.k^2)) = 1.42x10^32 K
T_e = (ℏ.c^3)/(8π.G.k.m_U)
t_e = 1/T_e^2
t_n = 13.813x10^9 years
T_n = 2.7281 K
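
A minimal numerical sketch (my own code, using the constants above; the only additional input is the fitted mass m_U = 1.58x10^53 kg discussed below) shows that the relation reproduces a present-day temperature of roughly 2.7 K and an equilibrium time of around 5x10^52 years, so it is at least self-consistent as an order-of-magnitude fit:

import math

G, hbar, c, k = 6.674e-11, 1.055e-34, 2.998e8, 1.381e-23   # SI units
year = 3.156e7                                             # seconds per year

t_b = math.sqrt(G * hbar / c**5)                 # Planck time, ~5.39e-44 s
T_b = math.sqrt(hbar * c**5 / (G * k**2))        # Planck temperature, ~1.42e32 K

m_U = 1.58e53                                    # kg (fitted value quoted in the text)
T_e = hbar * c**3 / (8 * math.pi * G * k * m_U)  # Hawking temperature, ~7.8e-31 K
t_e = 1.0 / T_e**2                               # 'end' time (numerically, in SI units)

def temperature(t):
    # The fitted T(t) relation given above.
    ratio = math.log(t / t_b) / math.log(t_e / t_b)
    return math.exp(math.log(T_b * math.sqrt(t_b)) * (1.0 - ratio**4)
                    - math.log(math.sqrt(t)))

t_n = 13.813e9 * year
print(f"T(now) = {temperature(t_n):.2f} K  (measured CMB temperature: 2.7281 K)")
print(f"t_e    = {t_e / year:.2e} years")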

Event | Time (s) | Temp (K)
Planck limit | 5.39x10^-44 | 1.42x10^32
Grand unification split | 10^-36 | 10^28.5
Inflation starts | 10^-35 | 10^28
Inflation ends | 10^-32 | 10^26.5
Electroweak split | 10^-9 | 10^15
Antimatter disappears | 10^-6 | 2x10^13
Primordial pion soup | 10^-4 | 2x10^12
Thermal equilibrium era (p:n=1, γ:(p+n)=10^9) | 10^-2 | 10^11
Hydrogen era (p:n=6) | 10^0 | 10^10
Deuterium era (p:n=7) | 10^2 | 10^9
Helium era (300 years) | 10^10 | 5x10^4
First atoms (CMB) (385000 years) | 1.3x10^13 | 3x10^3
First stars (150 million years) | 4.8x10^15 | 36
Accelerated expansion beats gravity (8 billion years) | 2.52x10^17 | 3.75
Today (13.813 billion years) | 4.36x10^17 | 2.7281
Full equilibrium (5.3x10^52 years) | 1.66x10^60 | 7.77x10^-31

The mass of the observable universe has been determined (from the standing-waves frozen in the CMB) at m_U = 10^53 kg (NS, 16-Dec-2000, p26; NS, 19-Oct-1996, p30). Since its present temperature, T_n, is greater than its Hawking temperature, T_e, this means that it is out of equilibrium, and will continue to cool. The question addressed in the 03-Aug-2002 New Scientist article is, "at what time will this equilibrium state be reached?" The answer reported seems to have assumed, perhaps only implicitly, m_U = 7x10^51 kg. The figures above, though, indicate m_U = 1.58x10^53 kg, and hence t_e = 5.3x10^52 years.

Following a 1979 paper by Freeman Dyson (NS, 03-Aug-2002, p28), it would at first appear that machines in general (and living cells, and thinking brains, in particular) can continue forever to eke out a slower and slower existence, right into the heat death of the universe, as the amount of temperature variation gets smoothed out. However, the universe is expanding, with the monotonic disappearance of potential resources over the de Sitter horizon, a manifestation of the second law of thermodynamics (NS, 15-Apr-2017, p8), and hence out of reach of future generations (NS, 20-Oct-2001, p36). Moreover, since each machine warms as it works, it will need to dissipate heat, so the Hawking temperature imposes new constraints on the duty cycle of the machine.

If I_U(t) ∝ (m_U)^2.t^2 and if T^2.t was roughly constant in the early universe, this suggests that I_U(t) ∝ (m_U)^2/T^4 in the early universe. According to Davies (1990), the Hawking-Bekenstein formula for the entropy of a black-hole, the information content of the observable universe, in nats, is given by I_U = G.(M_U)^2/(ℏ.c). Further, since this would be expected to have increased to its present value, starting at unity at one Planck time after the big bang, this would lead to I_U(t) = (t/t_pl)^2.G.(M_U)^2/(ℏ.c). Since the universe is now 8x10^60 Planck times old, this gives the observable universe a mass of 8x10^60 Planck masses (namely, 1.76x10^53 kg) and, by Davies' formula, an information content of about 2x10^121 nats.
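
As a quick numerical cross-check (my own, not from the text) of Davies' formula against the CMB-derived mass of about 10^53 kg quoted above, a few lines of Python reproduce the order of magnitude quoted for the information content:

# Davies' Hawking-Bekenstein estimate, I_U = G.(M_U)^2/(hbar.c), in nats.
G, hbar, c = 6.674e-11, 1.055e-34, 2.998e8   # SI units
m_U = 1.0e53                                 # kg, the CMB-derived mass quoted above

I_U = G * m_U**2 / (hbar * c)
print(f"I_U = {I_U:.1e} nats")               # of order 10^121, as quoted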

Flow of time

Along with space, time is an emergent property (NS, 15-Jun-2013, p34) and is tied in with why there are three dimensions for space (NS, 28-Sep-2013, p34). There is no absolute time, just durations that are relative to my point of observation (NS, 08-Oct-2011, p37) and just an internally derived view (NS, 08-Oct-2011, p41).

Smolin (NS, 23-Sep-2006, p30) notes that by abstracting out time as just a dimension of space-time, all the physical laws become constant, invariant, and outside of time (NS, 20-Apr-2013, p30), which is strange for a universe that has existed for only a finite time (NS, 22-Nov-2008, p32), though this could merely be an indication that all physical laws are approximations that break down beyond some limit, or at least need to be generalised. Moreover, such a view leaves us with no way of explaining the ultimate breaking of symmetry of why we have the concept of Now, and indeed of past, present and future (NS, 21-Apr-2018, p28).

Gell-Mann and Hartle (1990) note that multiplication by the off-diagonal zeros, implicit too in the And and Or operations when manipulating probabilities (p441), is an irreversible operation (p452). History requires knowledge of both present data and the initial condition (p439), and results in the notion of time (p437) via its partial ordering. This leads to the mechanism of Heisenberg's uncertainty principle (p455), and to how the many worlds view should really be thought of as many histories. From our perspective, there is no definite history of the universe. Using the present state of the universe as the input to infer its origins is more in keeping with scientific method, and certainly more practical, than attempting to start at the big bang and obtain the present state of the universe as the output (NS, 22-Apr-2006, p28). Indeed, the post-inflationary state of the universe might involve a Feynman path integration over the 10^500 initial states of string theory (NS, 28-Jun-2008, p10). Entanglement in time (NS, 27-Aug-2016, p12) consists of the states of a particle becoming entangled with those of an earlier version of itself (NS, 27-Mar-2004, p32). The temporal version of the Bell inequality confirms that, given a particle's initial state and final state, it is not clear that it had any definite history of intermediate states (NS, 04-Dec-1993, p14), or paths in Feynman's diagrams. With Susskind's proposed minus-oneth law of thermodynamics, there are no histories where two distinct states lead to the same state: all physically distinguishable states remain distinguishable, and Liouville's theorem holds that no information can be created or destroyed. Consequently, with a bifurcation in the non-intersecting paths of a phase-fluid, the two branches would never reconverge, but remain forever distinct.

Any description that uses the word 'dynamic', or attempts to explain the manner of the arrow of time 'arising', entropy 'growing', and the 'period' 'when' all this did so (NS, 16-Jan-2016, p8) is either using self-contradictory time-dependent terminology, or implies some hypothetical absolute time, outside our universe, that is distinct from, albeit parent to, the local time that we experience within our universe. Meanwhile, the extended state space of Hamiltonian mechanics seems also to confirm that considering time as just another coordinate, q_0, involves transferring the time-like indexing to a new variable, τ. All such concepts therefore make some direct or indirect reference to dt_local/dt_abs or dt/dw or dt/dτ.
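As a worked sketch of that last point (my own illustration, using the standard extended-phase-space construction rather than anything from the original text): time is promoted to a coordinate, q_0 = t, with conjugate momentum p_0 = −H, and everything is evolved with respect to a new parameter, τ, under the extended Hamiltonian

    K(q_0, q_i; p_0, p_i) = p_0 + H(q_i, p_i, q_0)

whose equations of motion include

    dq_0/dτ = ∂K/∂p_0 = 1    and    dp_0/dτ = −∂K/∂q_0 = −∂H/∂t

so that the time-like indexing has indeed been transferred to τ, and a dt/dτ appears explicitly.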

Speed of causality arrow of time

It is the speed of light (speed of causality) that is nature's way of making sure that things do not all happen at once (NS, 04-Mar-2017, p28). It sets a minimum latency on all processes within the universe. There are various suggestions that the speed of causality, c, might not be a constant, but has only settled asymptotically at the value that we observe today. At the intense energy densities of the first split second after the big bang, the speed of causality might have been much higher (NS, 26-Nov-2016, p8), with a specific testable prediction about a measure called the spectral index for the CMB, which should be 0.96478. However, new values for c could merely rescale the other constants, not least the fine structure constant (NS, 17-Mar-2007, p36; NS, 25-Mar-2006). The speed of information can be faster than the speed of light in the given medium (faster than the group velocity) (NS, 18-Oct-2003, p42).

The apex of my light cone coincides (viewed on a Minkowski diagram) with those of my family and friends, as we bounce ideas and information off each other. Since the brain is a distributed, parallel system, the same thing is happening at a finer granularity within a single conscious brain. Perhaps it is no surprise that we all agree on the present moment that we each call Now; like the systematic bias of only ever seeing the light inside the refrigerator while the door is open, I only ever interact with things and beings that agree with me as to the moment we call Now. The cursor that we call Now is the solution to multiple simultaneous equations, akin to solving the simultaneous equations of "You are here". When considering waves crashing on a sandy beach, it is such points of simultaneous-equation resolution that make up the individual moments of Nowness. However, there is still the question as to why I count this particular moment along my worm-like path through space-time to be Now, as opposed to the one two minutes ago that I was then perfectly happy to call Now. Indeed, this is the only sense of the phrase, "flow of time," since all others would prompt the question, "flowing with respect to what?"

Curvature of space-time arrow of time

If the arrow of time is just a consequence of the curvature of space, this would avoid the need to postulate a period of inflation; unfortunately, though, this would only work if the curvature of space-time were negative (NS, 15-Oct-2005, p30). Interestingly, Susskind suggests that a universe that had to tunnel through the string-theory landscape (Tegmark's 2nd level) would also lead to it having a negative curvature (NS, 02-May-2009, p35). A local negative curvature could also have resulted from the large voids that have formed (NS, 15-Nov-2008, p32) due to clumping of the galaxies over the most recent 5 billion years (NS, 24-Nov-2007, p34), as a back-reaction to space telling matter how to move and matter telling space how to curve; indeed, we may live within such a void (NS, 18-Jun-2016, p28; NS, 08-Mar-2008, p32). Importantly, this might explain the results that we currently attribute to dark energy. One problem with ascertaining the actual curvature is that it becomes a chicken-and-egg problem with ascertaining the expansion (NS, 01-Aug-2009, p40). Moreover, there is also a possibility that mass could distort time and space differently (NS, 24-Oct-2009, p8).

The logic is that time-flow comes from gravity, and hence from increasing entanglement. One electron will bend space-time, ever so slightly, in its vicinity. A second electron, probably with the opposite spin, will also bend that already bent space-time. So, the effect is additive, and something with as many electrons and quarks as a planet will be perceived as a proportionately greater bending of space-time (or rather, the other way round, the proportionately greater bending will be perceived as the mass of the planet). However, Tegmark (2015) argues that gravity can play no part in wave-function collapse, since gravity can be taken out of the AdS/CFT equation (NS, 11-Feb-2017, p24).

When the position of a massive particle is in superposition, it must have implications for the curvature of space at those two locations (NS, 03-Jan-2015, p26), and exhibit enough non-linearity to invalidate the underlying assumptions for superposition. Penrose suggests (NS, 09-Mar-2002, p26) that it is neither a coincidence that it is when the effects of gravity start to become noticeable that the Standard Model starts to break down as a workable approximation of how the universe works, nor why gravity is the one force that resists being unified with the other three. It could even be because of gravity that quantum entanglement experiments are so difficult to perform on Earth (NS, 20-Jun-2015, p8), though there is the possibility that the ripples of passing gravitational waves affect macroscale-sized objects, but have little effect on subatomic-scale ones (NS, 21-Nov-2009, p12).

Cosmic expansion arrow of time

Dark energy might be explained by the creation of information through the breaking of symmetry due to wave-function collapse, as proposed by Sudarsky, Josset and Perez (NS, 23-Sep-2017, p8). Cosmic expansion provokes its own acceleration, as a result of the second law of thermodynamics and the increasing of entropy (NS, 15-Apr-2017, p8). Newtonian and Einsteinian gravity have already been successfully modelled this way, but only in anti-de-Sitter space, and so not yet in our universe, in which the vacuum is not quiescent; perhaps modified Newtonian dynamics could be used to address this.

Using causal set theory, Dowker (NS, 02-Feb-2019, p44) proposes that the flow of time is how cosmic expansion manifests itself, and is a measure of the new qubits that are being added (NS, 04-Oct-2003, p36). Similarly, Muller (2016) proposes that as space-time expands, it is the new coordinates of time that feel like Now, and give rise to the feeling of the flow of time. Ironically, it is his antihero, Eddington, who could provide a mechanism to support this view (random motion cancels out when summed over large enough distances with respect to left/right, up/down or clockwise/anti-clockwise, but not when the summation is in the receding/approaching direction, on sufficiently large cosmic scales).

Another proposal is to equate the flow of time to the increase in complexity, rather than to entropy, with the consequence of assuming a mirror-image period before the big bang (NS, 06-Mar-2021, p46).

One popular idea is that a black-hole might contain an independent bubble universe (NS, 09-Jan-2016, p8) along with its associated information. Moreover, it could be that the universe is filled with primordial black-holes, where the information is stored holographically in their event-horizons, but with those black-holes at varying densities because of Heisenberg's uncertainty principle; our habitable part of the universe would then be in one of the low density (and hence low entropy) regions, where radiation is able to permeate (NS, 28-Apr-2007, p28).

It appears that these alternative mechanisms, at least the ones discussed here, all point back to the second law of thermodynamics being involved.

Thermodynamics arrow of time

It is not reality that has a flow of time, but our approximate, lumped-parameter statistical knowledge of it that leads to the perception (NS, 19-Jan-2008, p26), along with our inability to process information in a quantum way (NS, 28-Jul-2018, p7). Measuring time ever more accurately can be bought only at the cost of an increase in entropy (NS, 01-Jan-2022, p46; NS, 15-May-2021, p15).

It seems that the flow of time derives from our sense of identity, that the person who started reading this sentence is related to, but definitely no longer the same as, the person who ate breakfast this morning: a mixture of the memory of that event, and of the changes since then. We measure the flow of time against our laying down of a permanent and ever-growing trail of past memories (NS, 06-Jul-2019, p32), and these are asymmetrically memories of past events, never of future ones. Just like the characters in a film, we cannot tell if the film is being run backwards or at a different speed in the projector.

Human brains lay down their chains of thought in memory, and these necessarily only progress, as a consequence of Landauer's principle, in the direction of the energy flow from the hot regions to the cooler ones. The apex of my light cone coincides with that of the light cone of my computer program, part way through its computation, or my internal combustion engine and the contents of its fuel tank, or that of a dragonfly in my garden. Our species has come to its present position due to many factors: these include having big brains, opposable thumbs, and bipedal walking, but also being a highly social animal. Not only does this give the advantage of specialisation within the society, and repositories of knowledge (temporally in libraries, and spatially in centres of learning) but also, it seems, this highly reinforced notion of Now.

Implicit and explicit time

Science seeks to capture truth in the form of time-invariant properties and laws. Underneath, time is always involved, implicitly, such as in the values of p in the Hamiltonian. Like the oscillating fields of a photon, the pendulum part of a clock is timeless, despite it being precisely the mechanism that imposes the interval of time on the whole clock, since, as Fourier analysis shows, a sinusoidal wave that continues the same for ever carries zero information.
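To make the Fourier point concrete, here is a small numerical illustration (mine, not from the original text): a pure, unending sinusoid concentrates all of its power into a single spectral line, so the Shannon entropy of its normalised power spectrum is essentially zero, whereas a noisy signal spreads its power across many bins.

```python
import numpy as np

def spectral_entropy_bits(signal):
    power = np.abs(np.fft.rfft(signal)) ** 2
    p = power / power.sum()          # treat the normalised spectrum as a probability distribution
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

t = np.arange(4096) / 4096.0
pure_tone = np.sin(2 * np.pi * 64 * t)                   # exactly 64 cycles in the window
noise = np.random.default_rng(0).normal(size=t.size)

print(spectral_entropy_bits(pure_tone))   # ~0 bits: one dominant line, no information
print(spectral_entropy_bits(noise))       # many bits: power spread across the whole spectrum
```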

Convergence in the block world leads to two contrasting views: that of the water molecules being channelled into a river on sub-second timescales, as opposed to that of the rocks that line the river valley on decade timescales.

By the Weak Anthropic Principle, life (even unintelligent life) is only possible in a universe that contains cyclic behaviour (regular or irregular, and of one sort or another). Persistence of a lifeform relies on using past experience to give a statistical bias to how it reacts when something similar holds again in the future. This further requires a capability for memory. Memory then implies the second law of thermodynamics, as per Landauer's Principle. It is this that breaks the symmetry of an otherwise information-free situation of eternally repeating cycles, and could be where the feeling of a moment called Now starts to emerge. Perhaps we have evolved to treat Now as a special concept because so much of the universe works with parameters like force and power being functions of the present state, with no memory of past or future states (NS, 01-May-2004, p34).

Causality and the laws of cooling

Both quantum mechanics and relativity agree that there is no such thing as simultaneous: the speed of light sets the speed of causality, and the non-commutative terms in Heisenberg's uncertainty principle set the partial ordering on cause and effect. They both also agree that an absolute zero temperature is unattainable, because assemblies of moving particles cannot travel on perfectly parallel paths, with certainty in their positions and velocities. The two results fit consistently with Heisenberg’s uncertainty principle: neither the Δt of near-simultaneous, nor the ΔE/k of the third law of thermodynamics can be zero.

Distinction between potential energy and kinetic energy

Intuitively, we think of the potential energy as the cause, and the kinetic energy as the effect, such as when we raise a pendulum away from its equilibrium point, but the two terms then alternately swap roles as the pendulum swing proceeds; similarly, with the electric and magnetic fields of a photon. In any case, we know that the universe makes no distinction between potential energy and kinetic energy, and that the distinction is purely down to the context of the human description, when deciding whether to classify a given term in with the Ts or Vs, in Lagrangian mechanics (Coopersmith, 2017). The arbitrary distinction is explained further when we consider a circular planetary orbit. As an example of simple harmonic motion, the planet's displacement and velocity measured along the x-axis are taken as one energy store, and those along the y-axis as the other, with the energy of the orbit periodically transferred between the two. This seems a somewhat degenerate way of defining energy stores, and of course relies on the subjective viewpoint of the observer as to what defines the x and y-axes.
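The arbitrariness of that division can be shown numerically; the following sketch (my own, with unit mass and radius assumed) tracks the kinetic energy of a circular orbit as measured along the x and y axes: each "store" oscillates, the total never changes, and rotating the axes changes which store holds how much.

```python
import numpy as np

# A small numerical illustration of the arbitrariness of the x/y energy stores
# for a circular orbit (unit mass, unit radius, one revolution per unit time).
m, R, omega = 1.0, 1.0, 2 * np.pi
t = np.linspace(0.0, 1.0, 1001)
vx, vy = -R * omega * np.sin(omega * t), R * omega * np.cos(omega * t)

ke_x, ke_y = 0.5 * m * vx**2, 0.5 * m * vy**2
print(ke_x.min(), ke_x.max())                               # each store swings between 0 and the full amount
print(np.allclose(ke_x + ke_y, 0.5 * m * (R * omega)**2))   # True: the total never changes

theta = 0.3                                                  # rotate the observer's axes by an arbitrary angle...
vx_rotated = vx * np.cos(theta) + vy * np.sin(theta)
print(0.5 * m * vx_rotated[0]**2, ke_x[0])                   # ...and the "x store" holds a different share
```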

Similarly, a flywheel can be run up to speed by a small electric motor, which then turns into a dynamo during a power outage, long enough to run a computer for its shutdown sequence. The flywheel system has the black-box behaviour of a battery, so is looked on as a potential energy source, despite its innards storing the energy as kinetic energy. Changing the example slightly, a switch could be made from the flywheel being driven by a fully-charged chemical battery, to charging a fully-discharged one. The second law of thermodynamics indicates that energy will flow from the energy-rich part of the overall system to the energy-poor part. In any given wire inside the motor/generator, the direction of the current will have reversed, despite the magnetic field and kinetic motion remaining in an unchanged direction. The motor/generator must switch from Fleming's left hand rule to right hand rule. One distinction between the two is the arrangement of cause and effect: in the left hand rule, the electric current and magnetic field are the cause, and the motion is the effect; while in the right hand rule, the motion and magnetic field are the cause, and the electric current is the effect. But, then, as the flywheel slows down, it reaches a point where the barely-turning flywheel has the same energy as the barely-charged battery. So, really, there is no switch between the left and right hand rules, but a point at which the magnitude of the current in the right hand rule goes negative, and its role as effect becomes cause. (Notably, though, the motion also changes, from cause to effect, but without changing polarity.)

Distinction between heat and work

The distinction between heat and work also appears to be man-made and subjective: it is the intentional discarding of the divergent collision information that distinguishes them. The heat dissipation in the load (the computer system connected to the flywheel and generator, for example) could be defined as work, if heating the air or a conveniently-placed pan of water above equilibrium temperature had been the original aim for buying the computer. The definitions of dE/dt=P and dE/dx=F illustrate the distinction: how short a time is required to extract heat from a heat bath, versus over how long a distance a force can be resisted (where the difference in wording between short and long is again a reflection of the change of perspective between which one is considered to be the cause, and which one the effect).

This all indicates the part being played by the human mind's heuristic of packaging groups of components into quasi-autonomous modules with outwardly black-box behaviour (what constitutes "a chair", for example). The human mind is finite, and somewhat limited. We can only handle complexity if we can modularise it, and handle it at distinct, autonomous levels. A river, or a star like the sun, can be a turbulent chaos within, but a snooker-ball-like, black-box, timeless simplicity outside. The amount of externally visible motion would have been dwarfed by the amount of internal motion, had it not been for the way that all that internal motion can be cancelled out and ignored in our calculation. The ratio of the two amounts could be considered to be some sort of Strouhal-like ratio: outwardly visible energy trend versus the amount of energy that is flapping around inside. For simple harmonic motion, the ratio of energy in one store versus that in the other follows a sin²(ωt)/sin²(ωt+π/2) curve, namely tan²(ωt). This merely tells us that the energy is periodically swinging from one store to the other. One of Carnot's great insights was that, by noting the parameters at successive returns to the same state round the cycle, all the periodic fluctuation of the energy distribution can be left out of the calculation, in a way somewhat akin to taking a carefully chosen moving average, allowing the designer to focus just on the underlying trend in the flow of energy to heat and work.
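The following sketch (mine, using an artificially damped oscillation rather than any real engine) illustrates Carnot's trick: the kinetic-energy store fluctuates wildly within each cycle, but sampling it once per period, at successive returns to the same phase, leaves only the underlying downward trend.

```python
import numpy as np

# Within each cycle the energy sloshes between stores; sampling once per period
# (comparing successive returns to the same state) exposes only the trend.
omega, gamma, dt = 2 * np.pi, 0.05, 0.001
t = np.arange(0.0, 10.0, dt)
x = np.exp(-gamma * t) * np.cos(omega * t)      # synthetic, slowly decaying oscillation
v = np.gradient(x, dt)
kinetic = 0.5 * v ** 2                          # fluctuates wildly within every cycle

period = int(round(2 * np.pi / omega / dt))     # samples per period
strobe = kinetic[::period]                      # one sample per cycle, always at the same phase
print(np.all(np.diff(strobe) < 0))              # True: a clean monotonic decay remains
```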

A machine requires there to be a hot bath (a region of abundant energy) and a cold bath (a region of dearth), and a channel through the barrier that separates them, with a mechanism analogous to a water-wheel placed in the channel. In an exothermic chemical reaction, the barrier is an energy threshold, and the channel might be a catalyst. The apparatus of a man-made machine, such as a cylinder and piston arrangement, can therefore be viewed, too, as a type of catalyst (NS, 24-May-2014, p30). The convergent constraints imposed can be quantified as information, measured by Shannon entropy, and are what distinguish work from energy (NS, 12-Aug-2017, p40). In this way, information, too, is what makes the difference between living and non-living matter (NS, 02-Feb-2019, p28), with feedback loops establishing themselves between the software domain of Shannon entropy and the hardware world of molecules and thermodynamics.

Distinction between information and entropy

Intuitively, we think of the information as the cause, and increased entropy as the effect. However, when the piston in a Szilard engine has been pushed to the far extremity of the box, the answer to one question, "Is the particle on the left or right of the piston?", can be given with 100% certainty, while a seemingly similar one, "Is the particle in the left or right hand end of the box?", is completely unknown, with a 50:50 chance of being answered correctly. This is the distinction between information (Shannon entropy) and statistical thermodynamic entropy. It is the human user who decides to care about the answer to the former question, and to be uninterested in the answer to the latter. It is, indeed, related to the way that work is that part of the energy exchange that is the subject of interest and to be tracked; all the rest is swept under the carpet as mere heat loss.
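A back-of-the-envelope sketch (my own, under the assumptions of the paragraph above) makes the contrast explicit: the piston-side question carries zero Shannon entropy once the piston has reached the end of its travel, the box-end question still carries one full bit, and erasing that bit costs at least kT·ln 2 by Landauer's principle.

```python
import numpy as np

k_B = 1.380649e-23                 # Boltzmann constant, J/K

def shannon_bits(probabilities):
    p = np.array([q for q in probabilities if q > 0.0])
    return float(-(p * np.log2(p)).sum())

print(shannon_bits([1.0, 0.0]))    # "which side of the piston?"  -> 0.0 bits
print(shannon_bits([0.5, 0.5]))    # "which end of the box?"      -> 1.0 bit

T = 300.0                          # assumed room temperature, K
print(k_B * T * np.log(2))         # Landauer bound for erasing one bit: ~2.9e-21 J
```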

One very productive thought experiment imagines a helium balloon travelling through space. The path of each of the atoms of helium can be described by giving its present position and velocity, and then dead-reckoning after that (putting aside Heisenberg for the present). As engineers, though, we do not have the time and resources to track 6×10^23 particles, and have found that it is sufficient to characterise them as having a single mean position and mean velocity, with a standard deviation about each of those means. It is an engineering, statistical technique for reducing the number of degrees of freedom, treating the whole balloon assembly as a single particle, at least as an approximation within a given error bound (NS, 21-Apr-2018, p44). So, it is we, for expedient, pragmatic reasons, who have chosen to sweep aside vast swathes of information, and to simplify the problem. Entropy is a measure of how much information is being ignored, and allows us to get a handle on estimating the consequences of relying on our simplified calculations. The mean position of any given helium atom can be given as that of the centre of the balloon, plus or minus a figure that is related to the radius of the balloon. The mean velocity of any given atom is that of the centre of the balloon (with respect to the launch point, for example), plus or minus a milling about of the atoms within, relative to the centre of the balloon. The mean velocity of the assembly of helium atoms is the bulk flow (the current), and gives an indication of how much work that body of helium might be able to do if it were to collide with the face of a piston or turbine blade. The standard deviation of the velocity of the assembly of atoms is perceived as temperature.
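A toy version of that bookkeeping, in Python (my own illustration, with made-up but plausible numbers rather than anything from the original text): generate a cloud of particle velocities, then replace them by a mean (the bulk drift that could do work on a piston) plus a spread (which, via equipartition, we read as temperature).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000                                   # stand-in for the ~6e23 atoms we cannot track
bulk = np.array([50.0, 0.0, 0.0])             # balloon drifting at 50 m/s along x (assumed)
thermal = rng.normal(0.0, 790.0, (n, 3))      # per-axis spread chosen to mimic room-temperature helium
v = bulk + thermal

mean_v = v.mean(axis=0)                       # recovers the drift, ~(50, 0, 0) m/s
sigma_v = v.std(axis=0)                       # the "milling about", ~790 m/s per axis

m_he = 6.6e-27                                # mass of a helium atom, kg
k_B = 1.380649e-23
T_est = m_he * (sigma_v ** 2).sum() / (3 * k_B)   # equipartition: (1/2)m<Δv²> = (3/2)kT
print(mean_v, sigma_v, T_est)                 # T_est comes out near 300 K
```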

We have chosen to measure temperature on a linear scale, thereby making a ratio of the internal energies equal to the ratio of the temperature differences. The ideal machine is one that is able to extract 100% of the energy from its input, bringing the particles of that input to a halt (zero kinetic energy) and hence to absolute zero temperature. This is what mesmerisingly almost happens on a snooker table or in a Newton's cradle, but rarely in a more divergent, dissipative system. It can be considered as Rankine's missed opportunity: rather than inventing a new temperature scale (or as well as inventing it), he could simply have added one more term to the equation for calculating the Carnot efficiency, noting that it measures the energy extraction actually achieved, proportional to (T_before − T_after), divided by the amount that we might have aspired to extracting, (T_before − T_abszero). Similarly, the inefficiency can be expressed as (T_after − T_abszero) divided by (T_before − T_abszero), and is Carnot's measure of the ratio of unutilised heat to (work plus heat), while ignoring all the internal energies inside the gases.
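That "missed opportunity" reduces to two one-line functions; the sketch below (mine, with arbitrary example temperatures) simply checks that, with temperatures measured from absolute zero, the achieved-over-aspired-to fraction is the familiar Carnot efficiency and the unutilised fraction is its complement.

```python
# Carnot efficiency written as "achieved over aspired-to" extraction.
def carnot_efficiency(t_before, t_after, t_abszero=0.0):
    return (t_before - t_after) / (t_before - t_abszero)

def unutilised_fraction(t_before, t_after, t_abszero=0.0):
    return (t_after - t_abszero) / (t_before - t_abszero)

t_hot, t_cold = 500.0, 300.0                  # example temperatures in kelvin
print(carnot_efficiency(t_hot, t_cold))       # 0.4, i.e. 1 - Tc/Th
print(unutilised_fraction(t_hot, t_cold))     # 0.6; the two always sum to 1
```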

This further emphasises that we do not measure absolute energies, but the difference in the energy of interest from the base case. The potential energy of a 1 kg lump of coal held 1 m from the floor is 9.81 J, unless the intention had been to let it fall to the ground rather than the floor of a second-floor physics laboratory, or down to sea level of one built on a cliff-top near the coast, or if we had intended the coal to be burned on its way down, or to undergo nuclear fusion.

Lastly, on the face of it, it is no more meaningful to ask what is the temperature of a single particle than it is to ask what is the standard deviation of a single data value; however, physicists do find it useful to shoehorn a meaning onto it. Starting from the ideal gas equation (P·V = n·R·T = ΣE), and setting n = 1/(6×10^23), thus giving k·T = E, we end up defining the temperature of a single particle to be E/k, and its entropy as being a constant, k. Even though the milling-round internal velocity, Δv, of the helium atoms of the gas is now considered to be the external velocity, v, of the alpha particle (say), it is still plugged into the gas equation as its internal velocity. Moreover, since its entropy remains constant, k, all of the particle interactions can be considered to be reversible. It is in this sense that the whole balloon of gas, colliding against a piston or turbine blade, can be considered to be a single Lagrangian particle, with a Strouhal-like ratio of nRT/(½mv²) (where n is the number of moles of atoms of gas, and m and v are the mass and velocity of the whole assembly).
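Plugging made-up but plausible numbers into that Strouhal-like ratio (an illustration of the order of magnitude only; the amount of gas, temperature and drift speed below are all my own assumptions):

```python
# Internal thermal bookkeeping nRT versus the bulk kinetic energy (1/2)mv² of
# the balloon as a whole, for one arbitrary but plausible set of figures.
R = 8.314            # gas constant, J/(mol K)
n_mol = 0.5          # moles of helium in the balloon (assumed)
T = 300.0            # kelvin (assumed)
m = n_mol * 4.0e-3   # mass of the gas, kg (helium is ~4 g/mol)
v = 5.0              # drift speed of the whole balloon, m/s (assumed)

ratio = (n_mol * R * T) / (0.5 * m * v ** 2)
print(ratio)         # ~5e4: the internal "flapping" dwarfs the outwardly visible energy
```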

The running down of usable energy is expressed in our thermodynamic equations as a measure that we quantify as entropy, and is what we notice as the difference between the past and the future (NS, 20-Nov-2021, p39). Oxygen atoms that have bound with two hydrogen atoms tend not to unbind; a pencil balanced on its point can fall (divergently) in any direction, but once it starts falling, will continue to fall in that same direction. Like a river in a valley, the meandering path of evolution is divergent at decision points, but convergently snaps on to the fittest solutions in its path across Wright's adaptive landscape (with genetic drift (NS, 26-Sep-2020, p46) being akin to a river on a level plateau). Similarly with the snapping on to structures, such as quarks settling as nucleons, which in turn settle as stable isotopes (notably after a violent event like a supernova, nova, kilonova, or a collapsar (NS, 24-Jul-2021, p46)), in turn as molecules, and in general, the self-perpetuating structure of so-called time-crystals (NS, 14-Aug-2021, p20), now shown to be possible using a continuous laser (NS, 18-Jun-2022, p12). The path through phase-space takes on the form of a non-intersecting phase-fluid.

Knife-edge boundary of chaotic behaviour

The second law of thermodynamics does not give us a magnitude for time. Instead, we get that from the maximum velocity of light, and from Heisenberg's uncertainty principle. Between the hot spots and the cold, separated by finite distances, the energy transfers attempt to proceed as fast as they can, but are constrained to take a finite time. At any given point along the flow, there is a pressure from upstream to flow out faster, but a resistance from downstream not to flow further that way. At any point, therefore, the energy flow will attempt to burst out into a parallel flow. (The mid-point is not as cold as the final destination, but is still colder than the hot-spot at the head.)

Since this aspect of the first law of thermodynamics resists the second, they form the basis of "frustration" and "funnelling" (NS, 09-Jun-2001, p32), and so the conditions for chaotic behaviour, and perhaps the inevitable emergence of life. All sorts of patterns of activity will set up, powered by the flow: stable and unstable ones. By definition, the unstable ones will die away, leaving the stable ones to persist. There is a bias towards the setting up of stable patterns in systems that have energy flowing through them (the standing waves of Miller and Tierra space). This is, perhaps, the basis of an implied extra, fourth, law of thermodynamics (NS, 05-Mar-2005, p34).

These midway points represent points of local minima, with respect to the hot-spot at the head. If they can leak their energy away, to the cold-spot, as fast as it is flowing in, they become just an extra part of the main flow. But, if they cannot leak the incoming energy away, or while energy is flowing into them faster than it is flowing out, they represent a store of potential energy, and become local hot-spots, themselves. We quantify this extra flow as "work", and say that work is done by the main energy flow on whatever it is that makes up the midway point. The first law of thermodynamics, thus, gives us a statement about the spontaneous emergence (and later dismantling) of structure.

Convergence leads to funnelling rules, while frustration is what prevents a system from returning to the position of least energy, and leads to the barrier that separates the modular objects. In a continuous system, this leads to a seeking of an equilibrium position, but in a discrete system, wherever there is a balance between opposing rules at work (NS, 09-Jun-2001, p32), pulling between building up or tearing down, there is the potential for rich and interesting chaotic behaviour that teeters on the boundary between deterministic and random, with the coexistence of symmetry and chaos (NS, 09-Jan-1993, p32). Conway's "Game of Life" is a fine example, as are Mandelbrot sets, the action of the S and K combinators or Hofstadter's wondrous numbers, the balance between electromagnetic force repulsion and strong nuclear force attraction within the atomic nucleus, the physical properties of water, the chemical properties of water, the dynamics of development within multi-cellular organisms, the chemical day-to-day running within single cells, human consciousness, the dynamics of human society (NS, 06-Jul-2013, p26), and the dynamics of species within ecosystems. Again, the process of evolution features heavily in this list, constantly tugged between competition and cooperation as a way of maximising any given gene's chances for propagation, along with each of the selfish genes tugging and pulling against each other, giving the external impression of a 'non-random eliminator' (NS, 07-Jan-2006, p36).

Quantum mechanics, too, is positioned at the fine boundary of self-organised criticality, between classical physical behaviour and even weirder interconnectedness (NS, 26-Feb-2011, p36); it appears that it is entanglement that keeps physics from being even weirder than it actually is (NS, 21-Aug-2010, p33). To have the values of each particle's matrix nudged to new values, at each collision, is indeed reminiscent of how the Logistic Difference Equation works (x_(n+1) = r·x_n·(1 − x_n)). Even this simple deterministic equation can go chaotic (at values of r=3.7, for example) with no implications of spookiness, and with an associated temperature compared to that of black-body radiation (NS, 02-Feb-1992, p29). Coin tosses are often claimed to be deterministic, given enough detailed observation (the distance of the toss, wind speed, temperature, symmetry of the muscle contraction, and so on), allowing the outcome of the toss to be calculated in advance of the coin landing. However, this could also be said of the logistic difference equation and Mandelbrot sets: the state of the next pixel can be predicted in advance simply by calculating what value it will take.
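A few lines of Python (my own illustration) make the logistic-map point concrete: the iteration is perfectly deterministic and trivially "predictable in advance by calculating it", yet two starting values differing only in the fourth decimal place soon bear no relation to one another.

```python
# The logistic difference equation at r = 3.7: deterministic, yet sensitive to
# its starting value, so two nearby orbits part company within a few dozen steps.
def logistic_orbit(x0, r=3.7, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_orbit(0.2000)
b = logistic_orbit(0.2001)          # perturbed in the fourth decimal place
for n in (0, 10, 25, 50):
    print(n, round(a[n], 6), round(b[n], 6))   # the orbits track at first, then diverge
```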

"Tiny changes in the input lead to huge divergences in the later characteristics of the system. At the same time, some chaotic systems will always converge towards a particular set of characteristics. This can be captured in a mathematical entity called a chaotic 'attractor', which maps their movement through all the possible states towards these almost inevitable outcomes. The attractor has a curious, often-ignored property, however. In any chaotic system, there are states and situations that are just off-limits. The attractor’s blank space defines which states are impossible to access. (...) Superdeterminism violates something known to philosophers of science as 'statistical independence'. This is the idea that tweaking the input to an experiment should not change anything in the equipment set-up to detect the output. The free will problem arises because, in the Bell test, the experimenter has to be able to set the experiment up however they want to. Superdeterminism says this is not possible because there are hidden constraints." (NS, 15-May-2021, p36)

Cosmic inflation leads to the notion that vacuum energy is a sort of latent heat of phase change of some scalar field in a bubble universe, with the big bang a phase change from random behaviour to a more clustered state (NS, 17-Mar-2018, p30). Similarly, consciousness, and each of the other examples of emergent behaviour, can be treated as a region of phase change (NS, 12-Apr-2014, p29; NS, 26-Nov-2011, p34), which appears related to how the holes in network topology make us smart (NS, 25-Mar-2017, p28) and the use of algebraic topology to investigate the 7-dimensional (or higher) sandcastles that build up and collapse down (NS, 30-Sep-2017, p28) in self-organised critical neural systems (NS, 27-Jun-2009, p34). It all seems to suggest a fractal nature.

Objective reality

One goal might be to show that the physical constants and laws of physics can all be derived from just a few simple principles, right down, perhaps, to the curvature of space-time, cosmic expansion and the maximum speed of causality, all derivable mathematically without reference to light, mass or energy (NS, 01-Nov-2008, p28). All of this makes the assumption that there is such an underlying reality to be modelled (NS, 02-Mar-2019, p7). Another possibility is that, even within a single frame of reference, it might all cut off at some less-than-fundamental level and require a holistic rather than reductionist view (NS, 29-Sep-2012, p34), perhaps floating on a sea of random fluctuations.

There are two potential major sources of systematic error in the standard model. One is that every question that we have ever asked of the universe is one that has been framed by a human mind, and, moreover, that that mind was never designed for this purpose, but just to detect indirectly-correlated representations of reality that turn out, on average, to be beneficial for survival (NS, 01-Feb-2020, p39; NS, 03-Aug-2019, p34). "None of us sees the world as it is. Our brains only process a fraction of the incoming sensory information (...). The gaps are filled in by the brain, which constantly makes predictions based on previous experience and then updates them in the light of new information" (NS, 30-Jan-2021, p40) thus accounting for subjects as diverse as optical illusions, multiple-drafts style working, scientific method, witness fallibility, fake news, religious cults, and psychological manipulation.

The other potential major source of systematic error is that it might be because we design our experiments to investigate the properties of quarks or of photons that we appear to get answers that look like the properties of quarks or photons (NS, 05-Oct-2019, p42), complete with details like quantum complementarity (NS, 24-Jul-2004, p30). Toffoli (1990) shows that special relativity and general relativity might be artefacts of our scientific thinking, and Peres (1990) shows that the axioms of quantum mechanics necessarily lead to the second law of thermodynamics, and to the Schrödinger equation being linear. The adage, "if the only tool you have in your tool-box is a hammer then every problem ends up resembling a nail," could suggest that we should be looking for some new tools.

All this does run up against the impossibility of our attempting to think otherwise. Any notions of what was "before the universe" or "outside the universe" (for example, in some hypothesised parallel universe) become self-contradictory. It ultimately relates to the question of why there is something, rather than nothing (NS, 20-Nov-2021, p37), and indeed what it would even mean for there to be nothing, since, in contemplating what "nothing" is like, we end up thinking of it as a thing. We do our thinking with words, or similar symbols, so give names to the things that we want to think about, such as ΑΩ for plus and minus eternity (NS, 08-Oct-2016, p52). Ultimately, though, these tools just are not up to this particular job, and we cannot expect definitive answers with our current vocabulary. Indeed, language is about charade-like improvisation, and a desire to be understood (NS, 26-Mar-2022, p38).

To the extent that the various theories and interpretations each capture some truth, they must blend together with each other at their common boundaries. The blending of Newtonian mechanics into special relativity via the Lorentz term, 1/√(1 − v²/c²), is a fine example. The superbly heuristics-based brain that we inherit from our stone-age ancestors finds it hard to acknowledge that the speed of light is just very large, not infinite, and that velocities are not additive (when throwing a projectile from a moving platform, for example, or colliding two moving platforms).
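The point about velocities not being additive is easy to make numerical; this sketch (mine) uses the standard relativistic composition of velocities, which reduces to ordinary addition only when both speeds are tiny compared with c.

```python
# Relativistic composition of parallel velocities: (u + v) / (1 + u*v/c²).
c = 2.998e8  # speed of light, m/s

def compose(u, v):
    return (u + v) / (1.0 + u * v / c ** 2)

print(compose(30.0, 40.0))         # ~70 m/s: everyday speeds effectively just add
print(compose(0.8 * c, 0.8 * c))   # ~0.976c, not 1.6c
print(compose(c, 0.5 * c))         # exactly c: the speed of light is the ceiling
```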

Of particular interest is the blended region between general relativity and quantum mechanics (NS, 17-Mar-2007, p36), for which a succinct expression has so far proved elusive (NS, 20-Apr-2013, p34). Individually, they each blend into the special case of the Newtonian approximation, but they also need to blend together when conditions are extreme enough for them to be simultaneously applicable (NS, 18-Sep-2021, p18). The two theories agree in many aspects, notably that there is no such thing as simultaneous (there is a latency between cause and effect, and a non-commutativity in their ordering). They also both agree with the third law of thermodynamics, that absolute zero temperature is unattainable. General relativity and quantum mechanics also agree on the applicability of the principle of least action.

 

© Malcolm Shute, Valley d'Aigues Research, 2007-2024