Considering the possibility that the second law of thermodynamics might not always have held allows for Maxwell's Demon-like behaviour to have been possible, and hence for new energy to have been created out of nothing, at an exponentially increasing rate. This, of course, is exactly what appears to have happened in the first fraction of a second after the Big Bang. So, it is tempting to investigate whether there could be a connection between the two ideas.
Unfortunately, this means that this exotic epoch must have occurred during a timeframe (5.39x10^-44 seconds) where the currently known laws of physics do not hold, and where there is no prospect of the Maxwell's Demon-like behaviour being manifested in solid machines or devices (perpetual, or otherwise). On the plus side, this is precisely the timeframe where a theory that unifies General Relativity and Quantum Mechanics is needed (NS, 24-Sep-2016, p28).
There is a problem, of course: if the presently known laws of physics cannot be used, what can? We do, at least, have one boundary condition as a guide: we know that whatever unknown laws of physics were applicable at the time, they must eventually collapse down to the known laws of physics in our epoch of the universe. Reassuringly, this further implies that the exotic regime was not perpetual at all, since the Maxwell's Demon-like behaviour would have been using up some finite resource while it was creating the extra energy. This would serve to confirm the assertion at the start of this web page, provided that the first and second laws of thermodynamics are generalised to include the finite resources of the initial exotic regime.
This table is explored, later, in the section on temperature and mass of the universe.
To fit in with what we observe, all the freshly generated matter/energy would have to have been generated in a highly ordered, low entropy state, with the balance between matter and energy at around 1:10^9, and hence with marginally more matter than antimatter (NS, 23-May-2015, p28; NS, 12-Apr-2008, p26), in the ratio 1000000001:1000000000. Paradoxically, this implies the occurrence of reactions (those that favour the production of slightly more matter than antimatter) which, in turn, implies a direction of increasing entropy.
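The bookkeeping behind that ratio can be sketched in a few lines. This is purely illustrative arithmetic using the numbers quoted above (the function name and photon count per annihilation are assumptions for the sketch, not a model of real baryogenesis):

```python
# Illustrative bookkeeping of the ~1:10^9 matter/antimatter asymmetry.
# Assumes each matter/antimatter pair annihilates into two photons.

def annihilate(matter, antimatter):
    """Annihilate all matter/antimatter pairs; each pair yields two photons."""
    pairs = min(matter, antimatter)
    return matter - pairs, antimatter - pairs, 2 * pairs

matter, antimatter, photons = annihilate(1_000_000_001, 1_000_000_000)
print(matter, antimatter, photons)  # 1 0 2000000000
```

A single surviving matter particle per two billion photons is the sort of residue the 1000000001:1000000000 ratio implies.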
There are several other very fine balances present in current theories. Several of these balances might collectively be explained by axions (NS, 14-Nov-2015, p36), whose hypothesised existence in the present epoch of the universe might be tested by searching for a particular type of Bose-Einstein Condensate star (NS, 12-Dec-2015, p11). Some astronomical objects, presently classified as black-holes, and in the future analysable from their emissions (NS, 06-Oct-2007, p36), might turn out to be BEC stars (NS, 15-Jul-2017, p28). Alternatively, for the marginal imbalance of matter and antimatter, there are many experiments attempting to observe instances of neutrinoless double-beta decay (NS, 13-Feb-2016, p30), and the possibility of CP-violation having been observed in baryon reactions (NS, 11-Feb-2017, p14) with implications as to why time has a forward-flowing bias (NS, 22-Nov-2008, p32). Meanwhile, if particles with negative mass had been possible, they would have represented negative energy, allowing the overall sum for the entire universe to remain at zero. For the force of gravity, though, like charges attract and unlike charges repel. So, this might explain why we find no examples of particles with negative mass: they were all repelled over the de Sitter horizon, to another part of the universe (and likewise us, from their perspective).
The initial, low-entropy starting point for the universe is, at first, a puzzle (NS, 08-Oct-2011, p39; NS, 15-Oct-2005, p30). When the cosmic microwave background (CMB) radiation came into being, the universe was extremely uniform, which represents a low-entropy state since the now dominant force (gravity) works towards clumping matter. There must have been a point at which the universe switched from another force being dominant to gravity being dominant (NS, 12-Nov-2005, p27). By analogy, players wanting to play game after game of cards, but not wanting to put in the effort to shuffle the deck each time, could decide instead to alternate the type of game each time; what would have been an ordered arrangement to leave the deck in, for one game, is more of a random arrangement in the other. Likewise, the various switches in the early universe are changes in the rules of the game.
Up to 10^-36 s after the Big Bang, the strong, weak and electromagnetic forces were indistinguishable (NS, 25-Nov-2017, p9), but the strong nuclear force then became distinct. At temperatures still over 100 GeV, which persisted until some 10^-12 s later, the electromagnetic and the weak nuclear interaction were still unified, and the charged fermions massless. At lower temperatures, though, a distinct property of the Higgs field became apparent. Paraphrasing an answer that was found on Quora: Normally, a field has the lowest energy when it is free of excitations; but with the Higgs field, when it is in its lowest energy state, some excitations are still present, leading to the so-called Mexican hat shaped curve (NS, 14-Nov-2015, p36). So even when starting from a vacuum with no excitations, it very quickly decays into this lower energy state, with some excitations present, and characterised by the vacuum expectation value (with echoes of why there is inevitably something, rather than nothing). The universe is balanced in a state of unstable equilibrium between false vacuum and true vacuum (NS, 29-Oct-2016, p32). As a result, the symmetry is broken, and electromagnetism and the weak interaction become distinct: the particles mediating the weak interaction become very massive (making the interaction very short range, which we perceive as being weak), and charged fermions now interact with the vacuum expectation value, which we perceive as mass.
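The Mexican hat shape can be seen in the simplest quartic potential of this kind. The parameter values below are illustrative placeholders, not physical Higgs parameters; the point is only that the minimum sits at a non-zero field value:

```python
# Minimal sketch of a Mexican-hat potential V(phi) = -mu2*phi^2 + lam*phi^4.
# mu2 and lam are illustrative values, not physical Higgs parameters.

def V(phi, mu2=1.0, lam=0.25):
    return -mu2 * phi**2 + lam * phi**4

# Setting dV/dphi = 0 gives the minimum at phi = sqrt(mu2 / (2*lam)),
# i.e. at a NON-zero field value (the vacuum expectation value):
vev = (1.0 / (2 * 0.25)) ** 0.5  # sqrt(2) for these illustrative values
print(V(0.0) > V(vev))  # True: phi = 0 (the symmetric point) is higher energy
```

A field sitting at phi = 0 is in unstable equilibrium on the crown of the hat, and rolls down into the brim: the symmetric state decays into a lower-energy state with a non-zero expectation value, just as described above.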
The symmetry-breaking event that turned on the Higgs field could also explain Inflation, if the Mexican Hat curve has outlying alps (NS, 10-Jun-2017, p30). In any case, though, the Higgs field is another example of the early fine balances of the universe, in the way that particle masses, and in particular that of the Higgs boson, have almost cancelled themselves out, by a factor of 125:10^19, in the so-called hierarchy problem (NS, 19-Jul-2008, p36).
That the amount of normal matter seems to be dwarfed by the amount of dark matter, in a ratio of 1:4, is another fine balance; the dark matter is needed to explain the orbital motions of galaxies, and their maintenance of a spiral structure for a significant number of cycles.
Modified Newtonian Dynamics (MOND) was originally proposed as an alternative explanation, without needing to propose the existence of dark matter. A mechanism that switches from a Bose-Einstein Condensate state to an ordinary dark-matter state, depending on the strength of the gravity field, would allow a switching between MOND and inverse square law behaviour (NS, 02-Apr-2016, p30). It is further suggested that dark energy, too, and also gravity itself, could be emergent behaviour of entanglement (NS, 18-Mar-2017, p28). It is suggested (NS, 13-Oct-2012, p32) that gravity might simply be a consequence of the second law of thermodynamics: gravitational attraction of two bodies is a consequence of their entropy needing to increase.
Gravity might just be a consequence of interactions between entangled bits of quantum information. Newtonian and Einsteinian gravity have already been successfully modelled this way, but in anti-de Sitter space, and not yet in our universe, in which the vacuum is not quiescent. However, it might still emerge if the entanglement in the particles generated from the vacuum energy can be modelled as a type of elasticity (one that deforms like Einsteinian gravity at short ranges, and extends like MOND at longer ranges).
Several groups suggest that the speed of light is not a constant, but has only settled asymptotically at the value that we observe today. At the intense energy densities of the first split second after the Big Bang, the speed of light could have been much higher (NS, 26-Nov-2016, p8), with a specific testable prediction about a measure called the spectral index for the CMB, which should be 0.96478.
The split between a quantum and a classical macroscopic universe could be thought of as a 'quantum death' of the universe (NS, 29-Mar-2014, p32), prior to which there was a lack of the speed-of-light limit on information transfer via entanglement.
The switch from exotic regime to normal regime might happen in response to the creation of so much new material and energy (from the proposed Maxwell's Demon mechanism) that the universe bloats out (what we presently attribute to inflation or expansion) and is pushed into a new regime of operation: no longer sub-Planck scale, so no longer able to support faster-than-light transmission or Maxwell's Demon behaviour.
Perhaps, even today, at sub-planck scales, faster-than-light communications between non-quantum particles still takes place (NS, 29-Jun-2002, p30). If only a minority of the particles generated in the Big Bang subsequently became quantum in nature, dark matter might be made up of the remaining particles; being non-quantum, and still capable of superluminal communication, these particles might be expected only weakly to interact with the quantum particles of ordinary matter.
There are still many theories as to what dark energy is (NS, 17-Feb-2007, p28). Another fine balance is the cosmological constant almost cancelling itself out, but not quite: it is lower than that predicted by quantum mechanics by a factor of 1:10^120. (Interestingly, this means that this "anti-gravity" is weaker than expected, just as gravity is also weaker than the other three fundamental forces; some have suggested that this might be due to its acting in more than the usual 3+1 dimensions.)
Inflation was originally proposed as a solution to why the universe is now so uniform and flat (NS, 03-Mar-2007, p33; NS, 19-Oct-1993, p30) though it is not universally accepted (NS, 07-Jan-2008, p30). It is believed to have started at 10^-35 s, when the universe had a size of 10^-27 m, and ended at 10^-32 s, when it had a size of 10^3 m. (According to this, vacuum energy can be considered as a sort of latent heat of phase change of some scalar field in a bubble universe.)
Even now, expansion causes energy to be created, and also its own acceleration, as a result of the second law of thermodynamics and the increasing of entropy (NS, 15-Apr-2017, p8) though there are some who question the inevitable increase (NS, 11-Apr-2009, p6). Inflation and expansion occur in regions where gravity is weak; consequently, a positive feedback loop causes expansion to accelerate exponentially as soon as it starts to weaken the gravity fields that were already present. Conversely, regions of space in galaxies, solar systems and planet surfaces experience no expansion.
If expansion had acted more uniformly, and been randomly distributed, there would have been a sort of continuous out-going wind of space-time coordinates, as dark energy causes extra ones to be inserted. Atoms, molecules, solar systems, galaxies and local groups would be able to resist this wind, since much stronger forces (electromagnetic, strong nuclear, weak nuclear and gravity) would be present. If a new voxel of space-time happened to pop up in our galaxy (between the orbit of Mars and the Sun, for example, or between an electron and a proton in a hydrogen atom), it would be manifest as an injection of extra energy. All the planets we can see are in a stable configuration, with stable orbits. If any space were to be inserted into any of these, their first reaction would be to collapse back down to their stable configuration. Thus, the expansion of space would be experienced as a minuscule increase in the amount of energy (albeit, so minuscule that it would be an undetectably slight increase in the probability of the emission of an extra photon). Expansion and inflation would have manifested themselves as the injection of new energy into the system, and over a certain spread of distance, it would appear as a force. Of the four forces, gravity is the longest ranging, but still becomes very weak at a distance: so unrelated clusters of galaxies are pushed apart by the expansion. This would further suggest that there is a point at which G·m1·m2/r^2 balances with the force of the coordinate wind.
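The balance point can be roughed out numerically. Treating the outward pull of the coordinate wind as an acceleration of H^2·r is an assumption made purely for this sketch (the true expression depends on the deceleration parameter), and the galaxy mass is an illustrative round number:

```python
# Rough sketch of where gravity balances the "coordinate wind".
# ASSUMPTION: outward acceleration ~ H^2 * r; M_galaxy is illustrative.

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
H = 2.25e-18       # Hubble rate expressed per metre, s^-1
M_galaxy = 2e42    # kg, round-number mass for a large galaxy plus halo

# Solve G*M/r^2 = H^2 * r  =>  r = (G*M / H^2)^(1/3)
r_balance = (G * M_galaxy / H**2) ** (1 / 3)
print(f"{r_balance / 9.46e15:.2e} light years")  # a few million light years
```

The answer comes out at the scale of a local group of galaxies, which fits the statement above that bound structures resist the wind while unrelated clusters are pushed apart.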
Consistent with expansion (or inflation) leading to energy creation, Noether's theorem indicates that a lack of symmetry in time implies we cannot assume a conservation of energy.
The two great pillars of modern physics, general relativity and the standard model of quantum mechanics continue to be used in isolation, with unprecedented success; but, when attempting to understand black-holes and the properties of the early universe, the two theories need to be combined, and so far, they have resisted all attempts at doing this. One question is where (and how) the switch-over occurs (NS, 17-Mar-2007, p36), and so probing the boundary region between relativistic and quantum behaviour is a fertile area for research (NS, 20-Apr-2013, p34).
There is a gradual blending of Newtonian mechanics into special relativity, illustrated, for example, by the expression for calculating the approach velocity of two bodies, travelling towards each other at v1 and v2: (v1+v2)/(1+v1·v2/c^2). Something similar is required for the blending of Newtonian mechanics into quantum mechanics.
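The blending is easy to see by evaluating that expression (here with c = 1): at everyday speeds it reduces to the Newtonian sum, while at high speeds it never exceeds c.

```python
# The relativistic velocity-addition rule quoted above, in units where c = 1.

def approach_velocity(v1, v2):
    """Combined approach speed of two bodies moving towards each other."""
    return (v1 + v2) / (1 + v1 * v2)

print(approach_velocity(0.5, 0.5))      # 0.8, not the Newtonian 1.0
print(approach_velocity(0.001, 0.001))  # ~0.002: blends into Newton at low speed
print(approach_velocity(1.0, 1.0))      # 1.0: the limit is never exceeded
```
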
Algorithmic information theory (NS, 11-Nov-2017, p28), and its view of a participatory universe of interactive collaboration, might become a sort of quantum-relativity viewpoint, similar to the way that special relativity describes ruler- and stopwatch-wielding twins who are passing each other at close to the speed of light: both Wigner and his friend can be right, in their respective contexts. This suggests that there is no objective reality out there for us to observe (hence the problems we are encountering with quantum superposition and entanglement). Instead, what we think of as objective reality is just an emergent behaviour, and one that is so much more probable than the alternatives that we all converge on agreeing about it. By analogy, the molecules in a gas can, in theory, take up any configuration that they like, but in practice tend to converge on a well distributed arrangement that can be summarised by its pressure, volume and temperature.
As in Quantum Bayesianism (QBism), the wave-function is merely a summary, constructed by the human observer, of all the observer's knowledge of the system, and hence it is just in the observer's mind and not a property of the quantum particle itself (NS, 10-May-2014, p32). Algorithmic information theory carries this further, and shows how objective reality could be an emergent property of the mathematics that tends to converge on the simplest laws of physics. Questioning the existence of objective reality as a fundamental property impacts Bell's inequality, as does whether the numbers in his comparison are non-commutative (NS, 01-Nov-2007, p36), or should be handled as octonion and quaternion numbers (NS, 09-Nov-2002, p30).
Omnes notes that if the universe is to be treated as an information system, we must first establish its basis in logic. Heisenberg's uncertainty principle then drops out from Gödel's incompleteness theorem, or some equivalent (NS, 14-Aug-2010, p34): the logic that leads to a measurement of the momentum of a particle, contains no statements to describe its position, so any statements about its position can neither be proved nor disproved. From a slightly paraphrased explanation found on Quora: "At the quantum level, physics is not characterized by numbers, but rather, by non-commuting quantities, which Dirac called q-numbers. Not all of them can be simultaneously number-valued. When you observe, say, the momentum of an electron (you set up an experiment in which the electron's momentum interacts with a classical apparatus, forcing the momentum to be in a so-called eigenstate, and hence number-valued) its position cannot be number-valued; this electron at this time has no classical position. You cannot measure what does not exist." As noted earlier, Heisenberg's uncertainty principle follows as a consequence of information theory and the second law of thermodynamics (NS, 13-Oct-2012, p32; NS, 23-Jun-2012, p8).
Similarly, fractals might also be used to explain how the opposing views of quantum mechanics and relativity might both be correct (NS, 28-Mar-2009, p37) and how some Gödel-like questions about the universe (such as, "what if the experiment had measured the momentum of the particle instead of its position") might have no answer because they do not lie on the same fractal coastline of some sort of scale-relativity universe (NS, 10-Mar-2007, p30).
M-theory is an attempt at a unification of many of the alternative variations of string theory (NS, 19-Apr-2014, p47; NS, 28-Sep-2013, p34), perhaps complete with the super-symmetry known as SUSY (NS, 14-Nov-2009, p36), but with all of the contributors notoriously unfalsifiable (NS, 14-Jul-2007, p30). Bars proposes that adding an extra space and an extra time dimension to this, but constrained by the gauge symmetries that give Heisenberg's uncertainty principle, leads to holographic principles that explain the connection between electron orbits round an atom and the expansion of the universe, and between quantum chromodynamics and the lack of evidence of anyons (NS, 13-Oct-2007, p36).
With string theory, the extra dimensions are assumed to be stunted, or curled up, into less than a Planck length. With Braneworld, they are assumed to be fully-fledged dimensions, of which our 3+1 dimensions form just a membrane (NS, 29-Sep-2001, p26).
Perhaps, gravity is the weakest of the four fundamental forces because it is the one that 'leaks' most readily into the extra dimensions (NS, 14-Mar-2009, p38).
There are many completely different types of parallel worlds theory, multiverse theory, and many worlds theory (NS, 21-Jan-2017, p28). Most versions would have implications for free will (NS, 27-Sep-2014, p32). Smolin argues, though, that they are just devices for handling our lack of knowledge about the universe (NS, 17-Jan-2015, p24).
Tegmark presents a four-level classification (NS, 26-Nov-2011, p42).
Wiseman proposes a "Many Interacting Worlds" model (NS, 08-Nov-2014, p6), in which the behaviour of the quantum mechanical system is the blurred average behaviour from several universes (of the order of 41) that interact fairly strongly with each other.
Rovelli and Smolin proposed Loop Quantum Gravity (LQG) based on spin-networks (NS, 22-Jan-2005, p33). The subatomic particles could be caused by vibrations in the granules of space-time at various modes, just as in string theory.
Experimental tests are proposed that might determine whether space-time is quantised, and at what granularity (NS, 07-Mar-2015, p12; NS, 15-Aug-2009, p26). Others are proposed, to look for astronomical evidence of black-holes that collapse to the quantum loop size, and then rebound as a white hole at a characteristic frequency (NS, 02-Jan-2016, p32).
In effect, LQG replaces the notion of a top-down, external, all-encompassing framework of space-time coordinates with a bottom-up, nearest-neighbour, local interface between atomic granules of space-time. This implies that it is working something like a cellular automaton (NS, 21-Jun-2003, p32; NS, 06-Jul-2002, p46), with nearest-neighbour communications, working on simple, local rules whose amassed behaviour (summed over huge assemblages) would approximate to our familiar laws of physics. For example, a photon entering one granule on one side and exiting on another might be described by some operation, newcell=photonpassage(oldcell), and so the beam of light, and our Euclidean geometry, would end up as some averaged value obtained by integrating over many instances of this operation.
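This nearest-neighbour picture can be sketched as a toy one-dimensional cellular automaton. The rule and the names below are hypothetical illustrations of the idea, not anything from LQG itself:

```python
# Toy sketch of the nearest-neighbour picture: a 1-D row of space-time
# "granules", with a photon handed from cell to cell by a purely local rule.
# The rule and names are illustrative, not taken from LQG.

def photon_passage(cells):
    """One tick: each cell takes whatever its left neighbour held."""
    return [cells[i - 1] for i in range(len(cells))]

row = [1, 0, 0, 0, 0]          # a photon in the leftmost granule
for _ in range(3):
    row = photon_passage(row)
print(row)  # [0, 0, 0, 1, 0]: straight-line propagation from local rules only
```

No cell knows about any global coordinate frame, yet the amassed behaviour is a photon travelling in a straight line at one granule per tick, which is the kind of emergent geometry the paragraph above describes.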
There are indications (NS, 11-Mar-2017, p28) that LQG and string theory could be compatible, not least at the two-dimensional boundary of a holographic projection. It could be that the string length is somewhat bigger than the granule size, with string theory being supported on a fine mesh of these granules, thereby explaining why string theory sees space-time coordinates as a fixed background (Smolin 2001).
Along with Causal Dynamical Triangulation (CDT), in which the granules of space-time are tetrahedrally, nearest-neighbour connected in an (n-1)-dimensional topology (NS, 14-Jun-2014, p34), Quantum Einstein Gravity, Quantum Graphity and Internal Relativity (NS, 03-May-2008, p28) allow a connectivity that can vary between infinite (thereby doing away with the need for Inflation as an explanation), to 3+1 (thereby creating normal space-time) but reducing to 2 on small scales (perhaps hinting at a holographic principle).
Markopoulou suggests that all of the fundamental particles might consist simply of qubit-like braids of space-time (NS, 12-Aug-2006, p28), which might then explain why the universe appears quantised (NS, 10-Nov-2001, p40), and hence the significance of knot-invariant properties (NS, 18-Oct-2008, p32). Entanglement, too, might be explained by topological properties of the fundamental particles (NS, 08-Jan-2011, p10). There is even a suggestion that the long-sought proof of the Riemann hypothesis, in mathematics, with the zeros of the zeta function for prime numbers all lying on the vertical line 0.5+n.i (NS, 22-Mar-2008, p40), might be found first in the states of a suitably chosen quantum system, such as an atom or molecule (NS, 11-Nov-2000, p32), with a possible connection, too, with the Schanuel conjecture (NS, 21-Jul-2007, p38). Random-matrix theory might be used as a tool (NS, 10-Apr-2010, p28), as might the same sort of deep-learning program that was applied to playing Go (NS, 28-Oct-2017, p36; NS, 18-Feb-2017, p12).
Amplituhedra can be used as a sort of multidimensional version of Feynman diagrams (NS, 29-Jul-2017, p28), giving the results of calculations in quantum chromodynamics as the multidimensional polyhedron's volume. Not only does the method generate tractable and correct results, but it suggests that 'locality' is an emergent feature. Unfortunately, at present, the tool only works for super-symmetric quantum mechanics.
Both relativity and quantum uncertainty agree that there is no such thing as simultaneous. However, they tend to split into two camps: eternalism (all space-time exists, as a static block) versus presentism (NS, 03-Jun-2017, p44).
Talking about time, though, runs up against the impossibility of our attempting to think what anything might be, outside of time or space (since, following Descartes, to think involves both time and material matter). Ultimately, it becomes tied up with the question of why there is something, rather than nothing, and indeed what it would even mean for there to be nothing (NS, 08-Oct-2016, p52). One encouraging possibility, though, is that paintings and sculptures are human artefacts (and therefore manifestations of thoughts and story-telling) that are static structures, outside of time (as opposed to music, literature, dance).
One possibility as to why the universe started off in such a low entropy state, for example, is that, to us, looking from inside the universe, the moment of low entropy state simply appears to be the start of the universe, and of local time (even if it wasn't) as a direct consequence of our own definition. Just like the characters in a movie film, we cannot tell if the film is being run backwards in the projector. Moreover, the actors can rehearse, and even do the takes of the scenes, in any arbitrary order, stitching the characters' chains of thought together mentally, in their minds, and physically, in the cutting room.
Arguing in the static can be done using techniques such as symplectic integration (NS, 19-Mar-1994), or for-all array operators (as in APL, or in cluster states).
It is suggested that time must be relative, rather than absolute (NS, 08-Oct-2011, p37). Indeed, there is a question as to whether time or space, or both, are derived, or emergent properties of something more fundamental, such as is offered by LQG (NS, 15-Jun-2013, p34), and as to why there are three dimensions for space (NS, 28-Sep-2013, p34).
Any theory that attempts to explain the manner of the arrow of time 'arising', entropy 'growing', and the 'period' 'when' all this did so (NS, 16-Jan-2016, p8) is, on the face of it, using self-contradictory time-dependent terminology. It implies some hypothetical absolute time, outside our universe, that is distinct from, albeit parent to, the local time that we experience within our universe. All such concepts therefore make some direct or indirect reference to dt_local/dt_abs.
Smolin points out (NS, 23-Sep-2006, p30) that, with relativity, by abstracting out time as just being another dimension, all the physical laws become constant, invariant, and outside of time (NS, 20-Apr-2013, p30), which is strange for a universe that has existed for only a finite time (NS, 22-Nov-2008, p32). Moreover, such a view leaves us with no way of explaining why we have the concept of 'now', and indeed of past, present and future. Rovelli argues for the emergence to have occurred via thermal-time, in which it is not reality that has a time flow, but our approximate, lumped-parameter statistical knowledge of it that has the effect of a time flow (NS, 19-Jan-2008, p26).
Entropy puts a direction on the transfer of energy, and Fourier's, Newton's and Stefan's laws of cooling start to put numbers on the throughput. Meanwhile, the velocity of light limitation of special relativity puts numbers on the latency of that energy transfer.
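Newton's law of cooling is the simplest of those throughput laws: dT/dt = -k(T - T_env), whose solution decays exponentially towards the surroundings. The starting temperature and rate constant below are illustrative values:

```python
# Newton's law of cooling, one of the "numbers on the throughput" above:
# dT/dt = -k (T - T_env), solved analytically. Parameter values illustrative.

import math

def temperature(t, T0=90.0, T_env=20.0, k=0.1):
    """Temperature at time t of a body cooling towards T_env."""
    return T_env + (T0 - T_env) * math.exp(-k * t)

print(temperature(0.0))   # 90.0: the starting temperature
print(temperature(10.0))  # ~45.8: energy has flowed one way, hot to cold
```

The one-way, monotonic approach to T_env is the entropy-given direction; the value of k is what puts a number on the throughput.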
According to the "electromagnetic arrow of time" view, it is the speed of light that is nature's way of making sure that things do not all happen at once (NS, 04-Mar-2017, p28). With a movie film of a man passively sitting on a swinging swing, there is the finite speed at which cause is followed by effect, and at which various waves travel (as for example, when visual clues come before the audio ones).
As a result of Laplace's aberration, stars orbiting round the centres of their galaxies, and galaxies orbiting round their local groups, do not have the same view of gravity as spacecraft going up to service the ISS.
Even the Newtonian laws of motion encounter problems when taking causality into account. When a decision is made to throw a ball, or to pull a car away from a set of traffic lights, position, velocity and acceleration all suddenly change from zero to some positive value, as do all the higher derivatives, with a danger that some of them might suddenly incur a step function, with an infinite slope for the next derivative up. Normally, the universe avoids this by allowing all the changes to be blurred, allowing a continuous ramping up of each of them, with no step changes. However, because cause must always precede effect, none of the blurring can occur before t=0. Interplanetary transport networks (NS, 25-Mar-2006), but applied to all fields, not just gravity, might offer one way into this, inasmuch as they allow situations where all the derivatives of each of the alternative paths can become equal (indeed, zero) at the saddle points.
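There is a standard mathematical function that does exactly this kind of blurring: zero for all t <= 0 (so no effect precedes the cause), yet infinitely smooth at t = 0, so that no derivative of any order ever takes a step. A small sketch:

```python
# The classic smooth ramp: f(t) = exp(-1/t) for t > 0, and 0 for t <= 0.
# Every derivative of every order is continuous (and zero) at t = 0.

import math

def smooth_ramp(t):
    return math.exp(-1.0 / t) if t > 0 else 0.0

print(smooth_ramp(-1.0), smooth_ramp(0.0))   # 0.0 0.0: nothing before the cause
print(smooth_ramp(0.01) < smooth_ramp(1.0))  # True: a gentle rise after t = 0
```

This is the sense in which the universe can ramp up position, velocity, acceleration and all higher derivatives continuously from zero, without any of the blurring leaking back before t=0.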
Under the "thermodynamics arrow of time" view, it is the increase in entropy that gives our real sense of the direction for the flow of time. However, living cells contrive that, in our daily experience, entropy is constantly being reduced.
The one-way flow of time seems to be connected to our sense of something running out, or being used up. Ultimately, it is our ageing bodies (telomere length, or other measures of the biological machinery wearing out) that we notice, but even before then there are many local events, like summer days with increasingly shorter daylight hours, or fuel tanks running low, or TV game shows whose rules have the seeds of their own destruction built in. In any case, we cannot measure entropy directly, but rather some more material quantity that bears a roughly linear relationship to the entropy. One example is a burning candle, marked off in hours, which literally measures the quantity of hydrocarbon molecules not yet turned to carbon dioxide and water molecules, but indirectly measures the amount of order not yet converted to disorder. Moreover, the monotonic function can be of either type, increasing or decreasing, as in the water clock whose read-out can either be on the top reservoir (how much water is left) or on the bottom one (how much space is left, with hints of water-molecule holes, similar to their semiconductor equivalents, that somehow transfer from the bottom reservoir to the top one).
If the man on the swing spills a box of matches, for example, each match is completely unchanged, and could have its trajectory reversed. Similarly for gas molecules in a poorly confined volume.
The eternalism (block universe) view is tantamount to considering time to be just another spatial dimension, w, and hence as one of four dimensions (w,x,y,z), like a reel of celluloid movie film that has been cut up into its individual frames, with all the frames piled up in sequence. The frames represent just two spatial dimensions, x and z, with the position in the pile, w, used to represent time (with the third spatial dimension, y, implied by the use of perspective within the frame). We can contemplate how the positions and shapes of objects change in this pile of cine frames. We could trace the movement of a sugar cube as it first enters into the scene in a sugar bowl, then is lifted and dropped into a cup of tea. Ultimately, though, the beginnings and ends of any object only make sense if we are tracing the paths of the constituent fundamental particles (which are not only atomic, but also dimensionless). Macro objects, such as sugar cubes, are then just akin to shoals of fish or murmurations of birds.
There are constraints on the shapes that can exist in the static space of (w,x,y,z). Stable patterns within (x,y,z) tend to extend further in w than unstable ones. When computing the entropy within any closed region in (x,y,z), it will be a function that is monotonically dependent on its extent in w. For instance, molecules of gas confined to one of two adjacent chambers on the x-axis, when the partition is suddenly removed, must occupy greater ranges of x values for increasing values of w. However, it is not a linear relationship, and instead must reach the x values asymptotically. Ultimately, all particles in the universe have asymptotic values back towards the Big Bang, forward to the current value of w, forward to the end of the universe, and back again towards the current value of w. This gives two distinct periods: the past from the Big Bang to the current asymptotic value, and the future from the current asymptotic value to the end of the universe, and hence a concept of 'now' at the interface.
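The two-chamber example can be illustrated with a toy simulation, treating the step count as the w-axis. The particle count, step size and box size are arbitrary choices for the sketch:

```python
# Toy illustration of the two-chamber example: particles start confined to
# x < 0.5, then random-walk in the box [0, 1] once the partition is removed.
# The occupied x-range grows with each step w, but is bounded by the box,
# so it can only approach the full width asymptotically.

import random

random.seed(0)
xs = [random.uniform(0.0, 0.5) for _ in range(200)]  # left chamber only

spreads = []
for w in range(200):  # w plays the role of the time-like axis of the pile
    xs = [min(1.0, max(0.0, x + random.gauss(0, 0.02))) for x in xs]
    spreads.append(max(xs) - min(xs))

print(spreads[0] < spreads[-1] <= 1.0)  # range grows, bounded by the box
```

The spread of x values increases with w but can never exceed the box width, matching the asymptotic approach described above.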
A human would appear as a sort of "worm" in the block universe, with the baby at one end, and the corpse at the other (NS, 02-Nov-2013, p34), whose thoughts would be manifest as positional relationships in the static structure (NS, 22-Nov-2008, p32), in a way analogous to those of a painting.
Perhaps an extra dimension could be involved, perhaps a stunted one that only has two possible states: 0 or 1 (past or future); or, perhaps this Boolean information could be encoded in the states of the six curled up extra dimensions that string theory hypothesises (thus once again making the model a static one in ten dimensions, but still dynamic along the time axis when viewed from our perspective from within). The concept of 'now' could be simply the contents of our three spatial dimensions (complete with their measure of entropy) at their point along the time dimension where these extra dimensions undergo the sort of phase change between past and future that Ellis' proposal seems to imply.
Perhaps the past and future are different phases of the universe, 'now' being a wave-front of phase-change traversing the spatial dimensions of the universe, with us, by definition, living out our experiences entirely on this wave-front. Even in digital electronic computing, the program counter of the conventional processor captures the concept of 'now', and is perhaps a reason that fifth-generation computing research failed to catch on beyond the 1980s.
Dowker proposes, using causal set theory, that the flow of time is caused by the expansion of the universe, and is a measure of the new qubits that are thereby being added (NS, 04-Oct-2003, p36). This new space is presently being created at the rate of 69.3 km/s for every 3.26 million light years, which equates to 2.25x10⁻¹⁸ m/s per m, which is 1/800th of the diameter of a proton per second per metre, or 1.39x10¹⁷ Planck lengths per second per metre.
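As a quick sanity check, those unit conversions can be reproduced in a few lines of Python (the Hubble constant is the value quoted above; the proton diameter and Planck length are standard approximate reference values):

```python
# Sketch: checking the quoted expansion-rate conversions.
H0_km_s_Mpc = 69.3                  # Hubble constant, km/s per megaparsec (from the text)
Mpc_m = 3.0857e22                   # one megaparsec (3.26 million light years) in metres
H0 = H0_km_s_Mpc * 1e3 / Mpc_m     # expansion rate in (m/s) per metre

proton_diameter = 1.7e-15           # metres (approximate)
planck_length = 1.616e-35           # metres

print(f"H0 = {H0:.2e} per second")                                     # ~2.25e-18
print(f"proton diameters: 1/{proton_diameter / H0:.0f} per s per m")   # of order the quoted 1/800
print(f"Planck lengths: {H0 / planck_length:.2e} per s per m")         # ~1.39e17
```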
Muller (2016) proposes that as space-time expands, it is the new coordinates of time that feel like 'now', and give rise to the feeling of the flow of time. 'Now' is simply the raw face at which new instances of time are being created as space-time expands, rather than some notion of the increase in microstates (caused by the expansion of space-time) for entropy to expand into. There is then just the problem of defining what it means for new instances of time to be added; to be added implies that there is change, but change with respect to what? Moreover, none of this would be applicable if expansion is not happening in our part of space-time.
From our perspective, time appears to be influenced by intense gravitational fields (and, in their presence, ceases to have its familiar properties), though it is more accurate to say that it is the other way round: affected time is what we perceive as gravity.
In loop quantum gravity, when the universe is compressed, the speed of light takes on an imaginary value and time becomes a fourth spatial dimension; this suggests that the emergence of time was caused by the breaking of that symmetry.
In any case, crossing event-horizons is a monotonic process.
Gurzadyan suggests that the arrow of time might be a simple consequence of the curvature of space, and that this would avoid the need to postulate a period of inflation; unfortunately, though, this would only work if the curvature of space-time were negative (NS, 15-Oct-2005, p30). Susskind also suggests that a universe that had to tunnel through the string-theory landscape would also lead to it having a negative curvature (NS, 02-May-2009, p35).
An overall negative curvature could indeed be what has resulted from the large voids that have formed (NS, 15-Nov-2008, p32), due to clumping of the galaxies over the most recent 5 billion years (NS, 24-Nov-2007, p34) as a back-reaction to space telling matter how to move, and matter telling space how to curve, and that we live within such a void (NS, 18-Jun-2016, p28; NS, 08-Mar-2008, p32). Importantly, this might explain the results that are currently attributed to dark energy, thereby avoiding the need to propose the existence of such a thing. Indeed, the problem with ascertaining the actual curvature is that it becomes a chicken-and-egg problem with ascertaining the expansion (NS, 01-Aug-2009, p40). There is also a possibility that mass could distort time and space differently (NS, 24-Oct-2009, p8).
Not just the curvature of space-time, but also the acceleration of the expansion and the maximum speed limit, should all be derivable mathematically, complete with their own non-commutative properties, without reference to light, mass or energy (NS, 01-Nov-2008, p28).
Bekenstein showed that the surface area of the event-horizon of a black-hole behaves as an entropy, corresponding to the second law of thermodynamics. Taking such a view, though, leads to a problem with the disappearance of information when matter enters the black-hole, and its subsequent reappearance as Hawking radiation (NS, 04-Feb-2017, p16; NS, 27-Jul-2013, p10), and with it perhaps still being available to forensic-science investigators at the event-horizon (NS, 19-Sep-2015, p11). So, the event-horizon would be too big to hold all the information, since the information is disappearing from the view of the outside observer, leading to implications of there being a so-called firewall at the event-horizon (in contravention of relativity, which holds that there should be no detectable landmarks at that part of the curvature of space-time), or an increased speed of light inside the black-hole (NS, 06-Apr-2013, p38). Carroll shows that the many-worlds interpretation could resolve the firewall paradox, since it is in the many worlds that the information content is conserved, not in the individual branches (NS, 06-Jan-2018, p14).
One attempt to resolve this is to propose that the whole universe is a hologram (NS, 17-Jan-2009, p24; NS, 27-Apr-2002, p22), and that reality is somehow lived out, in time, distributed round a two dimensional surface. Such possibilities as the universe being holographic might be detectable, via the consequent violation of Lorentz symmetry, if space-time turns out to have a preferred 'weave' in specific directions (NS, 16-Aug-2003, p22).
Perhaps black-holes could harbour bubble universes (NS, 09-Jan-2016, p8) along with all the information that they contain. Moreover, it could be that the universe is filled with primordial black-holes (where the information is stored holographically in their event-horizons), but with those black-holes at varying densities because of Heisenberg's uncertainty principle; our habitable part of the universe would then be in one of the low density (and hence low entropy) regions, where radiation is able to permeate (NS, 28-Apr-2007, p28).
Maldacena's Anti-de-Sitter/conformal field theory (AdS/CFT) correspondence, which permits a hologram-like conversion between a five-dimensional string-theory space with gravity and a simpler four-dimensional space (NS, 12-Oct-2013, p36), would also only work if the curvature of space-time were negative (NS, 30-May-2009, p34). It might, though, offer a way to unify the space-time view of relativity with the particle-field view of quantum mechanics (NS, 04-Feb-2017, p28). The AdS/CFT duality leads to a suggestion that information has an interpretation in terms of space-time (NS, 07-Nov-2015, p30), and that space-time arises as a consequence of entanglement in the quantum field. Cao and Carroll have shown that it might be possible to make this work for a universe that has a flat geometry, too, if there is a conservation law for the amount of entanglement in a given volume of space-time (NS, 23-Dec-2017, p10).
Fields (and hence particles) are in states, and it is states that can be entangled. Decoherence is the gradual blurring, with time, of a quantum system's boundaries, as its poor isolation from its environment causes superposition states to form with the environment's states. Wave-function collapse is the conversion of the several eigenstates of a quantum system in superposition to a single eigenstate, with the triggering event said to constitute an observation, or measurement. Decoherence is the overwhelming of the wave interferences that had been in place; wave-function collapse is the increase of one of the probabilities to one, and the dropping of all the others to zero.
According to the Copenhagen interpretation, the interaction with an observer causes the wave-function to collapse, and that therefore the observer is an integral part of the experiment. This leads to many disconcerting possibilities, not least the existentialist one that reality is not objectively present. Furthermore, the use of the word 'observer' raises the possibility that conscious beings have to be involved (NS, 02-May-2015, p33), which is not only disconcerting, but problematic, since we still do not have a definition of consciousness, even now, after several millennia of investigation (NS, 04-May-1996, p20).
Wootters proposes a physical manifestation of what we presently handle in mathematical models of quantum mechanics using the square-root of minus one, to account for the loss of information at wave-function collapse (NS, 25-Jan-2014, p32).
Causality ceases to apply when there is entanglement (NS, 03-Aug-2013, p32), and decoherence occurs when some of the information leaks out from a system that is in superposition.
Dark energy might be explained by the creation of information by wave-function collapse (Sudarsky, Josset and Perez). Decreasing information implies increasing entropy. The arrow of time might then be due to the particles of the universe becoming ever more entangled (NS, 04-Feb-2017, p31).
Entanglement in time (NS, 27-Aug-2016, p12) consists of a particle becoming entangled with an earlier version of itself (NS, 27-Mar-2004, p32).
Experiments have shown the so-called quantum pigeon-hole effect (NS, 02-Aug-2014, p8). Experiments have also shown the so-called quantum Cheshire-cat phenomenon: that a particle can follow one path, and its properties (such as spin) can be split off to follow a different path (NS, 26-Jul-2014, p32).
In asking what reality is, either we define it in terms of objective things, stuff that can be sensed and known to be out there, or we can try a reductionist approach of trying to build everything on some lowest level (NS, 29-Sep-2012, p34); but both approaches have their problems. Both approaches end up going round in a circle, with macroscopic objects (not least human brains) at the end: in the former approach, consciousness is the property that causes wave-functions to collapse one way or the other, and not remain in superposition; while in the latter approach, reality is based on a substrate of mathematics (NS, 15-Sep-2007, p38; NS, 21-Nov-1992, p36) and Tegmark's 4th level of multiverse, and the debate as to whether mathematics is invented or discovered (NS, 02-Sep-2017, p30).
It seems there is a known, bounded periodic table of all the symmetries that are possible in mathematics (NS, 14-Jun-2008, p38). Within this, Lisi suggests how the sub-atomic particles can be mapped on to a 248-vertex E8 pattern in 8D space (NS, 17-Nov-2007, p8).
Spontaneous (objective) wave-function collapse could occur (NS, 16-Jul-2016, p30), with no observer required (proposed by Pearle, Ghirardi, Weber and Rimini in the 1970s, and made compatible with general relativity by Bedingham and Tumulka).
Heisenberg's original explanation for the uncertainty principle, that the act of observation involves bouncing at least one particle off the object that is being observed, thereby disturbing it, was somewhat undercut by the experimental verification of Bell's inequality, but it does serve a second purpose of being a possible explanation for the triggering of wave-function collapse. That is, interaction with the noisy environment causes wave-functions to collapse, and for particles to become discrete objective entities.
Perhaps the ripples of passing gravitational waves affect macroscale-sized objects, but have little effect on subatomic-scale ones (NS, 21-Nov-2009, p12). A black-hole's swallowing of information can then be explained, with the collapsing of wave-functions inside the event-horizon needing no observer. Penrose suggests (NS, 09-Mar-2002, p26) that it is no coincidence, firstly, that the Standard Model starts to break down as a workable approximation of how the universe works just when the effects of gravity start to become noticeable, nor, secondly, that gravity is the one force that resists being unified with the other three. Indeed, it could even be because of gravity that quantum entanglement experiments are so difficult to perform on Earth (NS, 20-Jun-2015, p8). The quantum mechanical uncertainty of the energy of a particular region of space, with a given uncertainty of time over which it persists, would translate, relativistically, into an uncertainty of the curvature of that region of space, and it is perhaps this non-linearity that collapses any wave-functions that are in a state of superposition. Superposition requires a linear behaviour of space-time, but general relativity puts a curvature on it as soon as a mass, like a particle, is placed in it. So, it is a question of when space-time starts to look non-linear to the quantum mechanics equations. Moreover, when a massive particle is in a superposition between two places, there must be implications for the curvature of space at those two locations (NS, 03-Jan-2015, p26), and for anything that is being attracted to the particle in either of those two locations; similarly for the relativistic time-dilation experienced by a particle in a superposition of two velocities, causing the two instances to experience the passage of time differently. Conversely, gravity might be the effect of spontaneous (objective) wave-function collapse (NS, 23-Sep-2017, p8).
Tegmark argues that gravity plays no part in wave-function collapse, since gravity is only an optional interpretation on one side of the AdS/CFT duality, and can be taken out of the equation (albeit not in our universe, but in a hypothetical one with negative curvature). It might still resolve the EPR (spooky action at a distance) paradox. The paradox of quantum monogamy (no more than two particles can be in the same entangled state at the same time) between three particles (one on each side of a black-hole's event-horizon, and the other at the other end of a wormhole) can be resolved by noting that the one at the other end of the wormhole is in the future, or past, of the other two. The particle at the other end of the wormhole is never at the same time as either of the other two, though it might be at the same place as one of them. Perhaps the machinery that we have for Pauli's Exclusion Principle could be adapted to describe a similar-sounding principle of quantum monogamy.
The ability to add a dimension is what allows the AdS/CFT duality to add gravity to the standard model (NS, 11-Feb-2017, p24). The resolution of the search for the 3D Ising model, via a bootstrapping approach, could solve many problems concerned with phase transitions, many-body strongly coupled systems, superconductivity, and quantum mechanisms of the strong nuclear force and the AdS/CFT duality (NS, 18-Feb-2017, p28).
Gell-Mann and Hartle talk of "information gathering and utilising systems" (IGUS), with possible implications for free will (p454; NS, 01-May-2014, p34) and how it is not the observer (or the act of observation) that creates reality (p453). Multiplication by the off-diagonal zeros, implicit too in the And and Or operations of manipulating probabilities (p441), is an irreversible operation (p452). It comes down to how (a+b)²≠a²+b² (p428), and how, under superposition, the probabilities might not add up to unity. All of the universe is interconnected, by the off-diagonal elements (p431), and all laws in physics are approximations (p445), but the authors seem to be against the notion of loop quantum gravity (p430). History requires knowledge of both present data and the initial condition of the universe (p439), and results in the notion of time (p437) via its ordering. This leads to the mechanism of Heisenberg's uncertainty principle (p455), and to how the many-worlds view should really be thought of, instead, as many histories. There is a chain of logic leading from decoherence, to resonance, to Dalton chemistry, and to a mechanism of survival of the fittest (p449). Patterns will crystallise out but, as with evolution, because of the randomness in the process of getting there, we cannot predict, in advance, which ones.
Each collision nudges the matrices to be in nearly diagonal form; nearly, but never completely. Paul Davies notes the non-locality of quantum mechanics (with all the universe having some effect on all the rest). He also notes that an accelerating or rotating body should experience the glow of the quantum vacuum (NS, 03-Nov-2001, p30), perhaps contributing to an explanation for the apparent correspondence between inertial mass, gravitational mass, and the other three fundamental forces (NS, 03-Feb-2001, p22).
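The role of those off-diagonal elements can be made concrete with a minimal sketch (illustrative, not from the source): a two-state system in equal superposition, whose density matrix carries interference in its off-diagonal terms; zeroing them, as decoherence does, removes the (a+b)² cross-term, and the probabilities add classically:

```python
import numpy as np

# A qubit in equal superposition: rho = |psi><psi| has off-diagonal
# "interference" terms; decoherence drives them towards zero.
psi = np.array([1, 1]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())          # [[0.5, 0.5], [0.5, 0.5]]

plus = np.array([1, 1]) / np.sqrt(2)     # measurement direction |+>
p_super = float(plus.conj() @ rho @ plus)            # 1.0: off-diagonals add constructively

rho_decohered = np.diag(np.diag(rho))    # zero the off-diagonal elements
p_mixed = float(plus.conj() @ rho_decohered @ plus)  # 0.5: the cross-term is gone

print(p_super, p_mixed)
```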
Ever since the universe was more than 5.39x10⁻⁴⁴ seconds old, it appears to have adhered to all our current laws of physics. However, since the laws of physics of the universe must reside inside the universe, not as some abstract concept outside, they can only be as precise as can be calculated from the total information content of the universe. Allowing our theories to work with an infinite number of numbers, quotable to an infinite number of decimal places, shows that mathematics is an approximation of the universe, rather than the other way round (NS, 17-Aug-2013, p32).
Meanwhile, the phenomenon of quantum post-selection (NS, 30-Jun-2007, p18; NS, 30-Sep-2006, p36) has now been demonstrated over the 3,500 km distance of a transmission to a satellite and back (NS, 04-Nov-2017, p12). Moreover, wave-function collapse has been shown to be not an instantaneous event, but one that takes time (NS, 10-May-2003, p28), and, by performing only weak measurements each time, can therefore be interrupted or even reversed (NS, 12-May-2007, p32).
From our perspective, there is no definite history of the universe. Using the present state of the universe as the input to infer its origins is more valid, and certainly more practical, than attempting to obtain the present state of the universe as the output (NS, 22-Apr-2006, p28). Indeed, the post-inflationary state of the universe might be derived from a Feynman diagram analysis of the 10⁵⁰⁰ initial states that string theory implies (NS, 28-Jun-2008, p10).
Since the universe's information content is limited by its size, maybe the early universe was more amenable to the retro-causality of quantum post-selection, allowing it to be fine-tuned for the conscious beings that would eventually evolve to observe it. Quantum post-selection already gives a variant of the anthropic principle, but might this extend even further, to some sort of quantum Darwinism, acting so that the fittest wave-function collapses predominate, and give the impression of there being an objective reality to be measured by independent observers?
Deutsch and Marletto propose a way in which the laws of physics arise as 'constructors' that work on information (NS, 24-May-2014, p30). Not only would this offer an explanation for how the laws of physics arise, but it could also give an explanation of what constitutes knowledge (NS, 01-Apr-2017, p30), and the role of a knowledge creator (like a conscious mind).
Power is the rate of change of energy in time, and, by the same token, force is the rate of change of energy in distance; but force is also rate of change of momentum in time.
The lattice has been arranged so that by starting at any given node, travelling SW involves taking the d/dt, travelling SE involves taking the d/dx (and vice versa with integrals for travelling NE and NW).
In the lattice: P=power, E=energy, F=force, p=momentum, m=mass, L=action, mx=mass*distance, X=dp/dx=dm/dt, Y=dm/dx and Z=∫L.dt=∫m.x.dx.
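Assuming the dimensional reading of the lattice, these relationships can be verified mechanically by representing each node as its exponents of mass, length and time, with the SW and SE moves lowering the time and length exponents respectively:

```python
# Sketch (assumed interpretation): each lattice node is a dimensional
# formula, written as (mass, length, time) exponents.
def d_dt(q):
    # travelling SW: d/dt lowers the time exponent
    m, l, t = q
    return (m, l, t - 1)

def d_dx(q):
    # travelling SE: d/dx lowers the length exponent
    m, l, t = q
    return (m, l - 1, t)

mass     = (1, 0, 0)
momentum = (1, 1, -1)
energy   = (1, 2, -2)
power    = (1, 2, -3)
force    = (1, 1, -2)
action   = (1, 2, -1)   # L
mx       = (1, 1, 0)    # mass*distance

assert d_dt(energy) == power          # P = dE/dt
assert d_dx(energy) == force          # F = dE/dx
assert d_dt(momentum) == force        # F = dp/dt
assert d_dt(action) == energy         # E = dL/dt
assert d_dt(mx) == momentum           # p = d(mx)/dt
assert d_dx(momentum) == d_dt(mass)   # X = dp/dx = dm/dt
```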
Measurement of entropy is connected to the question of whether one particle can be substituted for another, and comes down to the number of degrees of freedom. Electrons, Up quarks and Down quarks have no internal structure, and can be substituted for each other without any observer noticing. Next, although protons and neutrons are made of a mixture of Up quarks and Down quarks, with various permutations of red, green and blue possible, confinement means these internal data are transparent to our experiments (NS, 04-Dec-1993, p28). Next, with Bucky balls, it seems that C₆₀ or C₇₀ molecules still have no discernible landmarks (if one carbon atom were to be substituted for another, for example). For the DNA in a small virus, though, it is not so transparent: there are discernible landmarks, and hence an entropy to its structure, and hence a memory. The difference between sending a Buckminsterfullerene molecule or a small virus through a double-slit experiment (NS, 17-Mar-2007, p36; NS, 15-May-2004, p30) is that the latter has more of an odometer, or local clock, with increasing local entropy, as its DNA sequence becomes degraded (by methylation, telomere damage, or simple non-error-corrected base-pair swapping). No-one has yet obtained interference fringes by firing small virus DNA molecules through a double-slit experiment (NS, 09-Mar-2002, p26). There can be a seething chaos within, but a billiard-ball-like, black-box simplicity outside (such as the way we view the sun as the centre of our solar system); and there can be the passage of time within, but apparent timelessness when viewed from the outside.
There is a distinction between things that have memory (like human brains and biological genomes, but also including objects that merely persist, like rocks and rivers) and those that do not (such as momentum, force, energy and power, which are functions of the present state of the objects concerned, not of their pasts). Memory is key to an object's identity, whether it be a particle passing through a double-slit experiment, a human who thinks (and therefore who is), or the genome of a species. Also, it is the act of erasing a memory bit that incurs the thermodynamic cost.
Back in the 1990s, it was found necessary, for security reasons, to lock all university PC hard-drives to be read-only. Students would carry their work around with them on floppy disks. The floppy disk could also carry other, meta, information, like the user profile. The students were able to work on one PC, advance their work a bit, save it to floppy disk, go off to a couple of lectures, then resume their work, with their own personalised profile, on a completely different PC, as if nothing had happened in between. This demonstrates how memory is the key to identity. The students' computing identity did not reside on the PC (worth several hundreds of pounds) but on the floppy disk that they carried around with them (worth a few pence). Similarly, my identification with the person who ate breakfast this morning is firm because of the memory of eating that breakfast. Conversely, it does not matter if someone claims that I am the reincarnation of some medieval knight; if I cannot remember anything of my past experiences (NS, 22-Apr-2017, p28), it is not much of a reincarnation.
Time-travel would only be possible if eternalism is right, and presentism is wrong (NS, 03-Jun-2017, p44).
There are many paradoxes surrounding time-travel, and many attempts at proposing resolutions to them (NS, 28-Mar-1992, p23) assuming time-travel to be possible (NS, 20-Sep-2003, p28) and noting that relativity is somewhat ambivalent on the prospects of time-travel (NS, 20-May-2006, p34). Hawking proposes a chronological protection conjecture (NS, 08-Oct-2011, p50), and some argue that the second law of thermodynamics could give the answer (time-travel is possible, but must cost the time machine at least as much energy to run as that needed to achieve the entropy change that the time-travel brings about).
The problem of the speed of time, and the notion of dt_local/dt_abs, is also encountered when considering a discussion between two hypothetical time-travellers comparing their two time machines. How could one traveller say to the other, "My machine is better than yours: it can go at one century per second, but yours can only go at 50 years per second"? Whose seconds would they be talking about? Intuitively, one would expect those seconds to be some sort of perceived seconds, within the passenger compartment of the respective machines, perhaps as indicated on their respective wrist-watches, or rather, by their various body clocks, and thence by some measure of local entropy increase, of numbers of bits being erased, and hence of memories being recorded, and of histories being laid down (including that of having eaten breakfast that morning).
A time-traveller going to the future and back would be gaining knowledge and experience, and clocking up seconds on the odometer, just as a car's odometer keeps clocking upwards despite the car never straying from the daily commute. A molecule (of sufficient complexity) in a sound or fluid wave would similarly clock up more on its odometer than in its displacement (odometer is to displacement as speed is to velocity, perhaps with a notion of some sort of Strouhal number).
There are questions as to what it is that gives the feeling of the passage of time (NS, 04-Feb-2017, p31). Ellis notes that none of the present physical models capture our feeling that time flows (NS, 02-Nov-2013, p34), and Gisin adds that none of them capture our feeling of having free will (NS, 21-May-2016, p32).
The notion of time flowing does still beg the question, of course, of flowing with respect to what? In the case of thought experiments involving astronauts passing their twins at close to the speed of light, or involving a trip close to the event-horizon of a black-hole, the rate of flow is comparative, between two human observers. However, we also use the term about our own, everyday sensation of the passage of time. One argument is that it is our sense of identity, that the person who started reading this sentence is the same as the person who ate breakfast this morning: one thing that we could be measuring the flow of time against is our laying down of a permanent and ever-growing trail of past memories. It is not unlike the illusion created by our image in the mirror being based on left-right symmetry, and not up-down symmetry. It is further supported by a proposal (NS, 15-Aug-2015, p26) that consciousness is just an illusion that the subconscious brain concocts (NS, 07-Jul-2007, p36), compatibly with Libet's experimental observations, as an aid to the survival of the species (thereby also explaining phenomena such as pantheism, phantom limbs and the post-event rationalisation of otherwise odd behaviour). Perhaps free will just means deterministic behaviour, but behaviour dominated by internal interactions within the system (along the lines of Tononi's integrated information theory).
It seems that time is brought into existence in some concocted, internal view (NS, 08-Oct-2011, p41). The equation of an ellipse is x²/a²+y²/b²=1, which is static and outside of time, but it can equally be expressed in parametric form, as two equations, x=a.cos(θ) and y=b.sin(θ), or even as x=a.cos(ωt) and y=b.sin(ωt), where ω is a constant, with implications of a time-evolving orbit, for example. It seems that a non-collapsed wave-function view of a planet in a superposition of all points around its orbit is outside of time, while the wave-function-collapsed state of it being at one particular point contributes to the definition of time. Not only does the wave-function-collapsed state require more information to describe it, to represent the particular way in which the symmetry has been broken, but entropy is then forever increasing (or at least not decreasing) as the planet continues round its orbit.
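The equivalence of the two forms can be checked mechanically; the sketch below (with illustrative values of a, b and ω) confirms that every instant of the parametric, 'time-evolving' orbit satisfies the static, timeless equation:

```python
import math

# Sketch: the same ellipse in its static and its time-parameterised form.
a, b, omega = 3.0, 2.0, 0.5

def on_ellipse(x, y):
    # the static, timeless description: x^2/a^2 + y^2/b^2 = 1
    return abs(x**2 / a**2 + y**2 / b**2 - 1) < 1e-12

# every instant t of the time-evolving form lies on the timeless curve
for t in [0.0, 0.7, 1.9, 4.2]:
    x, y = a * math.cos(omega * t), b * math.sin(omega * t)
    assert on_ellipse(x, y)
```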
In the block view, it would trace out a helix in the stack of movie frames (before curvature of space is taken into account). In the presentism view, it would be a function of another parameter, t(S). Both cases need a boundary condition (which movie frame counts as position-0, or what value of S constitutes the start state). While there is no change in entropy, there is no feeling of time passing. Then, putting in a second object, in a different orbit, this still creates two featureless, continuous sine waves, except that this now becomes a three-body problem, with each body perturbing the other two. The system state can be summed up in a handful of bulk parameters, but with an uncertainty on each (on top of the usual ΔE of Heisenberg's uncertainty principle).
Unrelated parts of the universe are not governed by an external, global clock, so their values of 'now' are uncorrelated. Via Newton, we have invented functions that are parameterised in a mythical global time, t. It is only when the parts encounter each other that their values of 'now' (which, in any case, should start as a fuzzy, extended now) become the same, somewhat akin to the way that two functions f(x) and g(x) can have all sorts of uncorrelated values until the point where f(x)=g(x) is stipulated and the resulting simultaneous equations are solved. (Particle collisions that synchronise under the clock at Waterloo station at a given time and date.)
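The simultaneous-equations analogy can be made concrete with a toy sketch (the particular functions f and g here are arbitrary choices, not from the source): the two are unrelated everywhere except at the encounter points where f(x)=g(x):

```python
import math

# Two independently parameterised "clocks"; they agree only where f(x) = g(x).
def f(x):
    return 2 * x + 1

def g(x):
    return x ** 2 - 2

# f(x) = g(x)  =>  x^2 - 2x - 3 = 0, solved by the quadratic formula
disc = math.sqrt(2**2 + 4 * 3)              # discriminant = 4
roots = [(2 + disc) / 2, (2 - disc) / 2]    # x = 3 and x = -1

for x in roots:
    assert f(x) == g(x)   # the two 'nows' coincide only at these encounters
print(roots)
```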
The ratchet of the second law of thermodynamics: monotonicity and monotonic functions. Things align with df/dw>0, which we take to imply dS/dw>0.
Turing's Halting Problem indicates that we cannot just treat computer programs in the abstract, as a static whole, stored on paper, a CD-ROM or a deck of punched cards; it is necessary to state the specific data that are to be supplied, and to consider the dynamics of its execution (passing under the read-head of the debugger's finger on the printout, the program counter, or its equivalent in lambda calculus, when run sequentially). Similarly, a story cannot be treated in the abstract, as a static whole, in a book, DVD or stack of cine frames; it is necessary to consider the specific data that it will be immersed in, and to consider the dynamics of how the story is to be told (passing under the read-head of the page-turner, DVD player or cine projector). And, when those stories are not just biographies, but human lives, we know that we have to consider the read-head of the moment that we call 'Now', and how they can only think consciously in a single sequential thread.
Fourier analysis gives Δk.Δx≥1/2 as an inherent limitation of the mathematics (the finer and more precise a pulse is, the wider the band of frequencies needed to define it), where k is the wavenumber, or spatial frequency, equal to 2π/λ. Then, with the simple substitutions E=ℏω and p=ℏk, Heisenberg's Uncertainty Principle drops out: ΔE.Δt≥ℏ/2, Δp.Δx≥ℏ/2 and ΔL.Δθ≥ℏ/2.
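This Fourier limit can be verified numerically; the sketch below (illustrative parameters) builds a Gaussian pulse, for which the bound is saturated, and recovers Δx.Δk≈1/2 from its spectrum:

```python
import numpy as np

# A Gaussian pulse saturates the Fourier bound dx*dk = 1/2.
x = np.linspace(-50, 50, 20001)
dx_step = x[1] - x[0]
sigma = 1.3
psi = np.exp(-x**2 / (4 * sigma**2))       # chosen so |psi|^2 has std dev sigma

prob = np.abs(psi)**2
prob /= prob.sum() * dx_step               # normalise as a density
delta_x = np.sqrt(np.sum(prob * x**2) * dx_step)   # mean is zero by symmetry

k = 2 * np.pi * np.fft.fftfreq(x.size, d=dx_step)  # wavenumber grid
phi = np.fft.fft(psi)
prob_k = np.abs(phi)**2
prob_k /= prob_k.sum()
delta_k = np.sqrt(np.sum(prob_k * k**2))

print(delta_x * delta_k)   # ~0.5, the equality case of the bound
```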
One study suggests that entanglement is the limiting condition (NS, 21-Aug-2010, p33), and another that Heisenberg's uncertainty principle and entanglement are, in fact, two sides of the same coin (NS, 30-Apr-2011, p28): either one applies, or the other, depending on the nature of the experiment (but with H.U.P. as the overarching principle).
Since the wave-function of any fundamental particle, such as a photon or an electron, can be considered to consist of a carrier wave (e^(ikx)) multiplied by an envelope (e^(–ax²)), Heisenberg's uncertainty principle is, in effect, talking about the bandwidth of the particle due to the side-bands of its amplitude modulation.
The table that was introduced in the section on the first split second represents the merged data from three articles (NS, 03-Aug-2002, p28; NS, 05-Jul-2008, p28; NS, 25-Nov-2017, p9). Using a bit of reverse engineering, this seems to indicate that:
which can be rearranged to give values of Tn as a function of tn, or more generally T(t):
where the suffices stand for 'beginning', 'end' and 'now', and:
tb = √(G.ℏ/c⁵) = 5.39x10⁻⁴⁴ s
Tb = √(ℏ.c⁵/(G.k²)) = 1.42x10³² K
Te = (ℏ.c³)/(8π.G.k.mU)
te = -2*Te
tn = 1/KHubble = 13.8x10⁹ years
Tn = 2.7281 K
The mass of the observable universe has been determined at mU=10⁵³ kg (NS, 16-Dec-2000, p26). Since its present temperature, Tn, is greater than its Hawking temperature, Te, the universe is out of equilibrium, and will continue to cool. The question addressed in the 03-Aug-2002 New Scientist article is, "at what time will this equilibrium state be reached?" The answer reported seems to have assumed, perhaps only implicitly, mU=7x10⁵¹ kg. The figures above, though, indicate mU=1.58x10⁵³ kg, and hence te=5.3x10⁵² years.
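The quoted figures can be reproduced from standard approximate values of the physical constants; the following sketch recovers tb, Tb and the Hawking temperature Te for the quoted mass:

```python
import math

# Approximate CODATA-style constants
G    = 6.674e-11        # m^3 kg^-1 s^-2
hbar = 1.0546e-34       # J s
c    = 2.998e8          # m/s
k_B  = 1.381e-23        # J/K
m_U  = 1e53             # kg, quoted mass of the observable universe

t_b = math.sqrt(G * hbar / c**5)                   # Planck time, ~5.39e-44 s
T_b = math.sqrt(hbar * c**5 / (G * k_B**2))        # Planck temperature, ~1.42e32 K
T_e = hbar * c**3 / (8 * math.pi * G * k_B * m_U)  # Hawking temperature of m_U

print(t_b, T_b, T_e)   # T_e is far below the present 2.7281 K, so Tn > Te
```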
Following on from a 1979 paper by Freeman Dyson (NS, 03-Aug-2002, p28), it would at first appear that machines in general (and living cells, and thinking brains, in particular) can continue to eke out a slower and slower existence, forever, right into the heat death of the universe, as the amount of temperature variation gets smoothed out. However, the universe is expanding, with potential supplies of energy constantly going over the de Sitter horizon, out of reach of future generations (NS, 20-Oct-2001, p36).
The monotonic disappearance of potential resources over the de Sitter horizon is shown to be a manifestation of the second law of thermodynamics (NS, 15-Apr-2017, p8). However, expansion equates to the injection of new matter. Meanwhile, Freeman Dyson's machine will get warm, and will need to dissipate heat as it works, so the Hawking temperature imposes new constraints on the duty cycle of the machine.
So, the measure is tied up with that of measuring divergence (or convergence) from the given point, or conservation of the given quantity, and with the '∇⋅' operator. Kirchhoff's current law is a discrete-case example, too; interestingly, this resonates with Kirchhoff's voltage law being a symmetry law. When generalised to a block of metal conductor, as opposed to discrete circuit connections, it involves the '∮' of the gradient. In geography, this would be tracking the height above sea level, as the walker walks a closed path that crosses many contours. The symmetric quantity is the sea level that is being used as the reference point all the way round, the height relative to it being calculated from the gradient at each step. This appears to be approaching a representation in Noether's theorem; in the case of entropy, though, this would leave us with ΔS.Δ(T.t) >= ℏ/2 (where T^2.t of the universe started as a constant, but is now falling). Indeed, Rovelli suggests that time, like temperature and entropy, might be an artefact of our use of bulk parameters to economise on the description of a group of particles (NS, 29-Jan-2008, p28). Certainly, the "(ln(t)-ln(tb))/(ln(te)-ln(tb))" expression, earlier in this section, is suggestive of a type of third law of thermodynamics. Moreover, it is interesting that this brings into close juxtaposition two laws (Charles' Law and Hubble's Law) that have similar extrapolative power.
If IU(t)∝(mU)^2.t^2, and if T^2.t was roughly constant in the early universe (so that t∝1/T^2), this suggests that IU(t)∝(mU)^2/T^4 in the early universe, suggestive of a connection to Stefan's law.
The first step when encountering a new phenomenon is to start taking measurements; the second is to look for underlying relationships between the parameters; and the third is to propose physical mechanisms that might generate those relationships. Despite a century of intense study, quantum mechanics appears to be still stuck at the second stage, with phenomena characterised by probability functions, extremely successfully, but without any convincing explanation as to the origin of the values that are so produced (NS, 28-Jul-2012, p28), and with a bewildering zoo of potential interpretations (NS, 22-Jan-2011, p30). Wave-functions are either a manifestation of some (ontological) underlying mechanism (possibly right down to the It-from-Bit view), or they are merely human tools or (epistemic) abstractions that guide us to the correct answers (NS, 01-Apr-2017, p41). Recent experiments rule out many classes of the latter view (NS, 07-Feb-2015, p14), and many people believe that there must be a deeper, underlying mechanism that has yet to be discovered (NS, 23-Jun-2007, p30), even though the current proposals are mutually conflicting (NS, 14-Nov-2015, p14). The most obvious Achilles heel of quantum mechanics is our lack of explanation for why the Born rule for probability amplitudes is valid (NS, 05-Nov-2016, p8).
Various attempts have been made to explain how the probabilistic behaviour we observe in quantum mechanics could arise from an underlying deterministic behaviour, including a refinement of the pilot-wave idea (NS, 22-Mar-2008, p28) and a new way of viewing the Bohm interpretation (NS, 27-Feb-2016, p8), and how it might even be supported by an experiment on classical waves in an oil bath (NS, 08-Apr-2017, p28).
Even at the atomic level, it is possible to project an image of an atom from one focus of an ellipse so that a virtual atom appears at the other focus (NS, 08-Jul-2000, p24).
Bell's theorem tells us that, if the results of quantum mechanics experiments are valid, our problems and objections are either with the 'at a distance' (relativity), the 'action' (objective reality) or the 'spooky' (ghost in the machine, free will, consciousness) implications of entanglement (NS, 26-Feb-2011, p36; NS, 03-Aug-2013, p32). For the third of these, it might not necessarily be the observer's free will that is in question, but an inherent limit on our ability to close all the loopholes in our freedom to perform independent experiments (NS, 18-Jun-2005, p32), though experiments have subsequently been performed that significantly tighten up on these loopholes (NS, 05-Sep-2015, p8), including ruling out the possibility that the measurement of the state of one of the particles could tamper with the mechanism of the random number generator (NS, 11-Feb-2017, p7). Experiments have been run to see if the Bell Inequality is affected by whether conscious human minds are involved in the decision as to which parameter to measure in each entangled pair of particles (NS, 27-May-2017, p7). Others question the validity of Bell's theorem (NS, 03-Nov-2007, p36), and propose deterministic pre-quantum mechanics theories, for example using hidden variables, or non-commutative parameters.
Experiments consistently suggest that a particle cannot be pin-pointed, at sub-atomic scales, in 6-dimensional phase-space (x,y,z,ẋ,ẏ,ż) because the particle does not have a specific location in that space, with any attempt to locate it resulting in a blurred position. Unfortunately, all our current methods for solving the equations of motion, from Newton, Lagrange, Hamilton and Jacobi, have, at their roots, the concept of this phase-space, so our ability to use these tools to work out the movements of particles at the sub-atomic scale is somewhat hampered. (In effect, any attempt at using dead-reckoning to predict a particle's future or intermediate position is doomed to failure, one way or the other.) The temporal version of the Bell inequality confirms that, given a particle's initial state and final state, it is not clear that it had any definite history of intermediate states (NS, 04-Dec-1993, p14), as borne out in Feynman's diagrams and Gell-Mann's many-histories.
Using phase-space as our tool of preference involves imagining everything in terms of momentum-energy, instead of space-time, with curvature in that space leading to the concept of relative locality (NS, 06-Aug-2011, p34).
Noether's theorem agrees that the two parameters in Heisenberg's Uncertainty Principle are connected (the conservation law of one follows as a consequence of the symmetry exhibited by the other) and hence that the two parameters are just two sides of the same coin, so are described by one shared set of information, not two.
Heisenberg, Noether, Fourier and Bell all point in the same direction, that it is not just that we do not have access to all the information, but that the information simply does not exist in the first place (for example, of a particle's position and of its momentum), and that our observations are doomed forever to being probabilistic (NS, 14-Mar-2015, p28). Quantum nature is perhaps just a manifestation of economy of information (NS, 15-Nov-2014, p28); if a given effect follows from a given cause 80% of the time, it would be inefficient for the system to encode the given cause with 100% coverage.
Perhaps this shared information of "two sides of the same coin" arises simply whenever we try to find partial information about an entangled system, and that quantum weirdness simply emerges from more logical central principles (NS, 11-Apr-2015, p34).
Perhaps the mathematical model of the low entropy start of space-time can be transformed, for example using Noether's theorem, to a symmetrical model in some other mathematical space (Derek Potter on Quora). Kirchhoff's voltage and current laws, in a circuit, become integration of a gradient (∮∇) and divergence (∇⋅) laws in the bulk of a conductor, and similarly for the other fundamental forces.
Wheeler's "It from Bit" (Zurek 1990) proposes that all the matter of the universe is made of information. Analysis of Maxwell's Demon devices suggests that E=S.k.T, where S is measured in nats, or E=S.k.T.ln(2), where S is measured in bits.
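The bits-versus-nats bookkeeping here is easy to get wrong, so a minimal sketch of the relation as stated above may help (the 300 K figure is just an illustrative room-temperature choice):

```python
import math

k = 1.381e-23  # Boltzmann constant, J/K

def erasure_energy_per_bit(T):
    """Minimum energy to erase one bit at temperature T: k.T.ln(2)."""
    return k * T * math.log(2)

def erasure_energy(S_bits, T):
    """E = S.k.T.ln(2) with S in bits (equivalently E = S.k.T with S in nats)."""
    return S_bits * erasure_energy_per_bit(T)

print(erasure_energy_per_bit(300.0))  # ~2.9e-21 J per bit at room temperature
print(erasure_energy(8, 300.0))       # one byte
```

The ln(2) factor is nothing more than the conversion between natural-logarithm units (nats) and base-2 units (bits), which is why both forms of the formula appear above.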
If all the matter of the universe is made of information (NS, 17-Feb-2001, p26), the universe itself can be considered to be a vast quantum computer (just as people in the previous industrial revolutions have considered the universe to be like a giant system of wheels, a giant heat engine, or a giant conventional computer). It follows that our ideas for quantum computing will usefully feed back into our formulations for summaries of how the universe works.
Some suggest that quantum computing could beat the Turing Halting Problem (NS, 06-Apr-2002, p24; NS, 19-Jul-2014, p34). For example, Chaitin indicates that there is a hierarchy of Omega numbers that would remain forever non-computable (NS, 10-Mar-2001, p28), though this is later questioned by a demonstration of the computation of the first 64 bits of Omega (NS, 06-Apr-2002, p27). Almost certainly, Quantum Computing will lead to a more general form of the Turing Halting Problem (the uncertainty in the result information, times the uncertainty of the execution time, times the temperature is greater than ℏ/2). Whether any new restrictions count as different, or a mere rewording of the original halting problem, might just be a matter of taste.
There is even a proposal to consider a Quantum Gravity Computer (NS, 31-Mar-2007, p30). In such a machine, output does not necessarily need to follow input. "GR says that the causal structure can vary (since there is no such thing as 'simultaneous'), and QM says that anything that can vary can be in a superposition."
It is possible to consider Moore's law continuing up to the Planck limit (NS, 02-Sep-2000, p26).
Intriguingly, quantum mechanics is positioned at the fine boundary of self-organised criticality, between classical physical behaviour and weird interconnectedness (NS, 26-Feb-2011, p36). This boundary appears to be a pre-condition of interesting (non-intuitive) chaotic behaviour. It features in explanations of consciousness (NS, 26-Apr-2014, p44), and perhaps even of the behaviour of society (NS, 06-Jul-2013, p26).
Conway's Game of Life is mathematically interesting because it is poised on the thin boundary between boring crystalline stasis, and unruly gaseous randomness, in an interesting region of chaotic behaviour. If we were to relax or tighten any one of the game's simple rules, the game would revert to boring uninteresting behaviour. All the interesting things in this universe (consciousness, society's dynamics, evolutionary life, physical properties of water) seem to be those that teeter on this chaotic behaviour boundary between cold stasis and hot randomness. The second law of thermodynamics, and the speed of light limitation, are just two such rules in the game that we know as the 'Universe'. Relax or tighten any of these, and the universe would not work any more, in the way that it currently does; and atoms, stars, planets, and sentient beings would cease to be possible.
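The rules in question are compact enough to state in a few lines of Python. This sketch implements the standard Game of Life rules (birth on exactly 3 neighbours, survival on 2 or 3) and follows a glider, one of the stable-yet-mobile patterns that only exist because the rules sit exactly on that boundary:

```python
from collections import Counter

def step(live):
    """One Game of Life generation; `live` is a set of (x, y) cells."""
    neighbour_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # Birth on exactly 3 neighbours; survival on 2 or 3.
    return {cell for cell, n in neighbour_counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A glider: after four generations it reappears, shifted by (1, 1).
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
cells = glider
for _ in range(4):
    cells = step(cells)
print(cells == {(x + 1, y + 1) for (x, y) in glider})  # True
```

Change the birth rule to 2 or to 4 and the glider is destroyed within a couple of generations: the interesting behaviour really does live on a knife-edge in the rule-space.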
Gell-Mann and Hartle describe a mechanism for decoherence, involving matrices that become successively dominated, at each particle-particle interaction, by the terms in the leading diagonals. Thus, it is the repeated particle-particle interactions that lead to the decoherence. Observations and measurements are independent of this, but (as it happens) first require there to have been plenty of particle-particle interactions. Buffeted by the solar wind, Mars is at one specific position in its orbit round the sun, not smeared out probabilistically in superposition all the way round; and even specks in the depths of space are being bombarded by photons of the 3K cosmic microwave background. To have the values of each particle's matrix nudged to new values, at each collision, sounds reminiscent of how the Logistic Equation works (x(n+1) = r.x(n).(1–x(n))). The coexistence of symmetry and chaos (NS, 09-Jan-1993, p32) is well studied. Even this simple deterministic equation can go chaotic and, to all intents and purposes, unpredictable (at values of r=3.7, for example), with no implications of spookiness, and an associated temperature compared to that of black-body radiation (NS, 02-Feb-1992, p29).
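The point about the Logistic Equation is easily demonstrated; a sketch in Python, with r=2.5 as the tame case and r=3.7 as the chaotic case (both values, and the starting points, are merely illustrative):

```python
def logistic_orbit(r, x0=0.2, n=50):
    """Iterate x(n+1) = r.x(n).(1 - x(n)) and return the whole orbit."""
    xs = [x0]
    for _ in range(n):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# At r=2.5 the orbit settles to the fixed point 1 - 1/r = 0.6 ...
stable = logistic_orbit(2.5)

# ... while at r=3.7 the same equation is chaotic: two orbits started
# a hair's breadth apart soon bear no resemblance to each other,
# despite the system being completely deterministic.
a = logistic_orbit(3.7, x0=0.2000000)
b = logistic_orbit(3.7, x0=0.2000001)
print(round(stable[-1], 6))           # 0.6
print([round(v, 3) for v in a[-5:]])  # wanders unpredictably over (0, 1)
print([round(v, 3) for v in b[-5:]])  # by now, completely decorrelated from a
```

No randomness has been injected anywhere; the unpredictability is entirely a product of the repeated nudging, which is the analogy being drawn with repeated particle-particle collisions.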
The adage, "if a hammer is the only tool you have in your tool-box, every problem starts to look like a nail," could cause us to wonder if we should be finding some new tools. Perhaps, at the very least, the experiments are telling us that something is wrong with our understanding, like an experimental version of Reductio ad Absurdum in mathematics. It might be that one (or more) of our axioms is wrong (such as the existence of quarks and electrons, or forces like gravity). It might be that, because we design our experiments to investigate the properties of photons, we appear to get answers that look like the properties of photons (NS, 24-Jul-2004, p30). Toffoli makes a case for why a lot of what we observe in our physics experiments is an artefact of those experiments, or rather of our model of what the universe is, rather than intrinsic to the universe itself; he shows that special relativity and general relativity might even be such artefacts.
Rather than our brains analysing incoming signals, finding patterns of ever-increasing complexity, and making sense of them by matching them against the internal representations, it is the other way round (NS, 09-Apr-2016, p42; and also p20): our brains generate the sensory data to match the incoming signals, using internal models of the world (and body), thereby giving rise to multiple hypotheses, with the most probable one becoming tagged, Dennett-like, as being our perception, using a type of Bayesian analysis (NS, 31-May-2008, p30). This could be the mechanism whereby hallucinations are signs of a correctly working brain in the absence of sufficient sensory input (NS, 05-Nov-2016, p28); it also has parallels to the scientific method (coming up with models to explain the observed data, actively setting out to observe new data, and keeping the model no more complicated than necessary) being not so much an invention of past philosophers, as just the way that the human mind has been working, naturally, anyway. Each hypothesis can then be refined in the light of the error signals that are generated. The mere repeated occurrence of this process might also give the brain a continuous assurance of its identity (NS, 03-Sep-2016, p33) and feed into other questions that were previously considered metaphysical (NS, 03-Sep-2016, p29). Consciousness might be just a shortcut that has evolved for handling data compression (NS, 25-Nov-2017, p44).
The model of thinking, including deductive logic, that we have maintained since the ancient Greeks might need to be superseded (NS, 27-Feb-2016, p34). Scientific method might be too stringent, and might need to entertain theories that will always be beyond experimental testing (NS, 27-Feb-2016, p38). Applying Bayesian statistics to Popper's criterion leads to the idea that scientists actually spend their time building up the weight of confirming evidence, rather than looking for a single example of contradictory evidence (NS, 10-May-2008, p44).
Boltzmann brains (NS, 18-Aug-2007, p26; NS, 28-Apr-2007, p33) are self-aware entities in the form of disembodied spikes in space-time (more common in regions of high entropy than low entropy). Sean Carroll presents a counter-argument to their possibility (NS, 18-Feb-2017, p9).
Davies suggests that the laws of the universe are evolving (NS, 30-Jun-2007, p30), and hence that the answer to a question like, 'Why these laws and not any others?' is like asking, 'Why these species and not any others?' (NS, 23-Sep-2006, p30). Hartle considers this the other way round (NS, 01-May-2004, p34), that we and the other species on the planet have evolved to treat 'now' as a special concept because the universe works that way (for example, a force is a function of the present state, with no memory of past or future states). Similarly, free will is also rooted in the present (the past has no flexibility when it comes to shaping the future). However, memory is always involved: as soon as a system consists of more than one component (such as atoms in a crystal, or stars in a galaxy) any probing (such as by a hot body, for example) of the state of the whole system will obtain an almost immediate result from the nearest component, but the time-delayed result from the components further away. The speed-of-light limitation builds delay-line memory into the system, and capacitive laws into the behaviour of the overall system.
The speed of information can be faster than the speed of light in the given medium (faster than the group velocity), albeit not greater than the speed of light in a vacuum (NS, 18-Oct-2003, p42).
According to Deacon (2012), emergentist analysis is to reductionist analysis as bottom-up design is to top-down design. In each case, analysis or design, the two approaches are complementary, and there is a place for both, since there are things that one does that the other does not.
Work is just a form of energy in the equations of thermodynamics, at one level, but takes centre stage in those equations at the emergent level above (where the phrase "takes centre stage" is some sort of reference to information and relevance). Information is required to turn energy into work (NS, 23-Jun-2012, p8). It is constraints that are required to turn energy into work, where a piece of apparatus, such as a cylinder and piston arrangement, can be viewed as a catalyst (NS, 24-May-2014, p30), and constraints are measured by Shannon entropy (NS, 12-Aug-2017, p40). Ellis similarly argues for top-down causation (NS, 17-Aug-2013, p28), and Malafouris argues the case of the human cognitive prosthesis (NS, 07-Sep-2013, p28).
There is a chaotic balance between repulsion and attraction, frustration and funnelling (NS, 09-Jun-2001, p32), and constraint and energy gradient. Deacon has a name (morphodynamic) for systems that build up structure to flow away the available heat difference as fast as possible, like convection currents, Bénard cells, and braid plains in the sand (NS, 02-Sep-2000, p97); and another (teleodynamic) for the next level of complexity up from that, of systems that try to conserve the reservoir of energy difference for their own perpetuation (like living cells).
Consciousness and the others (each one an emergent behaviour) can be treated as a region of phase change (NS, 12-Apr-2014, p29; NS, 26-Nov-2011, p34), which appears related to how the holes in network topology make us smart (NS, 25-Mar-2017, p28) and the use of Algebraic Topology to investigate the 7-dimensional (or higher) sandcastles that build up and collapse down (NS, 30-Sep-2017, p28) in self-organised critical neural systems (NS, 27-Jun-2009, p34). Emergent behaviour is partly the use of abstraction (for example when passing from physics to chemistry to biology to psychology to sociology) to allow us to handle the dimension of increasing complexity (NS, 14-Feb-2009, p36), but it genuinely describes systems with different laws, so that the reverse process, of reductionism, is impossible (NS, 10-May-2008, p52).
In the example of the piston moving in a cylinder of an internal combustion engine, all the text books take it for granted that the piston only moves in one direction (positively or negatively), and then proceed to do their calculations on P.V=n.R.T, and W=½.m.v^2 et al., even to the point of simplifying the algebra down to dealing with scalars. Whereas the significant thing about a piston moving in a cylinder is that the exploding gas molecules really do want to diverge in all directions in three-dimensional space, and the really inventive thing about the design is that most of them are prevented. This leads on to the fuzzy borderline between droplets in a volume of gas, or bubbles in a volume of liquid; or a half-land, half-ocean planet that can be considered as having either an island or a lake; or whether the pattern of the prime numbers is foreground or background; and also to network theory, where, in richly connected networks (including the internet, 6-degrees of separation in human society, and neural nets in the human brain), the gaps in the network are just as significant as the connections. This, in turn, leads to the chaos-theory borderline between stasis and randomness, created by strategically placed gaps as well as strategically placed material, along which interesting behaviour emerges: the knife-edge that is trodden by Mandelbrot sets, Conway's Life, consciousness, and biological life.
Examples of hidden symmetry (NS, 03-May-2014, p36) include a pencil that had been balancing on its point, that could have fallen in any direction, but in fact fell in this particular direction; or a pack of cards that could have ended up shuffled in any order, but whose symmetry is even broken by onlookers who single out a particular sequence as being remarkable (the same role that is played by survival-of-the-fittest when the pack of cards is a randomly evolving genome). A message only contains information if an alternative (counterfactual) message had been possible (NS, 24-May-2014, p30); and the process of wave-function collapse (wave-function collapse of a particle that was in a superposition of states) is another example.
This is borne out by the different types of engineering being based on their respective forms of energy (within the context of the first law of thermodynamics) that can be harnessed for energy transfer or for information transfer. According to Davies, via the Hawking-Bekenstein formula for the entropy of a black hole, the information content of the universe, in nats, is given by IU = G.(MU)^2/(ℏ.c), which evaluates to about 2x10^121 nats for a mass of about 10^53 kg (NS, 16-Dec-2000, p26; NS, 19-Oct-1996, p30). Further, since this would be expected to have increased to its present value, starting at unity at one planck time after the big bang, this would suggest IU(t)∝(t/tpl)^2; reassuringly, the universe is now 10^60 planck times old, and (10^60)^2 is in reasonable agreement with the Hawking-Bekenstein value.
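That arithmetic can be checked directly (a Python sketch using rounded values of the constants):

```python
G    = 6.674e-11   # gravitational constant
hbar = 1.055e-34   # reduced Planck constant
c    = 2.998e8     # speed of light

def info_content(M_U):
    """Hawking-Bekenstein information content, in nats: G.(M_U)^2 / (hbar.c)."""
    return G * M_U**2 / (hbar * c)

I_U = info_content(1e53)      # for the quoted mass of the observable universe
print(f"{I_U:.2g} nats")      # ~2e121 nats, matching the figure in the text
```

With the universe's age of ~10^60 planck times, (10^60)^2 = 10^120 sits within a couple of orders of magnitude of this value, which is the consistency claim being made above.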
Perhaps we need an extra law of thermodynamics (NS, 29-Oct-2005, p51). The new law would need to indicate the rate at which structure is built up (the rate at which sand grains are moved around by river flows, and the rate at which complexity builds up in DNA-based genomes, perhaps as considered by Lloyd 1990) in the presence of a given amount of energy flow under the second law of thermodynamics (like a sort of Strouhal number for energy flow).
The 4th law is really concerned with the values of the three constants of proportionality, in Fourier's, Newton's and Stefan's laws of cooling. From experience, such as from microprocessor chips on PC motherboards, we know that the constant for radiation, despite its very aggressive dependence on the difference of the fourth powers of the temperatures, is extremely small, and dwarfed by any conduction routes that are available to the heat, and that the constant for convection is bigger still. For the first two, the constant of proportionality, C, is a function of the ambient temperature, C(T). The indication is that the constant of proportionality is also punctuatedly, but monotonically, dependent on time, C(T,t), where C(T,t2)≥C(T,t1) whenever t2≥t1.
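The relative sizes can be illustrated with rough numbers. In this sketch, the plate area, the temperatures, and the film coefficients h (10 W/m².K for still air, 100 W/m².K for forced air) are my own illustrative assumptions, not figures from the articles:

```python
sigma = 5.67e-8   # Stefan-Boltzmann constant, W/m^2.K^4

def radiative_watts(area, T_hot, T_amb, emissivity=1.0):
    """Stefan's law: depends on the difference of the fourth powers."""
    return emissivity * sigma * area * (T_hot**4 - T_amb**4)

def convective_watts(area, T_hot, T_amb, h):
    """Newton's law of cooling, with an assumed film coefficient h (W/m^2.K)."""
    return h * area * (T_hot - T_amb)

A, T_hot, T_amb = 1e-3, 350.0, 300.0   # a 10 cm^2 plate, 50 K above ambient

print(radiative_watts(A, T_hot, T_amb))           # ~0.4 W, even at emissivity 1
print(convective_watts(A, T_hot, T_amb, h=10.0))  # ~0.5 W in still air
print(convective_watts(A, T_hot, T_amb, h=100.0)) # ~5 W in forced air
```

Even with the T^4 dependence and a perfect black surface, radiation loses out to a modest air flow at these temperatures, matching the motherboard experience described above.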
The simplest single-celled microbes, their pre-cellular precursors, and perhaps even non-biological matter, would find themselves being structured to make use of this source of energy (NS, 18-Mar-2017, p11). The universe is far from its equilibrium state, thereby allowing structures to form that aid the flow of energy towards the universe's equilibrium state. In its rush to use up the available energy as quickly as possible, temporary, local structures form (NS, 21-Jan-2012, p35). Examples of these spontaneously forming structures include convection currents, planetary weather systems (NS, 06-Oct-2001, p38), auto-catalytic BZ reactions (NS, 21-Jan-2012, p32), protein-based life (NS, 09-Jun-2001, p32), and convergent evolution (NS, 21-Jan-2012, p35). Viewed in this way, humans are just the latest inadvertently evolved structures that help the universe use up its surplus energy supplies (NS, 05-Oct-2002, p30), which is a role that human beings seem to be taking on with great enthusiasm.
Zurek describes how S=H+K, where H is the Shannon entropy of the given sequence (statistical uncertainty), and K is the algorithmic complexity (algorithmic randomness), with a gradual decrease in the former balanced by a corresponding increase in the latter (but only made possible in a far-from-equilibrium universe, such as ours). Kondepudi proposes that natural selection, and the processes of evolution, act on K, to keep it as low as possible, but that it is only H that allows machines to do work, and hence that for Maxwell's Demon, E=H.k.T.ln(2). Bennett showed that it is the act of erasing a memory bit that incurs the cost. Blank memory is like a cold sink at absolute zero, with the lowest Shannon entropy, H, even though writing useful memory subsequently increases the algorithmic information content, and reduces the algorithmic entropy, K.
Sagawa and Ueda took this further, and proposed (borne out by experimental results) that an extra term needs to be added for mutual information, to account for the way that the act of measurement leads to a correlation between the system and the apparatus or its memory (NS, 14-May-2016, p28).
Tsallis proposes a formula for computing the entropy of an out-of-equilibrium system that happens to give the correct power-law answers (NS, 27-Aug-2005, p34). In this, probabilities are expressed as p^q, which generates Boltzmann statistics for systems that are close to equilibrium, with q close to 1, but also seems to work, at higher values of q, for systems that have external energy sources and that are far from equilibrium.
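A minimal sketch of the Tsallis formula, S_q = (1 – Σp_i^q)/(q – 1) in nats (the example distribution is an arbitrary choice of mine):

```python
import math

def tsallis_entropy(ps, q):
    """S_q = (1 - sum(p_i^q)) / (q - 1); recovers Boltzmann-Gibbs as q -> 1."""
    if abs(q - 1.0) < 1e-12:
        # The q -> 1 limit is the ordinary Shannon/Boltzmann entropy (nats).
        return -sum(p * math.log(p) for p in ps if p > 0)
    return (1.0 - sum(p**q for p in ps)) / (q - 1.0)

ps = [0.5, 0.25, 0.25]
print(tsallis_entropy(ps, 1.0))     # ordinary entropy, ~1.04 nats
print(tsallis_entropy(ps, 1.0001))  # essentially the same: continuous at q = 1
print(tsallis_entropy(ps, 2.0))     # a different, non-extensive measure
```

The first two lines illustrate the claim in the text: for q close to 1, the formula collapses back to ordinary Boltzmann statistics, with the interesting departures appearing only at larger q.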
The proposed extra law, for the open, out-of-equilibrium system, along with its emergent behaviour, might take the form of a 'principle of increasing complexity' that explains how quickly life evolved, and consciousness too, over and above Maynard Smith's observation (NS, 05-Feb-1994, p37) as to why organisms tend to become more complex, all as agents to use up the available energy faster (NS, 05-Oct-2002, p30). In any haphazard system, unstable structures will quickly die away (by definition), leaving the more stable ones to persist (also by definition), established like the nodes in a standing-wave, giving the impression of a 'non-random eliminator' (NS, 07-Jan-2006, p36). One strong candidate would be the Principle of Maximum Entropy Production (NS, 06-Oct-2001, p38), not too unlike the electronic engineering concept of 'matching', that allows maximum power transfer (energy flow) from an input to an output.
Human activities can be viewed as convection currents in the biosphere, spontaneously setting themselves up to release energy that had been blocked or locked away in fossil reserves, to smooth the flow of ever greater amounts of energy from the hot end (the sun in our case, beating down on our planet's surface) to the cold end (ultimately the CMB). Similarly, the biosphere is a convection current in the inorganic atmosphere, which, in turn, has its meteorology. It appears that all these various levels of convection currents take on a fractal structure. It is also possible to identify intermediate, finer-grain levels: within human activity, there are the moments that the Agricultural, Industrial and Information Revolutions were sparked; and within the biosphere, there are successive nested levels caused by the evolution of amphibians from fish, reptiles from amphibians, and, specifically, the emergence of eukaryotes in the first place. Compare this with body-part complexity (NS, 05-Feb-1994, p37) and with Deacon's morphodynamics (for energy channelling) versus teleodynamics (for energy sequestering).
Crutchfield and Young propose a way to measure complexity. Chaisson suggests using energy flow density, measured in ergs per gram per second (NS, 21-Jan-2012, p35).
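Chaisson's metric is simple enough to sketch; the luminosities and masses below are round, textbook-style figures of my own choosing, not values from the cited article:

```python
def energy_rate_density(power_watts, mass_kg):
    """Chaisson's complexity metric, converted to erg per gram per second."""
    return power_watts * 1e7 / (mass_kg * 1e3)  # 1 W = 1e7 erg/s; 1 kg = 1e3 g

# Illustrative round numbers:
print(energy_rate_density(3.8e26, 2.0e30))  # the Sun: ~2 erg/g/s
print(energy_rate_density(100.0, 70.0))     # a human body: ~1.4e4 erg/g/s
print(energy_rate_density(20.0, 1.4))       # a human brain: ~1.4e5 erg/g/s
```

On this measure a brain out-performs a star by around five orders of magnitude per gram, which is the counter-intuitive ordering that makes the metric interesting as a complexity scale.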
Many argue that life in general, and RNA life in particular, are extremely easy to get started (NS, 20-Aug-2016). However, there is a suggestion that even revised versions of Drake's equation (NS, 25-May-2013, p6), complete with revised definitions of the habitable zone (NS, 08-Jun-2013, p40), are not yet allowing for the extreme unlikelihood of the re-emergence of multicellular life (NS, 23-Jun-2012, p32) precisely on the grounds of the laws of thermodynamics, and the need for a eukaryote-like serendipitous discovery of how to farm mitochondrial energy packs (NS, 23-Jun-2012, p32), either by phagocytosis or by extrusion from the inside (NS, 14-Feb-2015, p28). Others, though, suggest that the step from single cell to multicell organisms was not such an unlikely event after all (NS, 27-Jun-2015, p11).
For over a century, the academic world has concentrated on the role of the specialist: experts who know more and more about a narrower and narrower field. This is inevitable; the capacity of each human mind is finite, so the depth of the knowledge can only be increased if the width is restricted. However, the world still needs generalists, whose knowledge is not so deep, but integrated over a far wider span of subjects (NS, 30-Mar-2002, p54). Such people have, at their disposal, a very powerful method of problem-solving, and are able to find solutions to problems by analogy and cross-pollination across otherwise unrelated subjects. In addition, they are, indeed, also the integrators, bringing the work of the specialists together to the benefit of society.
Although not a formal academic source, the New Scientist magazine is a particularly appropriate one to use as a resource for this. As a weekly magazine in popular science, it provides an overview across a wide spectrum of the newest developments in science and engineering (and, in any case, can, of course, be followed up later in the more formal academic references). The aim is to find a "whole that is greater than the sum of the parts", but keeping aware that this will sometimes be confounded by Gödel-like inconsistencies.
We should not be surprised to see stable patterns persisting, since that is a tautology: by definition, stable patterns are the ones that do persist, while the millions of unstable ones disappear quite quickly. This is connected with Dawkins, and the selfish gene, but the point is much more general, and not restricted to living organisms.
Consider forced resonance: for example, a washing line hung with a series of pendulums, each of a different length. When you sway the line at a given frequency, the one pendulum that happens to be tuned to that frequency sways energetically in time; but many of the others do, too, in a complex way, extracting energy from the incoming swaying at its given frequency, superimposed on their attempts to swing at their own natural frequencies. This sounds analogous to open systems (water falling down a channel under gravity, heat being applied to the bottom of a pan of water, single-celled organisms having nutrients flowing in at one end, and waste products out at the other), forced to live to the beat of the energy flow, despite really wanting to obey the second law of thermodynamics, and to die down in peace.
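The washing-line picture can be made quantitative with the standard linearised driven-oscillator result; in this sketch the damping ratio and the drive amplitude are arbitrary illustrative choices:

```python
import math

g = 9.81  # m/s^2

def steady_state_amplitude(drive_omega, length, damping=0.05, force=1.0):
    """Steady-state amplitude of a lightly damped, driven pendulum (linearised)."""
    omega0 = math.sqrt(g / length)  # natural frequency of this pendulum
    return force / math.sqrt((omega0**2 - drive_omega**2)**2
                             + (2 * damping * omega0 * drive_omega)**2)

drive = math.sqrt(g / 1.0)  # drive tuned to the 1 m pendulum
for L in (0.25, 0.5, 1.0, 2.0, 4.0):
    print(f"L={L} m: amplitude {steady_state_amplitude(drive, L):.3f}")
```

The tuned pendulum dominates, but every pendulum settles into a non-zero steady-state response at the drive frequency: each is forced to "live to the beat" of the external energy flow, however mismatched its own natural frequency.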
Given any random or chaotic environment, patterns are bound to form from time to time. The stable patterns will always out-live, and in that sense dominate, the unstable ones, by definition. There is a particular special case of this where those stable patterns have the property of inheritance (and hence of evolution). Evolution follows whenever there is also a mechanism for inheritance, in spawned copies of a given pattern, or persistence, in the chronologically later manifestation of the given pattern.
This leads to the idea of self-writing narratives (fiction or non-fiction): the idea of starting to write a story, or a non-fiction description, and then realising that there are anomalies and inconsistencies in the narrative (on an earlier page, character-A said this, so why, on this page, would character-B believe this, or behave like that?). Gradually, the story starts to write itself. I see the annotated draft pages of past revisions of a document as being like a static snapshot of a brain in the process of thinking something through. That is the same 'machine' that this web page is attempting to bring to bear on these New Scientist articles (plus paraphrases of quotes found elsewhere, notably on the Internet).
There is a fine line to be trodden between the controller giving the direction for the story, and stifling the message that is starting to emerge naturally.
It is surprising how much of physics can be inferred from simple thought experiments, based, of course, on the corresponding experimental observations.
Galileo noted that if you think, as most people did, and do, that a heavy body, m1, will fall faster, at v1, than a light body, m2, falling at v2, there is a logical inconsistency if you next tie the two of them together with a short length of cord or chain, and drop the two of them together again. By the above logic, m1 should be trying to fall faster than m2, and be pulled back by it, while m2 will be trying to fall slower than m1, and be pulled downward by it. So, the two will fall together at some sort of average velocity, such as (v1+v2)/2 or √(v1*v2). But, if they are falling together with a taut cord or chain, they are acting as a single body with mass m1+m2, and by the above logic, that should mean that the two of them tied together fall even faster than v1 or v2 alone. Similarly, the same logical inconsistency arises if you assume, somewhat bizarrely, that the heavier object falls slower than the lighter one. The only consistent conclusion is that v1=v2=(v1+v2)/2=√(v1*v2)=v1,2. All bodies, dropped simultaneously, irrespective of their mass, accelerate under gravity by the same amount, and hence end up travelling at the same velocity as each other at each moment in time.
Next, assume that the KE of an object is a function of its mass and its velocity: KE=F(m,v). Suppose that the object is a ball of putty that is heading for a wall. When it hits the wall, the first law of thermodynamics suggests that the KE is turned to heat (the wall gets a bit warmer). If you repeat the experiment, the same amount of energy will be delivered to the wall, again. Therefore, if you take two balls of putty, and throw them towards the wall, simultaneously, and at the same velocity, they will deliver twice as much energy as when you threw just one ball of putty at the wall. Similarly, twice as much energy is delivered if the two balls of putty are lightly stuck to each other, and travel as a composite ball with mass 2m. So, KE appears to be linear in m, and we can write KE=m*f(v).
Now, we change the experiment, as sketched out in Coopersmith (2017), and have two balls of putty, each of mass m and velocity v, heading for each other. (We can worry, later, about one of them having velocity v, and the other -v.) When they collide, they come to a halt, and both heat up from their combined KE: 2*m*f(v). Suppose, though, that instead you had been in a drone above one of the balls of putty, travelling at the same velocity as it. To you, that ball of putty would be stationary, and the other one would be approaching it at 2v. At the collision, the two of them suddenly end up travelling at velocity v away from your still-speeding drone. So, the heating of the two balls is m*f(2v) less the kinetic energy that the new conglomeration still has afterwards, 2*m*f(v). Given that the experiment is the same in both cases, just observed from a different frame of reference (from the roadside, or from the drone), the amount of heating must be the same: 2*m*f(v)=m*f(2*v)-2*m*f(v). So, f(2*v)=4*f(v). Few functions have this property, and f(v)=k*v² is the most well-behaved of them all, and the one chosen by nature.
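The frame-of-reference argument can be checked in a few lines. This is a minimal sketch, assuming f(v)=½v² (so KE=½mv²), with an illustrative mass and speed:

```python
def ke(m, v):
    # Kinetic energy with f(v) = 0.5 * v**2, the form singled out above.
    return m * 0.5 * v**2

m, v = 1.0, 3.0  # illustrative mass (kg) and speed (m/s)

# Roadside frame: two balls at +v and -v collide and stop dead,
# so all of the incoming KE becomes heat.
heating_roadside = ke(m, v) + ke(m, v)

# Drone frame (moving at v): one ball is stationary, the other approaches
# at 2v; after the collision the combined 2m lump recedes at v, and the
# heating is the incoming KE less what the lump still carries.
heating_drone = ke(m, 2 * v) - ke(2 * m, v)

print(heating_roadside, heating_drone)  # 9.0 9.0 -- the two frames agree
```

Any other power of v would make the two observers disagree about how much the putty warmed up, which is why f(2v)=4f(v) pins the form down.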
The statement, earlier, that "all bodies, dropped simultaneously, irrespective of their mass, accelerate under gravity by the same amount" leads to v=g.t, and more generally to v=u+a.t. Combining this with the distance fallen, s=½g.t², the work done by gravity on a body falling from rest is m.g.s=½m(g.t)²=½m.v², and matching this against KE=k.m.v² shows that k has the value ½.
We are fortunate in the way that inductive logic works in our universe: the chemical experiment or the clinical trial that we do today, in the controlled environment of the laboratory, are valid indications of how they will work tomorrow in the shopping mall. Not only are these symmetries already very useful in their own right, but Noether showed that they lead to the conservation laws (NS, 25-Apr-2015, p33; NS, 27-Jul-2013, p50): the Copernican principle leads to the law of conservation of momentum; the temporal equivalent leads to the law of conservation of energy (the first law of thermodynamics). Taken together, the simultaneous conservation of energy and of momentum lead to the second law of thermodynamics (two balls into a Newton's cradle leads to two balls out, rather than one ball twice as high, or three balls two-thirds as high, for example).
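The Newton's cradle point can be made concrete. A sketch, assuming unit masses and trying each momentum-conserving outcome of n balls leaving together, to see which also conserves energy:

```python
import math

v = 1.0                    # incoming speed of each of the two balls
p_in = 2 * v               # total momentum in (unit masses assumed)
ke_in = 2 * 0.5 * v**2     # total kinetic energy in

allowed = []
for n in range(1, 6):      # candidate: n balls leave together
    u = p_in / n           # the speed that conserves momentum
    ke_out = n * 0.5 * u**2
    if math.isclose(ke_out, ke_in):
        allowed.append(n)
    print(f"{n} ball(s) out at {u:.3f}: KE out = {ke_out:.3f} vs KE in = {ke_in:.3f}")

print(allowed)  # [2] -- only two-in/two-out conserves both quantities
```

One fast ball out would conserve momentum but create energy; three slow balls out would conserve momentum but destroy it; only the outcome actually observed satisfies both laws at once.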
Waves can take all paths, Feynman-like, from one place (in phase space) to another. Each one, though, tends to interfere destructively with those taking slightly longer or shorter routes. Only the path of least action, with no shorter routes available, ends up not being completely cancelled out. With Newton's cradle, it all gets sorted out in the split second when the shock wave of the incoming balls causes all five balls to bounce around off each other like the balls in a Landauer reversible computer. A Newton's cradle made of 56g of iron in the five balls would have 56*3*6x10²³ quarks and 26*6x10²³ electrons (1x10²⁶ quarks and 1.6x10²⁵ electrons). It follows that the particles in a Landauer computer are without a sense of time, in their reversible collisions; it is the levels of abstracted behaviour above that which experience time, thinking and consciousness, as emergent behaviour. (The contribution of entanglement and leading-diagonal decoherence might affect the conclusion for the Landauer reversible computer, though.)
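The particle count for the 56g of iron follows from a few lines of arithmetic. A sketch, with Avogadro's number rounded to 6x10²³ as in the figures quoted:

```python
avogadro = 6e23       # atoms per mole (rounded, as in the text)
moles = 56.0 / 56.0   # 56 g of iron-56 is one mole
atoms = moles * avogadro

nucleons_per_atom = 56                    # protons + neutrons in iron-56
quarks = atoms * nucleons_per_atom * 3    # three quarks per nucleon
electrons = atoms * 26                    # 26 electrons per atom (iron's atomic number)

print(f"{quarks:.1e} quarks, {electrons:.1e} electrons")
# about 1.0e26 quarks and 1.6e25 electrons, matching the figures quoted
```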
This then implies Heisenberg's uncertainty principle, as noted earlier, by expressing it in terms of information theory (NS, 23-Jun-2012, p8). Indeed, the thought experiment of throwing a stone into a pond already hints at this connection. The kinetic energy of the stone arriving at one point on the pond's surface means that that energy is initially concentrated, and will tend to diverge from there either by the statistical mechanics of the second law of thermodynamics, or by the uncertainty in duration for the concentration of energy at a restricted point. These centre on an E=kT or an E=hf relationship, respectively. After that, once launched divergently, the energy continues to flow in a state of motion in a radial straight line, unless acted on by another force (Newton's first law of motion, but as modified by Einstein and by Schrodinger's equation).
Heisenberg's uncertainty principle then implies vacuum energy (the uncertainty of the energy, and hence the energy itself, can never reach zero) and also the third law of thermodynamics (the uncertainty of the temperature, and hence the temperature itself, can never reach zero (NS, 18-Mar-2017, p10)). The existence of the vacuum energy might then have implied dark energy, had it not been a massive 120 orders of magnitude out in its experimental predictions (NS, 01-Nov-2003, p34), which, it has been suggested, might be due to a missing leakage term (NS, 27-May-2017, p28).
Special relativity follows from the Principle of Relativity (constant motion of the system cannot be determined by observations made completely inside the system), and the Relativity of Simultaneity (simultaneous events in one context are not simultaneous in another). General relativity follows from generalising the Principle of Relativity still further (acceleration cannot be distinguished from a gravitational field).
Peres showed that the axioms of quantum mechanics necessarily lead to the second law of thermodynamics, and to Schrodinger's equation being linear.
In line with, but slightly contrary to, the article in the 13-Oct-2012 issue of New Scientist magazine, Noether's Theorem (or perhaps the principle of least action) might, instead, constitute the long-sought theory of everything.