Towards a Theory of Everything
The front cover of the 13-Oct-2012 issue of
the New Scientist magazine
proclaims, "Surprise Theory of Everything: it's been here all along."
The thesis of the main article (p32) is
that the long-sought Theory of Everything might already have been found,
nearly two centuries ago,
in the form of the second law of thermodynamics.
It was shown that Heisenberg's uncertainty principle
can be expressed in terms of information theory (NS, 23-Jun-2012, p8)
with the momentum of the particle conveyed in one message stream,
and its position in another;
and that being able to decode both message streams would yield so much information
that it would be tantamount to violating the second law of thermodynamics.
This essay originally set out to act as an annotated index
into related articles in past issues of the New Scientist magazine.
If we were to consider the prospect of a time when the second law of thermodynamics did not hold,
we might allow for Maxwell's Demon-like behaviour to have been possible,
and for new energy to have been created out of nothing,
at an exponentially ever-increasing rate.
However, this is exactly what appears to have happened in the first fraction of a second after the Big Bang.
So, it is tempting to investigate whether there could be a connection between the two ideas.
If there is, any such Maxwell's Demon-like behaviour must have occurred during a timeframe where the currently known laws of physics break down;
there is, then, no prospect of it being manifested in present-day machines or devices.
On the plus side,
this is precisely the timeframe where we are in need of a theory
that unifies General Relativity and Quantum Mechanics (NS, 24-Sep-2016, p28).
On the negative side, it presents a problem for the rest of this essay:
if the presently known laws of physics cannot be used, what can?
We do, at least,
have one boundary condition as a guide:
we know that whatever unknown laws of physics were applicable,
they must eventually collapse down to the known laws of physics in our epoch of the universe.
Reassuringly, this further implies that the exotic regime was not perpetual at all,
since the Maxwell's Demon-like behaviour would be using up some finite resource
in the creation of the extra energy,
which would serve to confirm the assertion at the start of this essay,
provided that the first and second laws of thermodynamics are adjusted
to include the finite resources of the initial exotic regime.
The first split second
Perhaps, not coincidentally,
other events are also believed to have occurred during the first split second after the Big Bang.
Several key events seem to have taken place within the first
5.39×10⁻⁴⁴ seconds of the universe:
- Appearance of the mass of the universe
- Inflation of space-time
- Quantum heat death of some of the particles
- Separation of gravity from the other three fundamental forces.
Energy and matter creation
At one level,
this comes down to the age-old question of why there is something rather than nothing.
To fit in with what we observe,
all the freshly generated matter/energy would have to have been generated in a highly ordered,
low entropy state,
with the balance between matter and energy at around 1:10⁹,
and hence with marginally more matter than antimatter (NS, 23-May-2015, p28; NS, 12-Apr-2008, p26).
Indeed, there are several very fine balances
present in current theories:
- Matter and antimatter are created together, in equal (and self-cancelling) quantities.
However, our universe displays a bias of 1,000,000,001:1,000,000,000
in favour of matter, so that it is not all quite cancelled out.
- The way that the amount of normal matter seems to be dwarfed by the amount of dark matter,
in a ratio of 1:4,
a ratio needed to explain the orbital motions of galaxies.
- The way that the cosmological constant (perhaps manifesting itself as dark energy)
has almost cancelled itself out,
but not quite,
being lower than that predicted by quantum mechanics by a factor of 10¹²⁰.
- The way that particle masses, and in particular that of the Higgs boson,
have almost cancelled themselves out,
by a ratio of 125 to 10¹⁹ (the observed Higgs mass in GeV against the Planck scale),
in the so-called hierarchy problem.
- The way that the universe is in a state of unstable equilibrium
between false vacuum and true vacuum (NS, 29-Oct-2016, p32).
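The scale of the first of those fine balances can be checked with a few lines of arithmetic: using the 1,000,000,001:1,000,000,000 ratio quoted above, each antimatter particle annihilates with one matter particle, leaving only the one-in-a-billion surplus as the matter we see today. (A minimal sketch; the tallies are the essay's illustrative ratio, not measured counts.)

```python
from fractions import Fraction

# Hypothetical tallies taken from the 1,000,000,001 : 1,000,000,000 ratio above
matter, antimatter = 1_000_000_001, 1_000_000_000

# Each antimatter particle annihilates with one matter particle,
# leaving only the tiny surplus as the matter of our universe.
survivors = matter - antimatter
surviving_fraction = Fraction(survivors, matter + antimatter)

print(survivors)                  # 1
print(float(surviving_fraction))  # ~5e-10 of the original particles survive
```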
Some of these might possibly be explained by axions (NS, 14-Nov-2015, p36),
whose hypothesised existence in the present epoch of the universe
might be tested by searching for a particular type of
Bose-Einstein Condensate star (NS, 12-Dec-2015, p11).
Other, alternative, explanations have been proposed, though:
for the marginal imbalance of matter and antimatter,
there are many experiments attempting to observe
instances of neutrinoless double-beta decay (NS, 13-Feb-2016, p30),
and the possibility of CP-violation having been observed in baryon reactions (NS, 11-Feb-2017, p14).
The period of Inflation (NS, 03-Mar-2007, p33)
is believed to have started at 10⁻³⁵ s, when the universe had a size of 10⁻²⁷ m,
and ended at 10⁻³² s, when it had a size of 10³ m.
(According to this,
vacuum energy can be considered as a sort of latent heat of
phase change of some scalar field in a bubble universe.)
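Taking those quoted figures at face value, the implied expansion factor, and the number of e-foldings it corresponds to, are a two-line calculation (the times and sizes are the article's; the arithmetic is just a sketch):

```python
import math

# Figures quoted above for the period of inflation:
t_start, t_end = 1e-35, 1e-32        # seconds
size_start, size_end = 1e-27, 1e3    # metres

growth = size_end / size_start       # overall expansion factor
e_folds = math.log(growth)           # number of e-foldings, N = ln(growth)

print(f"expansion factor: {growth:.0e}")  # 1e+30
print(f"e-folds: {e_folds:.1f}")          # ~69.1
```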
Energy creation arising from expansion
Inflation was originally proposed as a solution to why the universe is now so uniform and flat,
but it, too, appears to be tied up with the initial creation of matter/energy.
With inflation, at the time of the Big Bang, and now with expansion,
there is a sort of out-going wind of space-time coordinates,
as dark energy causes extra ones to be inserted.
Atoms, molecules, solar systems, galaxies and local groups are able to resist this wind,
since much stronger forces (electromagnetic, strong nuclear, weak nuclear and gravity)
are also present.
If a new voxel of space-time happened to pop up in our galaxy
(between the orbit of Mars and the Sun, for example, or between an electron and a proton in a hydrogen atom),
it would represent an injection of extra energy,
which would be manifested by increased probabilities of a photon being emitted.
All the planets we can see are in a stable configuration,
and with very stable orbits, and have been for billions of years.
Similarly, electrons around atoms are in their stable orbits.
If any space were to be inserted into any of these,
their first reaction would be to collapse back down to their stable configuration.
Thus, the expansion of space would be experienced as a minuscule increase in the amount of energy
(in reality, so minuscule that it would amount only to an undetectably
slight increase in the probability of the emission of an extra photon).
Inflation and expansion, therefore, manifest themselves
as the injection of new energy into the system,
and with the spread of distance over which it happens,
it appears as a force.
Of the four forces, gravity is the longest ranging, but still becomes very weak at a distance:
so unrelated clusters of galaxies are pushed apart by the expansion.
This suggests that there is a point at which G·m₁·m₂/r²
balances with the force of the coordinate wind.
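As an illustrative sketch of that balance point (not a calculation from the cited articles), we can equate the Newtonian attraction G·M/r² with the outward acceleration (Λc²/3)·r that a cosmological constant produces in a de Sitter background; the group mass used below is an assumed round figure:

```python
# Solve G*M/r**2 = (Lambda*c**2/3)*r  ->  r = (3*G*M / (Lambda*c**2)) ** (1/3)
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8     # speed of light, m/s
Lam = 1.1e-52   # cosmological constant, m^-2 (approximate observed value)
M = 2e42        # kg, an assumed round figure for a galaxy group's mass

r_balance = (3 * G * M / (Lam * c * c)) ** (1.0 / 3.0)
print(f"balance radius: {r_balance:.2e} m")  # ~3e22 m, roughly a megaparsec
```

Beyond this radius the coordinate wind wins, which is consistent with the observation above that only unrelated clusters of galaxies are pushed apart.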
Expansion arising from energy creation
We might hope that there is a corollary
that says that new energy injection (from a Maxwell's Demon, for example)
could be what led to the inflation (NS, 15-Apr-2017, p8).
Noether's theorem indicates that a lack of symmetry in time
implies we cannot assume a conservation of energy.
This is consistent with expansion (or inflation) leading to energy creation.
A related candidate mechanism is the symmetry-breaking event that turned on the Higgs field.
It is suggested that the mechanism could involve
the pushing up of the central peak in the middle of the upturned bowler-hat shaped curve,
thereby leading to the Mexican hat shaped curve in the Higgs field (NS, 14-Nov-2015, p36).
The Mexican Hat curve could have outlying alps,
that cause the Higgs scalar field to cause inflation (NS, 10-Jun-2017, p30).
Speed of light
One possibility is that the speed of light is not a constant,
but has only settled asymptotically at the value that we observe today.
At the intense energy densities of the first split second after the Big Bang,
one proposal is that the speed of light could have been much higher (NS, 26-Nov-2016, p8),
with a specific prediction about a measure called the spectral index for the CMB,
which should be 0.96478.
The split between a quantum and a classical macroscopic universe could be thought of
as a 'quantum heat death' of the universe (NS, 29-Mar-2014, p32),
which occurred a split second after the big bang.
The inset 'Unlimited chat' (NS, 29-Mar-2014, p35) suggests that,
prior to the quantum heat death,
because of the lack of the speed-of-light limit and of Heisenberg's uncertainty principle,
Maxwell's Demon behaviour would have been possible,
able to extract ever more energy from the background energy,
without reducing the energy in the background,
thereby filling the universe with bountiful quantities of extra energy.
The switch from exotic regime to normal regime
might happen in response to the creation of so much new material and energy
(from the proposed Maxwell's demon mechanism)
that the universe bloats out,
which is what we presently attribute to inflation or expansion (NS, 15-Apr-2017, p8),
and is pushed into a new regime of operation (no longer sub-Planck scale,
so no longer able to support faster-than-light transmission,
nor Maxwell's Demon behaviour).
Perhaps, even today, at sub-Planck scales,
faster-than-light communication between
non-quantum particles still takes place (NS, 29-Jun-2002, p30).
If only a minority of the particles generated in the Big Bang
subsequently became quantum in nature,
dark matter might be made up of the remaining particles;
being non-quantum, and still capable of superluminal communication,
these particles might be expected only weakly to interact with the quantum particles of ordinary matter.
Maxwell's demon behaviour
Other possible mechanisms can also be proposed
for switching off the second law of thermodynamics,
such as something akin to reversing
the negative sign in Faraday's Law
(or the positive sign in Ampere-Maxwell's Law)
or something equivalent for one of the other fundamental forces.
Even without such a reversal, in our normal regime of the universe,
we see examples of how Maxwell's equations can cause the release of more energy in a few milliseconds
than the sun does in a month (NS, 28-Nov-2015, p19).
The low-entropy starting point for the universe is, at first, a puzzle (NS, 15-Oct-2005, p30).
When the cosmic microwave background (CMB) radiation came into being,
the universe was extremely uniform,
which represents a low-entropy state since the now dominant force (gravity) works towards a clumping of matter.
There must have been a point at which the universe switched from the smoothing processes being dominant
to gravity being dominant (NS, 12-Nov-2005, p27).
The direction of time might equally be marked by other monotonic processes
(the using up of an unreplenished store of energy, the accumulation of reaction products, or of contaminants).
Perhaps the mathematical model of the low entropy start of space-time can be transformed,
for example using Noether's theorem,
to a symmetrical model in some other mathematical space (Derek Potter on Quora).
Gravity as a mechanism for wave-function collapse
Penrose suggests (NS, 09-Mar-2002, p26)
that it is not a coincidence firstly that it is when the effects of gravity start to become noticeable
that the Standard Model starts to break down as a workable approximation of how the universe works,
nor secondly why gravity is the one force that resists being unified with the other three.
Indeed, it could even be because of gravity
that quantum entanglement experiments are so difficult to perform on Earth (NS, 20-Jun-2015, p8).
The quantum mechanical uncertainty of the energy of a particular region of space,
with a given uncertainty of time over which it persists, would translate, relativistically,
into an uncertainty of the curvature of that region of space,
and it is perhaps this non-linearity that collapses any wave-functions that are in a state of superposition.
Moreover, when a massive particle is in a superposition between two places,
it must have implications for the curvature of space at those two locations (NS, 03-Jan-2015, p26)
and anything that is being attracted to the particle in either of those two locations;
similarly for the relativistic time-dilation experienced by a
particle in a superposition of two velocities causing the two instances
to experience the passage of time differently.
However, the AdS/CFT correspondence has been used as an argument against this,
since it shows that gravity can be taken out of the equation,
albeit not in our universe but in a hypothetical one with negative curvature.
Modified Newtonian Dynamics (MOND) was originally proposed as
an explanation for star rotation within galaxies, without needing to propose the existence of dark matter.
A mechanism that switches from a Bose-Einstein Condensate state to an ordinary dark-matter state,
depending on the strength of the gravity field,
would allow a switching between MOND and inverse square law behaviour (NS, 02-Apr-2016, p30).
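That switching between MOND and inverse-square behaviour can be sketched numerically: in the deep-MOND regime (accelerations well below the characteristic a₀) the effective acceleration becomes √(a₀·a_Newton), which makes the circular-orbit speed independent of radius. The galactic mass below is an assumed illustrative figure, not a value from the cited article.

```python
import math

G = 6.674e-11   # m^3 kg^-1 s^-2
a0 = 1.2e-10    # m/s^2, MOND's characteristic acceleration
M = 1.0e41      # kg, an assumed galactic mass (illustrative only)

def orbital_speed(r):
    """Circular-orbit speed, switching to deep-MOND when a_newton < a0."""
    a_newton = G * M / r**2
    a = a_newton if a_newton > a0 else math.sqrt(a0 * a_newton)
    return math.sqrt(a * r)

for r in (1e20, 1e21, 1e22):   # ~3 kpc to ~300 kpc
    print(f"r = {r:.0e} m -> v = {orbital_speed(r)/1e3:.0f} km/s")
```

The inner radius is Newtonian and the speed falls with r, but the two outer radii are in the deep-MOND regime and give the same speed, i.e. a flat rotation curve.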
It is further suggested that dark energy, too, and also gravity itself,
could be emergent behaviour of entanglement (NS, 18-Mar-2017, p28).
It is suggested (NS, 13-Oct-2012, p32) that gravity might simply be a consequence of the second law of thermodynamics.
Gravity might just be a consequence of interactions between entangled bits of quantum information.
Newtonian and Einsteinian gravity have already been successfully modelled this way,
but in anti-de-Sitter space,
though not yet in our universe in which the vacuum is not quiescent.
However, it might still emerge if the entanglement in the particles generated from the vacuum energy
can be modeled as a type of elasticity
(that deforms like Einsteinian gravity at short ranges, and extends like MOND at longer ranges).
Some suspected black-holes might turn out to be boson (BEC) stars (NS, 15-Jul-2017, p28).
The cover article (NS, 13-Oct-2012, p32) touches on a chain of ideas:
- the low-entropy origins of the universe;
- Maxwell's note that life is focussed on decreasing entropy;
- Bennett's explanation of Maxwell's demon, building on Landauer, who built on Shannon;
- the idea that information is physical;
- Laplace's demon predicting the future;
- Bekenstein's holographic principle of black-holes;
- the gravitational attraction of two bodies as a consequence of their entropy needing to increase;
- time as a derived parameter.
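The Landauer link in that chain is quantitative: erasing one bit of information dissipates at least k_B·T·ln(2) of heat, which is what stops Maxwell's demon getting something for nothing in our present regime. A quick calculation at room temperature:

```python
import math

# Landauer's principle: minimum heat dissipated per bit erased is k_B * T * ln(2)
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # K, room temperature

e_min = k_B * T * math.log(2)
print(f"minimum cost of erasing one bit at {T:.0f} K: {e_min:.2e} J")  # ~2.87e-21 J
```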
Attempts at a theory of everything
The two great pillars of modern physics,
General Relativity and the Standard Model, continue to be used in isolation, with unprecedented success,
as long as each is restricted to its specific domain;
but, when attempting to understand things like black-holes and the properties of the early universe,
the two theories need to be combined,
and so far, they have resisted all attempts at doing this.
"While quantum mechanical predictions comply with special relativity (at an operational level),
objective properties, such as non-locality, do not satisfy the same causal relations
(at an ontological level)."
Also, the Standard Model gives no account of gravity, not even that of general relativity.
String theory is one attempt,
albeit one that is wildly impractical to test.
M-theory is an attempt at a unification of many of
the alternative variations of string theory (NS, 19-Apr-2014, p47; NS, 28-Sep-2013, p34).
There is a relationship, too,
between octonion and quaternion numbers and M-theory (NS, 09-Nov-2002, p30).
Lisi suggests how the sub-atomic particles
can be mapped on to a 248-vertex E8 pattern in 8D space (NS, 17-Nov-2007, p8).
With string theory,
the extra dimensions are assumed to be stunted, or curled up,
into less than a Planck length;
with brane theory,
they are assumed to be fully-fledged dimensions,
of which our 3+1 dimensions form just a membrane (NS, 29-Sep-2001, p26).
Many worlds and the Multiverse
There are many completely different types of parallel worlds theory, multiverse theory,
and many worlds theory (NS, 21-Jan-2017, p28).
Tegmark presents a four-level classification (NS, 26-Nov-2011, p42).
Most versions would have implications for free will (NS, 27-Sep-2014, p32).
Smolin argues, though,
that they are just devices
for handling our lack of knowledge about the universe (NS, 17-Jan-2015, p24).
Wiseman proposes a "Many Interacting Worlds" model (NS, 08-Nov-2014, p6),
in which the behaviour of the quantum mechanical system is
the blurred average behaviour from several universes (of the order of 41) that
interact fairly strongly with each other.
Loop quantum gravity
Relativity assumes eternalism (all spacetime exists)
while quantum mechanics tends to assume presentism (NS, 03-Jun-2017, p44).
It is suggested that time must be relative, rather than absolute (NS, 08-Oct-2011, p37).
Indeed, there is a question as to whether time
or space, or both, are derived, or emergent properties of something more
fundamental (NS, 15-Jun-2013, p34),
and as to why there are three dimensions for space (NS, 28-Sep-2013, p34).
Rovelli and Smolin proposed Loop Quantum Gravity (LQG)
as an alternative to string theory for unifying relativity
and the Standard Model.
The subatomic particles could be caused by
vibrations in the granules of space-time at various modes,
just as in string theory,
and not unlike Wolfram's view
that the universe might run as a cellular automaton (NS, 21-Jun-2003, p32; NS, 06-Jul-2002, p46).
LQG replaces the notion of a top-down, external, all-encompassing framework of space-time coordinates
with a bottom-up, nearest-neighbour, local interface between atomic granules of space-time.
This implies that it is working something like a cellular automaton, with nearest-neighbour communications,
working on simple, local rules whose amassed behaviour (summed over huge assemblages)
would approximate to our familiar laws of physics.
A photon entering one granule on one side and exiting on another might be described by some operation,
and so the beam of light, and our Euclidean geometry,
would end up as some averaged value obtained by integrating over lots of instances of:
lastcell = photonpassage( photonpassage( photonpassage( photonpassage( photonpassage( photonpassage(
photonpassage( photonpassage( photonpassage( photonpassage( firstcell ))))))))))
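A runnable sketch of that iterated, nearest-neighbour picture: each granule applies the same local rule, and the macroscopic beam is just the composition of many such steps. The `photonpassage` rule below (advance one granule, accumulate a small phase) is a hypothetical placeholder, not anything taken from LQG itself.

```python
from functools import reduce

def photonpassage(cell):
    """Hypothetical local rule applied as a photon crosses one granule."""
    position, phase = cell
    return (position + 1, (phase + 0.1) % 1.0)

firstcell = (0, 0.0)

# Compose the local rule ten times, as in the nested expression above.
lastcell = reduce(lambda cell, _: photonpassage(cell), range(10), firstcell)
print(lastcell)  # position 10 after ten granule crossings
```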
There are indications (NS, 11-Mar-2017, p28) that LQG and string theory could be compatible,
not least at the two-dimensional boundary of a holographic projection.
It could be that the string length is somewhat bigger than the granule size,
with string theory being supported on a fine mesh of these granules,
thereby explaining why string theory sees space-time coordinates as a fixed background.
Experimental observations are proposed that
might determine whether space-time is quantised, and at what granularity (NS, 07-Mar-2015, p12).
Others are proposed, to look for astronomical evidence
of black-holes that collapse to the quantum loop size,
and then rebound as a white hole at a characteristic frequency (NS, 02-Jan-2016, p32).
In Causal Dynamical Triangulation (CDT),
the granules of space-time are connected tetrahedrally
to their nearest neighbours (NS, 14-Jun-2014, p34).
There could be three phases that such interconnections can be in:
- totally communicating (as in a Bose-Einstein condensate)
- totally uncommunicating and silent
- and an intermediate phase that appears to be the one adopted by most of the actual universe.
Amplituhedra can be used as a sort of multidimensional version of Feynman diagrams,
with the results of calculations in quantum chromodynamics given by the multidimensional polyhedron's volume.
Not only does the method generate tractable and correct results,
but it suggests that 'locality' is an emergent feature.
Unfortunately, at present,
the tool only works for super-symmetric quantum mechanics.
Rovelli suggests that all of the
fundamental particles might consist simply of braids of space-time (NS, 12-Aug-2006, p28),
which might then explain why the universe appears quantised (NS, 10-Nov-2001, p40).
Entanglement, too, might be explained by topological
properties of the fundamental particles (NS, 08-Jan-2011, p10).
There is even a suggestion that the long-sought proof of the Riemann hypothesis, in
mathematics, with the zeros of the zeta function for prime numbers all lying on
the vertical line 0.5 + i·t, might be found first in the states of a suitably
chosen quantum system, such as an atom or molecule (NS, 11-Nov-2000, p32),
and that random-matrix theory might be used as a tool to help find it (NS, 10-Apr-2010, p28).
The same sort of Artificial Intelligence program
can be applied to Quantum Mechanical systems
as was applied to playing Go (NS, 18-Feb-2017, p12).
Smolin points out (NS, 03-Feb-2001, p50)
that one philosophical problem with relativity is that,
by abstracting out time as just being another dimension,
it means that all the physical laws become constant, invariant,
and outside of time (NS, 23-Sep-2006, p30; NS, 22-Nov-2008, p32; NS, 20-Apr-2013, p30),
which is strange for a universe that has existed for only a finite time.
Moreover, such a view leaves us with no way of explaining
why we have the concept of 'now',
and indeed of past, present and future.
Any theory that attempts to explain the manner of
the arrow of time 'arising',
entropy 'growing', and the 'period' 'when' all this did so (NS, 16-Jan-2016, p8) is,
on the face of it,
using self-contradictory time-dependent terminology.
It implies some hypothetical absolute time, outside our universe,
that is distinct from, albeit parent to,
the local time that we experience within our universe.
All concepts of 'growth' and 'arising' therefore make some direct or indirect reference to such an absolute, external time.
All of this does run up against the impossibility (NS, 08-Oct-2016, p52)
of our attempting to think
what anything might be, outside of time or space
(since, following Descartes, to think involves both time and material matter).
Ultimately, it becomes tied up with the traditional question of
why there is something, rather than nothing,
and indeed what it would even mean for there to be nothing.
These are the questions of where and when is the universe,
which leave us flummoxed, even before we attempt to address those of
what and why is the universe.
Let us consider time to be just another spatial dimension, w,
and hence as one of four dimensions (w,x,y,z).
One way to envisage this is to consider a reel of celluloid movie film
that has been cut up into its individual frames,
with all the frames piled up in sequence.
The frames represent just two spatial dimensions, x and z,
with the position in the pile, w, used to represent time
(with the third spatial dimension, y,
implied by the use of perspective within the frame).
We can contemplate how the positions and shapes of objects change
in this pile of cine frames.
We could trace the movement of a sugar cube
as it first enters into the scene in a sugar bowl,
then gets lifted and dropped into a cup of tea.
Ultimately, though, the beginnings and ends of any object
only make sense if we are tracing the paths of the constituent fundamental particles
(which are not only atomic, but also dimensionless).
Macro objects, such as sugar cubes,
are then just akin to shoals of fish or murmurations of birds.
There are constraints on the shapes that can exist in (w,x,y,z).
Stable patterns within (x,y,z) tend to extend further in w than unstable ones.
There is a particular special case of this
where those stable patterns have the property of inheritance (and hence of evolution).
When computing the entropy within any closed region in (x,y,z),
it will be a function that is monotonically dependent on its extent in w.
Molecules of gas confined to one of two adjacent chambers on the x-axis must, when the partition is suddenly removed,
occupy greater ranges of x values for increasing values of w.
However, it is not a linear relationship:
the occupied range approaches the full extent of x asymptotically.
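A toy version of that two-chamber picture can be simulated: particles random-walk along x after the partition at x = 0.5 is removed, and the occupied range creeps towards the full box asymptotically rather than linearly. (A sketch only; the step size and particle count are arbitrary assumptions.)

```python
import random

random.seed(1)  # deterministic run
particles = [random.uniform(0.0, 0.5) for _ in range(500)]  # left chamber only

def step(xs, dx=0.02):
    """One tick of w: each particle takes a small random step, kept inside [0, 1]."""
    return [min(1.0, max(0.0, x + random.uniform(-dx, dx))) for x in xs]

spreads = []
for w in range(2000):      # w plays the role of the time-like axis
    particles = step(particles)
    spreads.append(max(particles) - min(particles))

print(f"spread after 10 steps:   {spreads[9]:.2f}")
print(f"spread after 2000 steps: {spreads[-1]:.2f}")  # approaches the full box, 1.0
```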
Ultimately, all particles in the universe have asymptotic values back towards the Big Bang,
forward to the current value of w, forward to the end of the universe, and back again towards the current value of w.
This gives two distinct periods: the past from the Big Bang to the current asymptotic value,
and the future from the current asymptotic value to the end of the universe,
and hence a concept of 'now' at the interface.
But unrelated parts of the universe are not governed by an external, global clock,
so their values of 'now' are uncorrelated.
Via Newton, we have invented functions that are parameterised in a mythical global time, t.
It is only when the parts encounter each other, that their values of 'now'
(which, in any case, should start as a fuzzy, extended-now) become the same,
somewhat akin to the way that two functions f(x) and g(x) can have all sorts of uncorrelated values
except after stipulating the point where f(x)=g(x) and solving the simultaneous equations that result.
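That analogy in miniature, with two hypothetical world-lines f and g whose values are unrelated until we impose f(x) = g(x) and solve:

```python
def f(x):
    return 2 * x + 1    # one part of the universe, on its own parameterisation

def g(x):
    return -x + 7       # another, unrelated part

# Solve f(x) = g(x): 2x + 1 = -x + 7  ->  3x = 6  ->  x = 2
x_meet = (7 - 1) / (2 - (-1))
print(f"the two only agree at x = {x_meet}, where both equal {f(x_meet)}")
```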
Particle collisions are like meetings synchronised under the clock at Waterloo station at a given time and date.
Our notion of cause followed by effect maps on to the patterns in the w direction,
and is constrained by the finite speed at which influences can propagate.
As a result, stars orbiting round the centres of their galaxies,
and galaxies orbiting round their local groups,
do not have the same view of gravity as spacecraft going up to service the ISS.
Turing's Halting Problem indicates that we cannot just treat computer programs in the abstract, as a static whole,
stored on paper, a CD-ROM or a deck of punched cards;
it is necessary to state the specific data that are to be supplied,
and to consider the dynamics of its execution
(passing under the read-head of the debugger's finger on the printout, the program counter,
or its equivalent in lambda calculus).
Similarly, a story cannot be treated in the abstract, as a static whole,
in a book, DVD or stack of cine frames;
it is necessary to consider the specific data that it will be immersed in,
and to consider the dynamics of how the story is to be told
(passing under the read-head of the page-turner, DVD player or cine projector).
And, when those stories are not just biographies,
but human lives, we know that we have to consider the read-head of the moment that we call 'Now',
and how they can only think consciously in a single sequential thread.
If the universe is to be treated as an information system,
we must first establish its basis in logic.
Heisenberg's uncertainty principle then drops out as a natural consequence:
the logic that leads to a measurement of the momentum of a particle,
contains no statements to describe its position,
just as in Gödel's incompleteness theorem:
that any system of logical statements, of sufficient complexity,
is bound to contain statements that cannot be proved
(such as, in this case, the position of the particle being 4nm from the reference point,
or another that it is 6nm).
It also links with the indications noted earlier
that Heisenberg's uncertainty principle follows
as a consequence of information theory
and the second law of thermodynamics (NS, 13-Oct-2012, p32; NS, 23-Jun-2012, p8).
Fractals might be one explanation for the opposing views of quantum mechanics and relativity
both being correct (NS, 28-Mar-2009, p37),
indicating that some Gödel-like questions about the universe
(such as "what if the experiment had measured the momentum of the particle instead of its position?")
might have no answer because they do not lie on the same fractal coastline of the universe.
Arrow of time
There are many proposals for what it is that gives the feeling of the passage of time (NS, 04-Feb-2017, p31).
Ellis notes that none of the present physical models capture
our feeling that time flows (NS, 02-Nov-2013, p34),
and Gisin adds that none of them capture our feeling of having free will (NS, 21-May-2016, p32).
The notion of time flowing does still raise the question, of course,
of flowing with respect to what?
In the case of thought experiments involving astronauts passing their twins
at close to the speed of light,
or involving a trip close to the event-horizon of a black-hole,
the rate of flow is comparative, between two human observers.
However, we also use the term about our own, everyday sensation of the passage of time.
One argument is that it is our sense of identity,
that the person who started reading this sentence
is the same as the person who ate breakfast this morning:
one thing that we could be measuring the flow of time against
is our laying down a permanent and ever-growing trail of past memories.
It is not unlike the illusion created by our image in the
mirror being based in left-right symmetry, and not up-down symmetry.
It is further supported by a proposal (NS, 15-Aug-2015, p26)
that consciousness is just an illusion that the subconscious brain concocts,
compatibly with Libet's experimental observations, as an aid for the
survival of the species (thereby also explaining phenomena such as pantheism,
phantom limbs and post-event rationalisation of otherwise odd behaviour).
Perhaps free will just means deterministic behaviour,
but dominated by internal interactions within the system
(along the lines of Tononi's integrated information theory).
Contrary to the popular joke,
it is the speed of light that is nature's way of making sure
that things do not all happen at once (NS, 04-Mar-2017, p28).
Both relativity and quantum uncertainty agree
that there is no such thing as simultaneous.
The Newtonian laws of motion already encounter problems when taking causality into account.
When a decision is made to throw a ball, or to pull away in a car from a set of traffic lights,
position, velocity and acceleration all suddenly change from zero to some positive value,
as do all the higher derivatives,
with the danger that some of them might incur a step function,
with an infinite slope for the next derivative up.
The universe avoids this by allowing all the changes to be blurred,
allowing a continuous ramping up of each of them, with no step changes.
However, because cause must always precede effect,
none of the blurring can occur before time=0.
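One standard mathematical object with exactly this property is the smooth ramp exp(-1/t) for t > 0 (and 0 for t ≤ 0): it rises with every derivative equal to zero at t = 0, so there is no step change, yet nothing leaks back before the cause at t = 0. A numerical sketch:

```python
import math

def ramp(t):
    """Smoothly ramps up after t = 0; identically zero before it."""
    return math.exp(-1.0 / t) if t > 0 else 0.0

h = 1e-3
print(ramp(-h))                 # 0.0 - nothing happens before the cause
print(ramp(h))                  # exp(-1000), vanishingly small just after it
print((ramp(h) - ramp(0)) / h)  # first difference quotient, still ~0.0
```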
Interplanetary transport networks (NS, 25-Mar-2006),
but applied to all fields, not just gravity, might offer one way into this,
inasmuch as they allow situations where all the derivatives of each of the alternative paths
can become equal (indeed, zero) at the saddle points.
It is generally believed
that it is the increase in entropy that gives our real sense of the direction for the flow of time.
Consider the task of judging whether a cine film of a man
passively sitting on a swinging swing
is being run forwards or backwards in the projector.
First, there is the finite speed at which cause is followed by effect,
and at which various waves travel
(as for example, when the man fires a gun, and the flash of light precedes the sound).
Second, there is the direction of increasing entropy
(if he drops a porcelain cup, or spills a box of matches, for example).
Even a film of a porcelain cup being constructed on a potter's wheel, and then fired in a kiln,
has the agent of the entropy increase apparent (the potter, or the kiln).
Similarly, a time-lapse film of a construction site looks odd, perhaps humorous,
if the agents of the entropy-increase (the construction workers and their machines)
are no longer apparent.
The box of matches example seems to confirm that time is an emergent property.
Each match is completely unchanged, and could have its trajectory reversed.
It is only the structure of the matches organised in the box, or disorganised on the ground,
that has changed, and that gives the concept of the passage of time.
(In this particular example,
there is also the process of the matches falling to the ground under gravity;
so, inert ink particles mixing in water
might have been a better example.)
One possibility as to why the universe started off in such a low entropy state
is that, to us, looking from inside the universe,
the moment of low entropy state simply appears to be the start of the universe,
and of local time (even if it wasn't)
as a direct consequence of our own definition.
Even if the universe had not started in what we infer to be its lowest entropy state,
what we perceive to have been its lowest entropy state
would look to us to have been when it started.
Just like the characters in a cine film,
we cannot tell if the film is being run backwards in the projector.
Moreover, the actors can rehearse, and even do the takes of the scenes,
in any arbitrary order,
stitching the characters' chains of thinking together mentally, in their minds,
and physically, in the cutting room.
Measurement of entropy is connected to the question of whether one particle can be substituted for another,
and comes down to the number of degrees of freedom.
Electrons, Up quarks, and Down quarks, have no internal structure,
and can be substituted for each other without any observer ever noticing.
Next, although protons and neutrons are made of a mixture of Up quarks and Down quarks,
with various permutations of red, green, blue possible,
confinement means these internal data are transparent to our experiments.
Next, with Bucky balls,
it seems that a C60 molecule (with its 360 electrons, 1080 up quarks and 1080 down quarks,
and overall boson-like properties)
still has no discernible landmarks
(if someone were to substitute one of its carbon atoms for another, for example).
For the DNA in a small virus, though, it is not so transparent:
finally, we have a clump of electrons, Up quarks and Down quarks
that does have discernible landmarks,
and hence an entropy to its structure, and hence a memory.
No-one has yet obtained interference fringes
by firing small virus DNA molecules through a double-slit experiment (NS, 15-May-2004, p30).
Perhaps the difference between sending a buckminsterfullerene molecule or a small virus
through a double-slit experiment (NS, 17-Mar-2007, p36; NS, 15-May-2004, p30)
is that the latter has more of an odometer or local clock,
with increasing local entropy,
as its DNA sequence becomes degraded
(by methylation, telomere damage, or simple non-error-corrected base-pair swapping).
This leads on to the thought that
there can be a seething chaos within, but a billiard-ball-like simplicity outside
(such as the way we view the sun as the centre of our solar system);
and it can be experiencing the passage of time within,
but timeless when viewed from the outside (such as the passage of a proton in our apparatus).
Deacon arguments (NS, 26-Nov-2011, p34).
Expansion of space-time
Dowker proposes that the flow of time is caused by the expansion of the universe,
and is a measure of the new qubits that are thereby being added (NS, 04-Oct-2003, p36).
This new space is presently being created,
at the rate of 69.3 km/s for every 3.26 million light years,
which equates to 2.25×10⁻¹⁸ m/s per m,
which is about 1/800th of the diameter of a proton per second per metre,
or 1.39×10¹⁷ Planck lengths per second per metre.
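These unit conversions can be checked directly (a quick sketch; the rounded 1.7×10⁻¹⁵ m proton diameter, light-year and Planck-length values are standard figures assumed here, not taken from the articles):

```python
# Sanity-check of the expansion-rate figures quoted above.
H0_km_s_Mpc = 69.3                    # Hubble constant: 69.3 km/s per 3.26 Mly (1 Mpc)
Mpc_m = 3.26e6 * 9.4607e15            # 3.26 million light years, in metres
H0_si = H0_km_s_Mpc * 1e3 / Mpc_m     # (m/s) of new space per metre

proton_diameter_m = 1.7e-15           # rounded illustrative value
planck_length_m = 1.616e-35

print(H0_si)                          # ~2.25e-18 (m/s)/m
print(proton_diameter_m / H0_si)      # ~760, i.e. roughly the 1/800th quoted
print(H0_si / planck_length_m)        # ~1.39e17 Planck lengths per second per metre
```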
Muller(2016) proposes that as space-time expands,
it is the new coordinates of time that feel like 'now', and give rise to the feeling of the flow of time.
'Now' is simply the raw face at which new instances of time are being created as space-time expands,
rather than some notion of the increase in microstates
(caused by the expansion of space-time) for entropy to expand into.
Time is influenced by intense gravitational fields,
and ceases to have its familiar properties in their presence,
near the event-horizon of a black-hole, for example.
In loop quantum gravity, when the universe is compressed,
the speed of light takes on an imaginary value,
and time becomes a fourth space dimension;
this suggests that the emergence
of time was caused by the breaking of this symmetry.
Bekenstein showed that the surface area of the event-horizon of a black-hole
corresponds to its entropy, in line with the second law of thermodynamics.
Taking such a view, though,
leads to a problem with the disappearance of information
when matter enters the black-hole,
and its subsequent reappearance as Hawking radiation (NS, 04-Feb-2017, p16; NS, 27-Jul-2013, p10),
and with it perhaps still being available to forensic investigators
at the event-horizon (NS, 19-Sep-2015, p11).
So, the event-horizon might either be considered
too big to hold all the information,
since the information is disappearing from the outside viewer,
or too small to represent the infinite amount of space-time that is being created at the singularity.
One attempt to resolve this
is to propose that the whole universe is a hologram (NS, 17-Jan-2009, p24; NS, 27-Apr-2002, p22),
and that reality is somehow lived out, in time,
distributed round a two dimensional surface.
Such possibilities as the universe being holographic might be detectable
if space-time turns out to have a preferred 'weave'
in specific directions (NS, 16-Aug-2003, p22).
Perhaps black-holes could harbour bubble universes (NS, 09-Jan-2016, p8)
along with all the information that they contain.
Moreover, it could be that the universe is filled with primordial black-holes
(where the information is stored holographically in their event-horizons),
but with those black-holes at varying densities
because of Heisenberg's uncertainty principle;
our habitable part of the universe
would then be in one of the low density (and hence low entropy) regions,
where radiation is able to permeate (NS, 28-Apr-2007, p28).
Maldacena's Anti-de-Sitter/conformal field theory (AdS/CFT),
which permits a hologram-like conversion between a 10-dimensional string-theory space with
gravity and a simpler four-dimensional QM space (NS, 12-Oct-2013, p36),
also would only work if the curvature of space-time were negative (NS, 30-May-2009, p34).
It might, though, offer a way to unify the space-time view of relativity
with the particle-field view of quantum mechanics (NS, 04-Feb-2017, p23).
The AdS/CFT duality leads to a suggestion
that information has an interpretation in terms of space-time (NS, 07-Nov-2015, p30).
This might argue that gravity plays no part in wave-function collapse,
since gravity is only an optional interpretation on one side of the AdS/CFT duality,
but might still resolve the EPR (spooky action at a distance) paradox (NS, 02-Jan-2016, p53).
The paradox of quantum monogamy
(no more than two particles can be in the same entangled state at the same time)
between three particles
(one on each side of a black-hole's event-horizon,
and the other at the other end of a wormhole)
can be resolved by noting that the one at the other end of the wormhole
is in the future, or past, of the other two.
The particle at the other end of the wormhole
is never at the same time as either of the other two,
though it might be at the same space as one of them.
Perhaps it would be possible for the machinery that we have for Pauli's Exclusion Principle
to be adapted to describe a similar-sounding principle of quantum monogamy.
The holographic universe, event-horizon information,
and the ability to drop a dimension (and gravity) in AdS/CFT,
together suggest, conversely, that introducing a dimension is what adds gravity to the standard model (NS, 11-Feb-2017, p24).
The resolution of the search for the 3D Ising model,
via a bootstrapping approach,
could solve many problems concerned with phase transitions,
many-body strongly coupled systems, superconductivity,
and quantum mechanisms of the strong nuclear force and the AdS/CFT duality (NS, 18-Feb-2017, p28).
Curvature of space
Gurzadyan suggests that the arrow of time
might be a simple consequence of the curvature of space,
and that this would avoid the need to postulate a period of inflation;
this would only work if the curvature of space-time were negative (NS, 15-Oct-2005, p30).
It has been suggested (NS, 18-Jun-2016, p28)
that an overall negative curvature could indeed be what has resulted
from the large voids that have formed,
due to clumping of the galaxies over the most recent 5 billion years
(as a back-reaction to space telling matter how to move, and matter telling space how to curve).
Importantly, this might explain the results that are currently attributed to dark energy,
thereby avoiding the need to propose the existence of such a thing.
Dark energy might also be explained by the creation of information by wave-function collapse
(Sudarsky, Josset and Perez).
Increasing information implies increasing entropy.
The arrow of time might then be due to the particles of the universe
becoming ever more entangled (NS, 04-Feb-2017, p31).
One study suggests that entanglement is the limiting condition (NS, 21-Aug-2010, p33),
and another that Heisenberg's uncertainty principle and entanglement are, in fact,
two sides of the same coin (NS, 30-Apr-2011, p28):
either one applies, or the other, depending on the nature of the experiment
(but with H.U.P. as the overarching principle).
Tollaksen found that entanglement still leads to apparent contradiction
even when the measurement is taken before the particles are entangled (NS, 02-Aug-2014, p8).
Entanglement in time (NS, 27-Aug-2016, p12) consists of a particle becoming entangled
with an earlier version of itself (NS, 27-Mar-2004, p32).
Perhaps an extra dimension could be involved,
perhaps a stunted one that only has two possible states: 0 or 1 (past or future);
or, perhaps this Boolean information could be encoded in the states of the six curled up extra dimensions
that string theory hypothesises
(thus once again making the model a static one in ten dimensions,
but still dynamic along the time axis when viewed from our perspective from within).
The concept of 'now' could be simply the contents of our three spatial dimensions
(complete with their measure of entropy) at their point along the time dimension
where these extra dimensions undergo the sort of phase change
between past and future that Ellis' proposal seems to imply.
Perhaps the past and future are different phases of the universe,
and that 'now' is a wave front of phase-change,
as it rips across the universe, and that we, by definition,
live out our experiences entirely on this wave-front.
Even in digital electronic computing,
the program counter of the conventional processor
captures the concept of 'now',
and is perhaps a reason that fifth-generation computing research failed to catch on beyond the 1980s.
Interestingly, Wootters proposes
the use of one of the otherwise unused dimensions, too, to serve as the
physical manifestation of what we presently handle in mathematical models of
quantum mechanics using the square-root of minus one (NS, 25-Jan-2014, p32).
√-1 is merely used as a mathematical convenience
as a means of introducing a unit vector
that is orthogonal to the implied, identity-operator, unit vector of 1 for the real numbers.
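That orthogonal-unit-vector reading of √−1 can be made concrete by representing 1 as the 2×2 identity matrix and √−1 as a quarter-turn rotation J; then J² = −I drops out with no 'imaginary' quantity anywhere (a small illustrative sketch):

```python
import numpy as np

# Represent 1 as the identity I and i as a quarter-turn rotation J.
# Then J @ J == -I, i.e. "i squared is minus one", purely from two
# orthogonal unit vectors in a plane.
I = np.eye(2)
J = np.array([[0.0, -1.0],
              [1.0,  0.0]])

assert np.allclose(J @ J, -I)

# a + b*i maps to a*I + b*J, and matrix multiplication then matches
# complex multiplication exactly.
def as_matrix(z: complex) -> np.ndarray:
    return z.real * I + z.imag * J

z1, z2 = 2 + 3j, 1 - 1j
assert np.allclose(as_matrix(z1) @ as_matrix(z2), as_matrix(z1 * z2))
```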
There is a distinction between things that have memory
(like human brains and biological genomes,
but also objects that merely persist, like rocks and rivers),
and those that do not (such as momentum, force, energy and power,
which are functions of the present state of the objects concerned, not of their pasts).
As noted earlier,
memory is key to an object's identity,
whether it be a particle passing through a double-slit experiment,
a human who thinks (and therefore who is),
or the genome of a species.
Also, it is the act of erasing a memory bit that incurs the thermodynamic cost.
Back in the 1990s,
in at least one university, they had found (for security reasons)
that they had to lock all their PC hard-drives to be read-only.
Students would carry their work around with them on floppy disks.
The floppy disk could also carry other, meta, information, like the user profile.
In a very real sense, these students were able to work on one PC, advance their work a bit,
save it to floppy disk, go off to a couple of lectures, then go up to a different PC,
and resume their work, with their own personalised profile, on a completely different PC,
as if nothing had happened in between.
This demonstrates how memory is the key to identity.
The students' computing identity did not reside on the PC (worth several hundreds of pounds)
but on the floppy disk that they carried around with them (worth a few pence).
Similarly, my identity with the person who ate breakfast this morning is firm
because of the memory of eating that breakfast.
Conversely, it does not matter if someone claims that I am the reincarnation of some medieval knight;
if I cannot remember anything of my past experiences (NS, 22-Apr-2017, p28),
it is not much of a reincarnation.
Time-travel would only be possible if eternalism is right, and presentism is wrong.
There are many paradoxes surrounding time-travel,
and many attempts at proposing resolutions to them (NS, 28-Mar-1992, p23)
assuming time-travel to be possible (NS, 20-Sep-2003, p28)
and noting that relativity is somewhat ambivalent on the prospects
of time-travel (NS, 20-May-2006, p34).
Some argue that the second law of thermodynamics could give the answer
(time-travel is possible,
but must cost the time machine at least as much energy to run
as that needed to achieve the entropy change that the time-travel brings about).
The problem of the speed of time,
and the notion of dt_local/dt_abs,
is also encountered when considering a discussion
between two hypothetical time-travellers comparing their two time machines
(setting aside the question of possibility or impossibility).
How could one traveller
say to the other, "My machine is better than yours: it can go at one century per second,
but yours can only go at 50 years per second"?
Whose seconds would they be talking about?
Intuitively, one would expect those seconds
to be some sort of perceived seconds,
within the passenger compartment of the respective machines,
perhaps as indicated on their respective wrist-watches,
or rather, by their various body clocks,
and thence by some measure of local entropy increase,
numbers of bits being erased,
and hence of memories being recorded,
and of histories being laid down (including that of having eaten breakfast that morning).
A time-traveller, going to the future and back
would be gaining knowledge and experience,
and clocking up seconds on the odometer,
just as a car's kilometerage keeps clocking up
despite never straying from the daily commute.
A molecule (of sufficient complexity) in a sound or fluid wave would similarly clock up more on its odometer
than in its displacement
(odometer is to displacement as speed is to velocity,
perhaps with a notion of some sort of Strouhal number).
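The odometer/displacement distinction can be sketched with a toy model of a particle oscillating in a wave while drifting slowly (the amplitude, frequency and drift speed are arbitrary illustrative values):

```python
import math

# Toy model: x(t) = A*sin(w*t) + v*t -- a particle oscillating in a wave
# while drifting slowly. Its odometer (total path length) clocks up ~4A
# per cycle, while its net displacement is only the small drift v*t:
# odometer is to displacement as speed is to velocity.
A, w, v = 1.0, 2 * math.pi, 0.01   # amplitude, angular frequency, drift speed
dt, T = 1e-4, 10.0                 # time step, total time (10 cycles)

x_prev, odometer, t = 0.0, 0.0, 0.0
while t < T:
    t += dt
    x = A * math.sin(w * t) + v * t
    odometer += abs(x - x_prev)    # accumulate path length
    x_prev = x

displacement = abs(x_prev)         # net distance from the start
print(odometer, displacement)      # ~40 versus ~0.1
```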
Temperature of the universe
Following on from a 1979 paper by Freeman Dyson (NS, 03-Aug-2002, p28)
it would at first appear that machines in general
(and living cells, and thinking brains, in particular)
can continue to eke out a slower and slower existence, forever,
right into the heat death of the universe,
as the amount of temperature variation gets smoothed out.
However, the universe is expanding,
with potential supplies of energy constantly going over the de Sitter horizon,
out of reach of future generations (NS, 20-Oct-2001, p36).
Noting the similarity to our view of an event-horizon of a black-hole,
this leaves unanswered one nagging aspect:
one case is completely subjective (or at least, observer-orientated),
while the other is at least partially objective,
inasmuch as the particles concerned
can be aware of there being an intense gravitational-field near-by.
The monotonic disappearance of potential resources over the de Sitter horizon
is shown to be a manifestation of the second law of thermodynamics (NS, 15-Apr-2017, p8).
However, expansion equates to the injection of new matter.
Meanwhile, the machine of Freeman Dyson will get warm,
and will need to dissipate heat as it works,
so the Hawking temperature imposes new constraints on the duty cycle of the machine.
Era                                               Time (s)       Temp (K)
Planck era                                        5.4e-44        2.4e32
Thermal equilibrium era (p:n=1, γ:(p+n)=1.0e9)    1.0e-2         1.0e11
Hydrogen era (p:n=6)                              1.0e0          1.0e10
Deuterium era (p:n=7)                             1.0e2          1.0e9
First atoms (CMB)                                 3.8e5×3.2e7    1.0e3
The table, here,
represents the merged data from two articles (NS, 03-Aug-2002, p28; 05-Jul-2008, p28).
The temperature of the event-horizon of a black-hole
is inversely proportional to the mass of the black-hole,
where the constant of proportionality is ℏc³/(8πGk).
A 4.5×10²² kg black-hole would be in equilibrium with the universe,
since it would have a temperature of 2.7281 K.
Thus, our universe, with a mass of 10⁵³ kg (NS, 16-Dec-2000, p26),
is out of equilibrium,
and will continue to be so until it reaches a temperature of 1.2×10⁻³⁰ K.
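These figures follow directly from the Hawking temperature formula T = ℏc³/(8π·G·k·M); a quick check, using standard values for the constants:

```python
import math

# Hawking temperature T = hbar*c^3 / (8*pi*G*k*M), checking the figures
# quoted above (CODATA-style constants).
hbar = 1.054571817e-34   # J s
c = 2.99792458e8         # m/s
G = 6.67430e-11          # m^3 kg^-1 s^-2
k = 1.380649e-23         # J/K

def hawking_T(M_kg: float) -> float:
    return hbar * c**3 / (8 * math.pi * G * k * M_kg)

# Mass whose horizon temperature matches the 2.7281 K background:
M_eq = hbar * c**3 / (8 * math.pi * G * k * 2.7281)
print(M_eq)              # ~4.5e22 kg

# Horizon temperature for a universe-mass (1e53 kg) black hole:
print(hawking_T(1e53))   # ~1.2e-30 K
```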
The question addressed in the New Scientist articles is, "at what time will this be?"
The first row of the table says
that 1 natural unit of time (√(Gℏ/c⁵),
about 5.39×10⁻⁴⁴ seconds) after the Big-Bang,
the universe was at 1.23×10²³ natural units of temperature (2.4×10³² kelvin).
In the rows just after that, it appears that t·T² was nearly constant
(at about 10²⁰)
for a long period, having cooled from an initial value of 3×10²¹.
With a bit of reverse engineering,
an expression can be derived to describe the temperature of the universe as a function of time.
t·T² is now down to 3.2×10¹⁸,
and appears to be following some sort of inverted exponential decay towards unity.
With a bit of curve fitting,
it turns out that log(t)+2·log(T)=21.7·(1−exp(0.0454·x))
summarises the contents of the table quite closely
(where x=log(t)−60, measuring time logarithmically against the projected end of the universe).
This suggests a figure that is getting on for 10⁶⁰ seconds
for the lifespan of the universe
(about 2×10⁵² years);
in exponentiated form, it is a fraction whose numerator is the 3×10²¹ above, with some sort of fiddle factor.
Indeed, it should be noted
that the above expression is not claimed to be the one that
actually models the behaviour of the temperature of the universe;
it is simply offered here as one that succinctly captures
the data that were published.
If t·T² was roughly constant in the early universe,
this suggests that I_U(t) ∝ (M_U)²/T⁴
in the early universe, suggestive of Stefan's law.
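The arithmetic above can be checked in a few lines; the reading of x as log₁₀(t)−60 (time measured logarithmically against a projected end of the universe at about 10⁶⁰ s) is an assumption made here so that the fitted curve reproduces the published data:

```python
import math

# Check t*T^2 for the near-constant table rows, and the curve fit
# log10(t) + 2*log10(T) = 21.7*(1 - exp(0.0454*x)), with the assumed
# reading x = log10(t) - 60.
rows = [(1.0e-2, 1.0e11),   # thermal equilibrium era
        (1.0e0, 1.0e10),    # hydrogen era
        (1.0e2, 1.0e9)]     # deuterium era

for t, T in rows:
    assert abs(math.log10(t * T**2) - 20.0) < 0.1   # t*T^2 ~ 1e20

def fit(log_t: float) -> float:
    return 21.7 * (1.0 - math.exp(0.0454 * (log_t - 60.0)))

# Today: t ~ 4.35e17 s and T = 2.728 K give t*T^2 ~ 3.2e18,
# and the fitted curve lands close by.
log_now = math.log10(4.35e17 * 2.728**2)
print(log_now, fit(math.log10(4.35e17)))   # both ~18.5
```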
Information content of the universe
The first step when encountering a new phenomenon is to start taking measurements;
the second is to look for underlying relationships between the parameters;
and the third is to propose physical mechanisms that might generate those relationships.
Despite a century of intense study,
quantum mechanics appears to be still stuck at the second stage,
with phenomena characterised by probability functions, extremely successfully,
but without any convincing explanation as to the origin of the values
that are so produced (NS, 28-Jul-2012, p28).
Wave-functions are either a manifestation of some (ontological) underlying mechanism
(possibly right down to the It-from-Bit view),
or they are merely human tools or (epistemic) abstractions that help us to the correct answers (NS, 01-Apr-2017, p41).
Recent experiments rule out many classes of the latter view (NS, 07-Feb-2015, p14).
However, there surely must be a deeper,
underlying mechanism that has yet to be discovered (NS, 23-Jun-2007, p30),
though the current proposals are mutually conflicting (NS, 14-Nov-2015, p14).
Quantum Bayesianism (QBism) holds that
the wave-function is merely a summary, constructed by the human observer, of
all the observer's knowledge of the system, and hence that it is just in the
observer's mind and not a property of the quantum particle itself (NS, 10-May-2014, p32).
The most obvious Achilles heel of quantum mechanics is our lack of explanation
for why the Born rule for probability amplitudes is valid (NS, 05-Nov-2016, p8).
Various attempts have been made to explain how the probabilistic
behaviour we observe in quantum mechanics
could arise from an underlying deterministic behaviour,
including a refinement of the pilot-wave idea (NS, 22-Mar-2008, p28)
and a new way of viewing the Bohm interpretation (NS, 27-Feb-2016, p8),
and how it might even be supported by an experiment on classical waves in an oil bath (NS, 08-Apr-2017, p28).
Experiments have shown the so-called Cheshire-cat phenomenon:
that a particle can follow one path,
and its properties (such as spin)
can be split off to follow a different path (NS, 26-Jul-2014).
Even at the atomic level,
it is possible to project an image of an atom
from one focus of an ellipse
so that a virtual atom appears at the other focus (NS, 08-Jul-2000, p24).
Bell's inequality tells us
that any problem in our understanding of entanglement
is either with relativity, objective reality,
or the observer's free will (NS, 26-Feb-2011, p36; NS, 03-Aug-2013, p32).
That is, it tells us that our problems are either with the 'at a distance',
the 'action' or the 'spooky' (ghost in the machine) aspects of entanglement.
For the third of these,
it might not necessarily be the observer's free will that is in question,
but an inherent limit on our ability to close all the loopholes in the experiments
that we can perform (NS, 18-Jun-2005, p32),
though experiments have subsequently been performed
that significantly tighten up on these loopholes (NS, 05-Sep-2015, p8),
including ruling out the possibility that the measurement of the state of one of the particles
could tamper with the mechanism of the random number generator (NS, 11-Feb-2017, p7).
Experiments have been run to see if the Bell Inequality is affected
by whether conscious human minds are involved in the decision
as to which parameter to measure in each entangled pair of particles (NS, 27-May-2017, p7).
Using phase-space as our tool of preference
involves imagining everything in terms of momentum-energy,
instead of space-time (NS, 06-Aug-2011, p34).
Experiments consistently suggest that
a particle cannot be pin-pointed, at sub-atomic scales,
in 6-dimensional phase-space (x,y,z,ẋ,ẏ,ż)
because the particle does not have a specific location in that space,
with any attempt to locate it
resulting in a blurred position.
Unfortunately, all our current methods for solving the equations of motion,
from Newton, Lagrange, Hamilton and Jacobi,
have, at their roots, the concept of this phase-space,
so our ability to use these tools to work out
the movements of particles at the sub-atomic scale is somewhat hampered.
(In effect, any attempt at using dead-reckoning
to predict a particle's future position is doomed to failure, one way or the other.)
The temporal version of the Bell inequality confirms that,
given a particle's initial state and final state,
it is not clear
that it had any definite history of intermediate states (NS, 04-Dec-1993, p14).
Fourier analysis goes a long way to explaining how Heisenberg's uncertainty principle arises,
and gives Δk·Δx ≥ 1/2
as an inherent limitation from the mathematics
(the finer and more precise a pulse is in time,
the wider the band of frequencies needed to define it),
where k is the wavenumber, or the spatial frequency, equal to 2π/λ.
Then, with a simple substitution of p=ℏk,
Heisenberg's Uncertainty Principle drops out:
Δp·Δx ≥ ℏ/2 and ΔL·Δθ ≥ ℏ/2.
Since the wave-function of any fundamental particle,
such as a photon or an electron,
can be considered to consist of a carrier wave (e^(ikx))
multiplied by an envelope (e^(−a·x²)),
Heisenberg's uncertainty principle is, in effect,
talking about the bandwidth of the particle
due to the side-bands of its amplitude modulation.
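This bandwidth reading can be verified numerically: a Gaussian envelope on a carrier saturates the Fourier limit, with Δk·Δx coming out at 1/2 regardless of the carrier frequency (a sketch; the grid size and the values of a and k₀ are arbitrary):

```python
import numpy as np

# A carrier e^{ik0 x} times a Gaussian envelope e^{-a x^2} saturates
# the Fourier limit dx * dk = 1/2.
N, L = 4096, 200.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
a, k0 = 0.25, 2.0
psi = np.exp(1j * k0 * x) * np.exp(-a * x**2)

# Spread in x, from the normalised density |psi|^2.
p = np.abs(psi)**2
p /= p.sum()
mx = (p * x).sum()
dx = np.sqrt((p * (x - mx)**2).sum())

# Spread in k, from the FFT power spectrum.
k = 2 * np.pi * np.fft.fftfreq(N, d=L / N)
phi = np.abs(np.fft.fft(psi))**2
phi /= phi.sum()
mk = (phi * k).sum()
dk = np.sqrt((phi * (k - mk)**2).sum())

print(dx * dk)   # ~0.5, independent of the carrier frequency k0
```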
Noether's theorem, too,
already says that the two parameters in Heisenberg's Uncertainty Principle are connected
(the conservation law of one follows as a consequence of the symmetry exhibited by the other)
and hence that the two parameters are just two sides of the same coin,
so are described by one shared set of information, not two.
Heisenberg, Noether, Fourier and Bell all point in the same direction,
that it is not just that we do not have access to all the information,
but that the information simply does not exist in the first place
(for example of a particle's position and of its momentum),
and that our observations are doomed forever to being probabilistic (NS, 14-Mar-2015, p28).
It is suggested (NS, 15-Nov-2014, p28)
that quantum nature is in fact just a manifestation of economy of information
(if a given effect follows from a given cause 80% of the time,
it would be inefficient for the system to encode the given cause with 100% coverage).
Perhaps this shared information of "two sides of the same coin"
arises simply whenever we try to find partial information about an entangled system,
and that quantum weirdness simply emerges
from more logical central principles (NS, 11-Apr-2015, p34).
The universe viewed as a quantum computer
If all the matter of the universe is
made of information (NS, 17-Feb-2001, p26;
and Wheeler, in Zurek 1990),
the universe itself can be considered
to be a vast quantum computer
(just as people in the previous industrial revolutions have considered the universe
to be like a giant system of wheels, a giant heat engine, or a giant conventional computer).
It follows that our ideas for quantum computing will usefully feed back into
our formulations for summaries of how the universe works.
Some suggest that quantum computing
could beat the Turing Halting Problem
(NS, 06-Apr-2002, p24; NS, 19-Jul-2014, p36).
Chaitin indicates that there is a hierarchy
of Omega numbers that would remain forever non-computable (NS, 10-Mar-2001, p28),
though this is later questioned by a demonstration of the computation of the
first 64 bits of Omega (NS, 06-Apr-2002, p27).
Quantum Computing will lead to a more general form of the Turing Halting Problem.
Whether any new restrictions count as different,
or a mere rewording of the original halting problem,
might just be a matter of taste.
There is even a proposal to consider a Quantum Gravity Computer (NS, 31-Mar-2007, p30).
In such a machine, output does not necessarily need to follow input.
"GR says that the causal structure can vary
(since there is no such thing as 'simultaneous'),
and QM says that anything that can vary can be in a superposition."
It is possible to consider Moore's law
continuing up to the Planck limit (NS, 03-Sep-2000, p26).
Chaotic boundaries between stasis and randomness
Crutchfield and Young propose a way to measure complexity.
Intriguingly, quantum mechanics is positioned at the fine
boundary of criticality, between classical physical behaviour and weird
interconnectedness (NS, 26-Feb-2011, p36).
This boundary appears to be a pre-condition of interesting (non-intuitive) chaotic behaviour.
It features in explanations of consciousness (NS, 26-Apr-2014, p44),
and perhaps even of the behaviour of society (NS, 06-Jul-2013, p26).
Therefore, consciousness and the others (each one an emergent behaviour)
can be treated as a region of phase change (NS, 12-Apr-2014, p29; Deacon, 2012).
Conway's Game of Life is mathematically interesting because it is poised on the thin boundary
between boring crystalline stasis,
and unruly gaseous randomness,
in an interesting region of chaotic behaviour.
If we were to relax or tighten any one of the game's simple rules,
the game would revert to boring uninteresting behaviour.
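This sensitivity to the rules is easy to demonstrate: under the standard birth-on-3, survive-on-2-or-3 rule a glider faithfully propels itself across the grid, but tightening the survival rule even slightly destroys it (a minimal sketch, with the rule sets as parameters):

```python
import numpy as np

# Game of Life with configurable birth/survival rules, on a toroidal grid.
def step(grid, birth={3}, survive={2, 3}):
    n = sum(np.roll(np.roll(grid, dy, 0), dx, 1)
            for dy in (-1, 0, 1) for dx in (-1, 0, 1)
            if (dy, dx) != (0, 0))                 # live-neighbour counts
    born = (grid == 0) & np.isin(n, list(birth))
    keep = (grid == 1) & np.isin(n, list(survive))
    return (born | keep).astype(int)

grid = np.zeros((16, 16), dtype=int)
for y, x in [(0, 1), (1, 2), (2, 0), (2, 1), (2, 2)]:  # a glider
    grid[y + 4, x + 4] = 1

g = grid.copy()
for _ in range(4):
    g = step(g)
# Standard B3/S23: after 4 generations the glider has moved one cell diagonally.
assert np.array_equal(g, np.roll(np.roll(grid, 1, 0), 1, 1))

h = grid.copy()
for _ in range(4):
    h = step(h, survive={2})   # tightened rule: the glider is destroyed
assert not np.array_equal(h, np.roll(np.roll(grid, 1, 0), 1, 1))
```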
All the interesting things in this universe
(consciousness, society's dynamics, evolutionary life, physical properties of water)
seem to be those that teeter on this chaotic behaviour boundary between cold stasis and hot randomness.
The second law of thermodynamics, and the speed of light limitation,
are just two such rules in the game that we know as the 'Universe'.
Relax or tighten any of these,
and the universe would not work any more,
in the way that it currently does;
and atoms, stars, planets, and sentient beings would cease to be possible.
Wave-function collapse and decoherence
When two events are correlated, there are three possible explanations:
communication, common cause, or entanglement.
With common cause and entanglement,
the two parties merely observe a correlation when they compare their measurements;
there is no communication.
It is a well-established, experimental fact that fundamental particles,
such as electrons and photons, and also composite sub-atomic
particles, such as protons and neutrons,
behave in a quantum mechanical sort of way,
and that cricket balls, wheeled trolleys, cats and humans
behave in a classical physics sort of way
(where classical physics is a shorthand for special and general relativity,
at extreme conditions,
which can be simplified to Newtonian mechanics, under more ordinary conditions).
So, one question is where (and how) the switch over occurs (NS, 17-Mar-2007, p36).
Probing the boundary region between relativistic and quantum behaviour
is, indeed, a fertile area for research (NS, 20-Apr-2013, p34).
According to the Copenhagen interpretation,
the observer causes the wave-function to collapse.
It is the act of observing the experiment
that causes states of superposition to decohere,
and that therefore the experimenter is an integral part of the experiment.
This leads to many disconcerting possibilities,
not least the existentialist one that reality is not objectively present,
but only created at the moment that someone observes it.
Furthermore, the use of the word 'someone' raises the possibility
that conscious beings have to be involved (NS, 02-May-2015, p33),
which is not only disconcerting, but problematic,
since we still do not have a definition of consciousness,
even now, after several millennia of investigation.
In asking what reality is,
either we define it in terms of objective things,
stuff that can be sensed and known to be out there,
or we can try a reductionist approach
of trying to build everything on some lowest level (NS, 29-Sep-2012, p34);
but both approaches have their problems.
Both approaches end up going round in a circle, with
macroscopic objects (not least human brains) at the end: in the former
approach, consciousness is the property that causes wave-functions to collapse
one way or the other, and not remain in superposition; while in the latter
approach, reality is based on a substrate of mathematics (NS, 21-Nov-1992, p36; Tegmark, 2015).
One approach might be to propose
that the wave-function model is not a complete description of reality;
but decades of rigorous experiments have shown
that any additions would have to involve faster than light communication.
Another approach might be to propose that wave-function collapse does not happen at all,
as in the Many Worlds interpretation.
Lastly, a model of objective collapse can be proposed (NS, 16-Jul-2016, p30)
in which spontaneous collapse occurs, with no observer required
(proposed by Pearle, Ghirardi, Weber, Rimini in the 1970s,
and made compatible with general relativity by Bedingham and Tumulka).
Penrose proposes that gravity could be the cause.
Heisenberg's original explanation for the uncertainty
principle, that the act of observation involves bouncing at least one particle
off the object that is being observed, thereby disturbing it, was somewhat
undercut by the experimental verification of Bell's inequality,
but it does serve a second purpose
of being a possible explanation for the triggering of wave-function collapse.
That is, interaction with the noisy environment causes wave-functions to collapse,
and for particles to become discrete objective entities.
A black-hole's swallowing of information can then be explained,
with the collapsing of wave-functions inside the event-horizon needing no observer.
Gell-Mann and Hartle describe a mechanism
involving matrices that become successively dominated, at each particle-particle interaction,
by the terms in the leading diagonals.
Thus, it is the repeated particle-particle interactions that lead to the decoherence.
Observations and measurements are independent of this,
but (as it happens) first require there to have been plenty of particle-particle interactions.
Buffeted by the solar wind,
Mars is at one specific position in its orbit round the sun,
not smeared out probabilistically in superposition all the way round;
and even specks in the depths of space
are being bombarded by photons of the 3K cosmic microwave background.
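A caricature of this environmental decoherence (a toy model, not Gell-Mann and Hartle's actual formalism): take the 2×2 density matrix of an equal superposition and damp its off-diagonal coherence terms by an assumed factor at each interaction, leaving the diagonal probabilities untouched:

```python
import numpy as np

# Density matrix of (|0> + |1>)/sqrt(2): diagonals are the classical
# probabilities, off-diagonals are the quantum coherences.
rho = np.array([[0.5, 0.5],
                [0.5, 0.5]], dtype=complex)
damping = 0.9   # assumed per-interaction suppression factor

# Each "environmental interaction" nudges the matrix towards diagonal
# form -- nearly, but never completely.
for _ in range(100):
    rho[0, 1] *= damping
    rho[1, 0] *= damping

print(np.real(np.diag(rho)))   # still [0.5, 0.5]
print(abs(rho[0, 1]))          # 0.5 * 0.9**100: tiny but never exactly zero
```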
To have the values of each particle's matrix nudged to new values, at each collision,
sounds reminiscent of
how the Logistic Equation works (x_{n+1} = r·x_n·(1−x_n)).
The coexistence of symmetry and chaos (NS, 09-Jan-1993, p32) is well studied.
Even this simple deterministic equation can go chaotic and, to all intents and purposes,
unpredictable (at values of r=3.7, for example),
with no implications of spookiness.
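A minimal sketch of that chaotic behaviour (assuming nothing beyond the logistic equation itself): two orbits started a millionth apart at r=3.7 quickly diverge until they bear no resemblance to one another, with no spookiness required.

```python
# Iterate the logistic map x(n+1) = r * x(n) * (1 - x(n)) for two
# nearly identical starting points, at the chaotic value r = 3.7.

def logistic_orbit(r, x0, steps):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_orbit(3.7, 0.4, 60)
b = logistic_orbit(3.7, 0.400001, 60)  # differs only in the 6th decimal place

# The tiny initial difference grows roughly exponentially until the
# two deterministic orbits are, to all intents, unrelated.
print(abs(a[1] - b[1]))                                 # still tiny after one step
print(max(abs(x - y) for x, y in zip(a[45:], b[45:])))  # of order the attractor's width
```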
Gell-Mann and Hartle talk of "information gathering and utilising systems" (IGUS),
with possible implications on free will (p454)
and how it is not the observer (or the act of observation) that creates reality (p453).
Multiplication by the off-diagonal zeros,
implicit too in the And and Or operations of manipulating probabilities (p441),
is an irreversible operation (p452).
It comes down to how (a+b)²≠a²+b² (p428),
and how, under superposition, the probabilities might not add up to unity.
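That inequality can be made concrete with a toy two-path amplitude calculation (a hypothetical example, not any specific experiment): the probability of the summed amplitude differs from the sum of the separate probabilities by exactly the cross (interference) term.

```python
from math import isclose, sqrt

# Equal, in-phase amplitudes for two indistinguishable paths (toy values).
a = complex(1 / sqrt(2), 0)
b = complex(1 / sqrt(2), 0)

p_sum = abs(a + b) ** 2               # probability from the summed amplitude (~2)
p_parts = abs(a) ** 2 + abs(b) ** 2   # naive sum of separate probabilities (~1)
cross = 2 * (a.conjugate() * b).real  # the off-diagonal interference term (~1)

# (a+b)^2 != a^2 + b^2: the discrepancy is exactly the cross term,
# the very term that decoherence drives towards zero.
assert isclose(p_sum, p_parts + cross)
print(p_sum, p_parts, cross)
```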
All of the universe is interconnected, by the off-diagonal elements (p431)
and all laws in physics are approximations (p445)
but the authors seem to be against the notion of loop quantum gravity (p430).
History requires knowledge of both present data and the initial condition of the universe (p439)
and results in the notion of time (p437) via its ordering.
This leads to the mechanism of Heisenberg's uncertainty principle (p455)
and how the many worlds view should really be thought of, instead, as many histories.
There is a chain of logic leading from decoherence, to resonance,
to Dalton chemistry,
and to a mechanism of survival of the fittest (p449).
Patterns will crystallise out, but because of the randomness in the process to get there,
we cannot predict, in advance, which ones.
There are lots of parallels between decoherence in a QM system,
and evolution in an eco-system.
Each collision nudges the matrices to be in nearly diagonal form;
nearly, but never completely.
Paul Davies notes the non-locality of quantum mechanics
(with all the universe having some effect on all the rest).
He also notes that an accelerating or rotating body
should experience the glow of the quantum vacuum (NS, 03-Nov-2001, p30),
perhaps contributing to an explanation for
the apparent correspondence between inertial mass and gravitational mass (NS, 03-Feb-2001, p22).
It would also be consistent with the observation
that decoherence is not an instantaneous event,
but one that takes time (NS, 10-May-2003, p28),
and can therefore be interrupted or even reversed (NS, 12-May-2007, p32).
Logic and thinking
The adage, "if a hammer is the only tool you have in your tool-box, every problem looks like a nail,"
could cause us to wonder if we should be finding some new tools.
Perhaps, at the very least, the experiments are telling us
that something is wrong with our understanding,
like an experimental version of Reductio ad Absurdum in mathematics.
It might be that it is telling us that we are wrong to consider it to be weird,
and that things really do work that way;
or it might be that one (or more) of our axioms is wrong
(such as the existence of quarks and electrons, or forces like gravity).
It might be that because we design our experiments
to investigate the properties of photons,
that we appear to get answers that look like the properties of photons (NS, 24-Jul-2004, p30).
Toffoli makes a case
for why a lot of what we observe in our physics experiments
is an artefact of those experiments, or rather our model of what the universe is,
rather than intrinsic to the universe itself;
he shows that special relativity and general relativity might even be such artefacts.
Rather than our brains analysing incoming signals,
finding patterns of ever-increasing complexity,
and making sense of them by matching them against the internal representations,
it is the other way round (NS, 09-Apr-2016, p42; and also p20):
our brains generate the sensory data to match the incoming signals,
using internal models of the world (and body),
thereby giving rise to multiple hypotheses,
with the most probable one becoming tagged, Dennett-like, as being our perception.
This could be the mechanism whereby
hallucinations are signs of a correctly working brain in the absence of sufficient sensory input (NS, 05-Nov-2016, p28);
it also has parallels to the scientific method (coming up with models to explain the observed data,
actively setting out to observe new data,
and keeping the model no more complicated than necessary)
being not so much an invention of past philosophers,
as just the way that the human mind has been working, naturally, anyway.
Each hypothesis can then be refined in the light of the error signals that are generated.
The mere repeated occurrence of this process
might also give the brain a continuous assurance of its identity (NS, 03-Sep-2016, p33)
and to feed into other, what were previously considered metaphysical, questions (NS, 03-Sep-2016, p29).
The model of thinking,
including deductive logic,
that we have maintained since the ancient Greeks,
might need to be superseded (NS, 27-Feb-2016, p34).
Scientific method might be too stringent,
and might need to entertain theories
that will always be beyond experimental testing (NS, 27-Feb-2016, p38).
Boltzmann brains (NS, 28-Apr-2007, p33) are self-aware entities
in the form of disembodied spikes in space-time
(more common in regions of high entropy than low entropy).
Sean Carroll presents a counter-argument for their possibility (NS, 18-Feb-2017, p9).
It has been shown (NS, 15-Apr-2017, p8)
that this quirk of logic disappears under the Bohm interpretation.
Some sort of quantum Darwinism
acts so that the fittest wave-collapses predominate,
and give the impression of there being an objective reality to be measured by
independent observers (NS, 30-Jun-2007, p18).
There is also the suggestion that the laws of the universe are evolving (NS, 30-Jun-2007, p30),
and hence that the answer to a question like, 'Why these laws and not any others?'
is like asking, 'Why these species and not any others?'
Hartle considers this the other way round (NS, 01-May-2004, p34),
that we and the other species on the planet
have evolved to treat 'now' as a special concept
because the universe works that way
(for example, a force is a function of the present state,
with no memory of past or future states).
Similarly, free will is also rooted in the present
(the past has no flexibility when it comes to shaping the future).
However, memory is always involved:
as soon as a system consists of more than one component
(such as atoms in a crystal, or stars in a galaxy)
any probing (such as by a hot body, for example) of the state of the whole system
will obtain an almost immediate result from the nearest component,
but the time-delayed result from the components further away.
The speed-of-light limitation builds delay-line memory into the system,
and capacitive laws into the behaviour of the overall system.
The speed of information can be faster than the speed of light in the given medium
(faster than the group velocity),
albeit not greater than the speed of light in a vacuum (NS, 18-Oct-2003, p42).
Information is required to turn energy into work (NS, 23-Jun-2012, p8).
Deacon says that it is constraints that are required to turn energy into work,
and that constraints are measured by Shannon entropy.
Examples of hidden symmetry (NS, 03-May-2014, p36)
include a pencil that had been balancing on its point,
that could have fallen in any direction,
but in fact fell in this particular direction;
or a pack of cards that could have ended up shuffled in any order,
but whose symmetry is even broken by onlookers who single out a
particular sequence as being remarkable
(the same role that is played by survival-of-the-fittest
when the pack of cards is a randomly evolving genome).
As Deutsch and Marletto point out,
referring back to Shannon,
a message only contains information if an alternative (counterfactual) message had been possible;
and the process of wave-function collapse
(decoherence of a particle that was in a superposition of states) is another example.
Wheeler proposed, "It from Bit," (Zurek 1990),
and hence that all the matter of the universe is
made of information (NS, 17-Feb-2001, p26).
Analysis of Maxwell's Demon devices suggests
that E=S.k.T, where S is measured in nats,
or E=S.k.T.ln(2), where S is measured in bits.
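Putting numbers on those relations (a sketch using only the Boltzmann constant; at room temperature the energy per bit is the familiar Landauer limit):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

def demon_energy(S, T, unit="bits"):
    """Energy corresponding to S units of entropy at temperature T."""
    if unit == "nats":
        return S * k * T            # E = S.k.T with S in nats
    return S * k * T * math.log(2)  # E = S.k.T.ln(2) with S in bits

one_bit = demon_energy(1, 300.0)          # roughly 2.9e-21 J at room temperature
one_nat = demon_energy(1, 300.0, "nats")  # roughly 4.1e-21 J
print(one_bit, one_nat)
```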
This is borne out by the different types of engineering being based on their respective forms of energy
(within the context of the first law of thermodynamics),
each of which can be harnessed for energy transfer or for information transfer.
According to Davies,
applying the Hawking-Bekenstein formula for the entropy of a black-hole,
the information content of the universe, in nats,
is given by I_U=G.(M_U)²/(ℏ.c).
Further, since this would be expected to have increased to its present value,
starting at unity at one Planck time after the big bang,
this would lead to an information content that grows as roughly the square of the universe's age in Planck times.
Since the universe is now 10⁶⁰ Planck times old,
this gives an information content of about 2x10¹²¹ nats,
consistent with it having a mass of about 10⁵³ kg (NS, 16-Dec-2000, p26).
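The quoted figure can be checked by plugging round-number constants into the formula (the 10⁵³ kg mass is the value quoted in the text):

```python
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
hbar = 1.0546e-34  # reduced Planck constant, J.s
c = 2.998e8        # speed of light, m/s
M_U = 1e53         # mass of the universe, kg (round figure quoted in the text)

I_U = G * M_U ** 2 / (hbar * c)  # information content of the universe, in nats
print(f"{I_U:.1e} nats")         # of order 2x10^121, matching the quoted figure
```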
Formation of structure in the presence of energy flow
Perhaps we need a fourth law of thermodynamics (NS, 29-Oct-2005, p51).
Fourier's, Newton's and Stefan's laws of cooling start to put numbers on the
throughput of the energy transfer of the second law of thermodynamics, while
the velocity of light limitation of special relativity puts numbers on the
latency of that energy transfer.
Others have suggested how a new law of thermodynamics might sit alongside the existing ones:
- Putting aside the 0th and 3rd laws for the moment
- The 1st law says that energy cannot be created or destroyed, just shuffled about
- The 2nd law says that energy naturally flows from hot bodies to cold bodies
- The 4th law would say how fast that flow might be expected to be
- The 5th law would say how fast new structures (braid plains in the sand, DNA in the genome) build up.
What I think of as the 4th law is already answered, of course:
- Fourier says that heat is conducted from hot bodies to cold ones
at a rate proportional to the temperature difference
- Newton says that heat is convected from hot bodies to cold ones
at a rate proportional to the temperature difference
- Stefan says that heat is radiated from hot bodies to cold ones
at a rate proportional to the difference of the fourth powers of the temperatures
So, the law would really be concerned with the values of those constants of proportionality.
From experience, such as with microprocessor chips on PC motherboards,
we know that the constant for radiation is very small,
dwarfed by that of any conduction routes that are available to the heat,
and that the constant for convection is bigger still.
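A rough comparison of two of those constants of proportionality for a chip-sized hot surface (all values here are assumed, order-of-magnitude figures: 1 cm² area, 70 °C case, 25 °C ambient, emissivity 0.9, and a modest forced-air convection coefficient):

```python
sigma = 5.67e-8              # Stefan-Boltzmann constant, W m^-2 K^-4
area = 1e-4                  # 1 cm^2 chip surface (assumed)
t_hot, t_amb = 343.0, 298.0  # 70 C case, 25 C ambient, in kelvin (assumed)
eps = 0.9                    # emissivity (assumed)
h = 25.0                     # convection coefficient, W m^-2 K^-1 (assumed)

# Stefan: radiated power goes as the difference of the fourth powers.
p_rad = eps * sigma * area * (t_hot ** 4 - t_amb ** 4)
# Newton: convected power goes as the plain temperature difference.
p_conv = h * area * (t_hot - t_amb)

print(p_rad, p_conv)  # tens of milliwatts radiated versus ~100 mW convected
```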
For the 5th law, the Deacon book suggests that maybe this needs to be split in two (so maybe a 5th and 6th law).
He has a name (teleological) for systems
that build up structure to flow-away the available heat difference as fast as possible,
like convection currents, Bénard cells, and my braid plains in the sand;
and another for the next level of complexity up from that,
of systems that try to conserve the reservoir of energy difference for their own perpetuation (like living cells).
For the first two,
the constant of proportionality, C, is a function of the ambient temperature, C(T).
The indication is that the constant of proportionality is also punctuatedly, but monotonically,
dependent on time, C(T,t).
The simplest single-celled microbes,
their pre-cellular precursors,
and perhaps even non-biological matter,
would find themselves being structured to make use of this source of energy (NS, 18-Mar-2017, p11).
Already, in the actual universe,
complete with a functioning second law of thermodynamics,
the universe is far from its equilibrium state,
thereby allowing structures to form
that aid the flow of energy towards the universe's equilibrium state.
Examples of these spontaneously forming structures include
convection currents (NS, 21-Jan-2012, p35),
braid plains (NS, 02-Sep-2000, p97),
planetary weather systems (NS, 06-Oct-2001, p38),
and DNA-based life (NS, 09-Jun-2001, p32).
Viewed in this way,
humans are just the latest inadvertently evolved structures
that help the universe use up its surplus energy supplies (NS, 05-Oct-2002, p30),
which is a role that human beings seem to be taking on with great enthusiasm.
Tsallis proposes a formula for computing the entropy of an out-of-equilibrium system
that happens to give the correct power-law answers (NS, 27-Aug-2005, p34).
In his formula, probabilities are expressed as p^q,
which generates Boltzmann statistics for systems that are close to equilibrium,
with q close to 1,
but also seems to work for higher values of q
that have external energy sources and that are far from equilibrium.
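The formula can be sketched directly (with Boltzmann's constant set to 1; as q approaches 1 the Tsallis form collapses onto the ordinary Boltzmann-Gibbs-Shannon entropy):

```python
import math

def tsallis_entropy(probs, q):
    """S_q = (1 - sum(p_i^q)) / (q - 1), with Boltzmann's constant set to 1."""
    return (1.0 - sum(p ** q for p in probs)) / (q - 1.0)

def shannon_entropy(probs):
    return -sum(p * math.log(p) for p in probs if p > 0)

probs = [0.5, 0.25, 0.25]
# Near q = 1 the two measures agree; higher q weights the probabilities differently.
print(tsallis_entropy(probs, 1.0001), shannon_entropy(probs))
```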
It is noted that other phenomena complicate the situation:
in its rush to use up the available energy as quickly as possible,
temporary, local structures form (NS, 21-Jan-2012, p35).
Convection currents set up an
orderly one-way flow to streamline the transfer of heat in a fluid that is
being heated at one end;
weather patterns form in planetary atmospheres;
and water flowing down a sandy beach
(fed by a lake at the head of the beach, for example)
builds up meanders, deltas and braid-plain structures (NS, 02-Sep-2000, p97).
The proposed extra law,
for the open, out-of-equilibrium system,
along with its emergent behaviour (NS, 17-Aug-2013, p28),
might take the form of a 'principle of increasing complexity'
that explains how quickly life evolved (NS, 09-Jun-2001, p32),
and consciousness too,
over and above Gould's observation as to why organisms tend to become more complex,
all as agents to use up the available energy faster (NS, 05-Oct-2002, p30).
In any haphazard system, unstable
structures will quickly die away (by definition), leaving the more stable ones
to persist (also by definition),
established like the nodes in a standing-wave,
giving the impression of a 'non-random eliminator' (NS, 07-Jan-2006, p36).
One strong candidate for the title of 'fourth law of thermodynamics'
would be the Principle of Maximum Entropy Production (NS, 06-Oct-2001, p38),
not too unlike the electronic engineering concept of 'matching',
that allows maximum power transfer (energy flow) from an input to an output.
The new law would need to indicate the rate at which structure is built up
(the rate at which sand grains are moved around by river flows,
and the rate at which complexity builds up in DNA-based genomes,
perhaps as considered by Lloyd)
in the presence of a given amount of energy flow under the second law of thermodynamics
(like a sort of Strouhal number for energy flow).
Zurek describes how S=H+K,
where H is the Shannon entropy of the given sequence (statistical uncertainty),
and K is the algorithmic complexity (algorithmic randomness),
with a gradual decrease in the former balanced by
a corresponding increase in the latter
(but only made possible in a far-from-equilibrium universe, such as ours).
He proposes that natural selection,
and the processes of evolution, act on K,
to keep it as low as possible,
but that it is only H that allows machines to do work,
and hence that for Maxwell's Demon, E=H.k.T.ln(2).
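The H-versus-K split can be illustrated crudely (K is uncomputable in general, so compressed length is used here only as a stand-in for algorithmic complexity): a repetitive string and a shuffled copy of it share the same symbol statistics, yet differ wildly in regularity.

```python
import random
import zlib

random.seed(1)
ordered = b"01" * 500      # highly regular: low algorithmic complexity
chars = list(ordered)
random.shuffle(chars)
scrambled = bytes(chars)   # same symbol counts, so the same frequency-based statistics

k_ordered = len(zlib.compress(ordered))      # compresses to almost nothing
k_scrambled = len(zlib.compress(scrambled))  # resists compression

print(sorted(ordered) == sorted(scrambled))  # identical statistics
print(k_ordered, k_scrambled)
```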
Bennett showed that it is the act of erasing a memory bit that incurs the cost.
Blank memory is like a cold sink at absolute zero,
with the lowest Shannon entropy, H,
even though writing useful memory subsequently increases the algorithmic information content,
and reduces the algorithmic entropy, K.
Sagawa and Ueda took this further,
and proposed (borne out by experimental results)
that an extra term needs to be added to account for mutual information,
to account for the way that
the act of measurement leads to
a correlation between the system and the apparatus or its memory (NS, 14-May-2016, p28).
Human activities can be viewed as convection currents in the biosphere,
spontaneously setting themselves up
to release energy that had been blocked or locked away in fossil reserves,
to smooth the flow of ever greater amounts of energy from the hot end
(the sun in our case, beating down on our planet's surface)
to the cold end (ultimately the CMB).
Similarly, the biosphere is a convection current in the inorganic atmosphere,
which, in turn, has its meteorology.
It appears that all these various levels of convection currents take on a fractal structure.
It is also possible to identify intermediate, finer-grain levels:
within human activity,
there are the moments
that the Agricultural, Industrial and Information Revolutions were sparked;
and within the biosphere,
there are successive nested levels caused by
the evolution of amphibians from fish,
reptiles from amphibians, and, specifically,
the emergence of eukaryotes in the first place.
(Compare this with Deacon's teledynamics, for energy channelling,
versus morphodynamics, for energy sequestering.)
Many argue that life in general, and RNA life in particular,
are extremely easy to get started (NS, 20-Aug-2016).
However, there is a suggestion that even revised versions of Drake's equation (NS, 25-May-2013, p6),
complete with revised definitions of the habitable zone (NS, 08-Jun-2013, p40),
are not yet allowing for the extreme
unlikelihood of the re-emergence of multicellular life (NS, 23-Jun-2012, p32)
precisely on the grounds of the laws of thermodynamics,
and the need for a eukaryote-like serendipitous discovery of how to farm mitochondrial energy
packs, either by phagocytosis or by extrusion from the inside (NS, 14-Feb-2015, p28).
Others, though, suggest that the step from single cell to
multicell organisms was not such an unlikely event after all (NS, 27-Jun-2015, p11).
Emergence of the physical laws
Ever since the universe was more than 5.39x10⁻⁴⁴ seconds old,
it appears to have adhered to all our current laws of physics.
Since the laws of physics of the universe reside inside the universe,
not as some abstract concept outside,
they can only be as precise
as can be calculated from the total information content of the universe.
Allowing our theories to work with an infinite number of numbers,
quotable to an infinite number of decimal places,
shows that mathematics is an approximation of the universe
rather than the other way round (NS, 17-Aug-2013, p32).
Deutsch and Marletto propose a way
in which the laws of physics arise as 'constructors' that work on information
in a quantum computing view of the universe (NS, 24-May-2014, p30).
Not only would this offer an explanation for how the laws of physics arise, but it could
also give an explanation of what constitutes knowledge (NS, 08-Apr-2017, p30),
and the role of a knowledge creator (like a conscious mind).
Since the universe's information content is limited by its size,
in the early universe, immediately after the big bang,
the universe was still small, and more amenable, by retrocausality, for example,
to being fine-tuned for the conscious beings that would eventually evolve to observe it.
In retrocausality versions of the double-slit experiment (NS, 30-Sep-2006, p36),
the photons pass the shutter with the double-slit in it,
and then traverse the distance towards the screen.
This takes enough time, because of the speed of light limit,
that the experiment can be changed between being interested in photographing the screen,
where a pattern of wave-interference fringes would be set up, or photographing the slits,
where the photons can be witnessed as particles passing through one slit or the other.
Power is the rate of change of energy in time,
and, by the same token, force is the rate of change of energy in distance;
but force is also rate of change of momentum in time.
The following lattice carries this further,
with the aim of illustrating one aspect of
the correspondence between space and time:
              Z
             / \
            L   mx
           / \ / \
          E   p   m
         / \ / \ / \
        P   F   X   Y
The lattice has been arranged so that by starting at any given node,
travelling SW involves taking the d/dt, travelling SE involves taking the d/dx
(and vice versa with integrals for travelling NE and NW).
In the lattice: P=power, E=energy, F=force, p=momentum, m=mass, L=action,
mx=mass*distance, X=dp/dx=dm/dt, Y=dm/dx and Z=∫L.dt=∫m.x.dx.
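The lattice's internal consistency can be checked by dimensional analysis alone. In the sketch below (the tuple representation is my own), each quantity is a triple of exponents of (kg, m, s); travelling SW (d/dt) lowers the seconds exponent by one, and travelling SE (d/dx) lowers the metres exponent by one:

```python
# Each quantity as exponents of (kg, m, s).
Z  = (1, 2, 0)    # integral of action over time
L  = (1, 2, -1)   # action
mx = (1, 1, 0)    # mass * distance
E  = (1, 2, -2)   # energy
p  = (1, 1, -1)   # momentum
m  = (1, 0, 0)    # mass
P  = (1, 2, -3)   # power
F  = (1, 1, -2)   # force
X  = (1, 0, -1)   # dp/dx = dm/dt
Y  = (1, -1, 0)   # dm/dx

def d_dt(q):  # travelling SW: one time derivative
    return (q[0], q[1], q[2] - 1)

def d_dx(q):  # travelling SE: one space derivative
    return (q[0], q[1] - 1, q[2])

# Every edge of the lattice checks out dimensionally.
assert d_dt(Z) == L and d_dx(Z) == mx
assert d_dt(L) == E and d_dx(L) == p == d_dt(mx)
assert d_dx(mx) == m
assert d_dt(E) == P and d_dx(E) == F == d_dt(p)
assert d_dx(p) == X == d_dt(m)
assert d_dx(m) == Y
print("lattice is dimensionally consistent")
```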
For over a century,
the academic world has concentrated on the role of the specialist:
experts who know more and more about a narrower and narrower field.
This is inevitable;
the capacity of each human mind is finite,
so the depth of the knowledge can only be increased if the width is restricted.
However, the world still needs generalists,
whose knowledge is not so deep, but integrated over a far wider span of subjects.
Such people have, at their disposal, a very powerful method of problem-solving,
and are able to find solutions to problems by analogy and cross-pollination across otherwise unrelated subjects.
In addition, they are, indeed, also the integrators,
bringing the work of the specialists together to the benefit of society.
Although not a formal academic source,
the New Scientist magazine is a particularly appropriate one to use as a resource for this.
As a weekly magazine in popular science,
it provides an overview across a wide spectrum of the newest developments in science and engineering
(and, in any case, can, of course, be followed up later in the more formal academic references).
The aim is to find a "whole that is greater than the sum of the parts",
but keeping aware that this will sometimes be confounded by Gödel-like inconsistencies.
Bootstrapping the laws of physics
We are fortunate in the way that inductive logic works in our universe:
the chemical experiment or the clinical trial that we do today,
in the controlled environment of the laboratory,
are valid indications of how they will work tomorrow in the shopping mall.
Not only are these symmetries already very useful in their own rights,
but Noether showed that they lead to the conservation laws:
the fact that the laws of physics apply equally
when displaced in space (the Copernican principle)
leads to the law of conservation of momentum;
the fact that the laws of physics apply equally
when displaced in time leads to the law of conservation of energy (the first law of thermodynamics).
Taken together, these lead to the second law of thermodynamics
(two balls into a Newton's cradle leads to two balls out, rather than one ball twice as high,
or three balls two-thirds as high, for example).
With Newton's cradle,
it all gets sorted out in the split second when the shock wave of the incoming balls
causes all five balls to bounce around off each other like the balls in a Landauer reversible computer.
A Newton's cradle made of 56g of iron in the five balls
would have 56*3*6x10²³ quarks and 26*6x10²³ electrons
(1x10²⁶ quarks and 1.6x10²⁵ electrons).
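Those counts follow from Avogadro's number (56 g of iron is very nearly one mole, with 56 nucleons of three quarks each, and 26 electrons, per atom):

```python
N_A = 6.022e23       # Avogadro's number, atoms per mole
moles = 56.0 / 56.0  # 56 g of iron at 56 g/mol: one mole of atoms

atoms = moles * N_A
quarks = atoms * 56 * 3  # 3 quarks per nucleon, 56 nucleons per atom
electrons = atoms * 26   # iron's atomic number

print(f"{quarks:.1e} quarks, {electrons:.1e} electrons")
```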
It follows that the particles in a Landauer computer are without a sense of time,
in their reversible collisions,
but it is the levels of abstracted behaviour above that
that experience time, thinking and consciousness, as emergent behaviour.
(The contribution of entanglement and leading-diagonal decoherence might affect the conclusion
for the Landauer reversible computer, though.)
This then implies Heisenberg's uncertainty principle, as noted earlier,
by expressing it in terms of information theory (NS, 23-Jun-2012, p8).
Indeed, the thought experiment of throwing a stone into a pond already hints at this connection.
The kinetic energy of the stone arriving at one point on the pond's surface
means that that energy is initially concentrated,
and will either tend to diverge from there
by the statistical mechanics of the second law of thermodynamics,
or by the uncertainty in duration for the concentration of energy at a restricted point.
This centres on either an E=kT or an E=hf relationship, respectively.
Once launched divergently,
the energy continues to flow in a radial straight line, unless acted on by another force
(Newton's first law of motion, but as modified by Einstein and by Schrödinger's equation).
Heisenberg's uncertainty principle then implies vacuum energy
(the uncertainty of the energy, and hence the energy itself, can never reach zero)
and also the third law of thermodynamics
(the uncertainty of the temperature, and hence the temperature itself, can never reach zero (NS, 18-Mar-2017, p10)).
The existence of the vacuum energy might then have implied dark energy,
had it not been a massive 120 orders of magnitude out in its experimental predictions (NS, 01-Nov-2003, p34).
The 120 orders of magnitude discrepancy might be due to a missing leakage term (NS, 27-May-2017, p28).
Special relativity follows from the Principle of Relativity
(constant motion of the system
cannot be determined by observations made completely inside the system),
and the Relativity of Simultaneity
(simultaneous events in one context are not simultaneous in another).
General relativity follows from generalising the Principle of Relativity still further
(acceleration cannot be distinguished from a gravitational field).
Peres showed that the axioms of quantum mechanics
necessarily lead to the second law of thermodynamics,
and to Schrödinger's equation being linear.
In line with,
but slightly contrary to the article in the 13-Oct-2012 issue of New Scientist
magazine, Noether's Theorem (or perhaps the principle of least action) might, instead,
constitute the long-sought theory of everything.