In 2012, history was written. CERN’s Large Hadron ColliderFootnote 1 (LHC) had detected the signature of an elusive new particle in the deep fabric of reality. This revolutionary finding confirmed the final missing particle anticipated by the hugely successful standard model of particle physics (Sect. 4.4). This amalgamation of ideas (Sects. 4.2 and 4.3) predicted the existence of the Higgs boson, based on what is called the Higgs mechanism (Sect. 4.2.1), a theory developed in the 1960s. The following statement can be read on CERN’s webpageFootnote 2:

On 4 July 2012, the ATLAS and CMS experiments at CERN’s Large Hadron Collider announced they had each observed a new particle in the mass region around 126 GeV. This particle is consistent with the Higgs boson predicted by the Standard Model. The Higgs boson, as proposed within the Standard Model, is the simplest manifestation of the Brout-Englert-Higgs mechanism.

Indeed, a momentous discovery. Again, from CERN’s webpage (See footnote 2):

On 8 October 2013 the Nobel prize in physics was awarded jointly to François Englert and Peter Higgs “for the theoretical discovery of a mechanism that contributes to our understanding of the origin of mass of subatomic particles, and which recently was confirmed through the discovery of the predicted fundamental particle, by the ATLAS and CMS experiments at CERN’s Large Hadron Collider.”

Unfortunately, the fourth of July 2012 was a bad day for physics. In the words of mathematical physicist Peter Woit (quoted in Brockman 2015, p. 72):

The observation at the LHC of the Higgs [...] has caused great consternation among theorists. Something has happened that should not have been possible according to the forty-year-old reasoning now well-embedded in textbooks.

In essence, the discovery of that particular flavor of Higgs boson was the worst observation possible. It confirmed, and fully exposed, a deep schism between theoretical physics and reality. After all the breathtaking success of physics in decoding the intimate workings of reality in the last century (Chaps. 3 and 4)—indeed, after over three centuries of unstoppable triumphal procession (Chap. 2 and Sects. 5.1 and 5.3)—the whole abstract machinery threatens to grind to a halt. At the core of this dissonance lies the apparent impossibility of constructing a quantum theory of gravity. Quantum gravity, unexpectedly, emerged as the elusive, but highly anticipated, holy grail of physics, as it would represent the last missing step fully unifying all the physical forces in the universe (Sect. 4.3)—a neat “theory of everything.”

The standard model of particle physics, albeit an incredibly accurate theory, does not include gravity in its mathematical representation of reality. Theoretical physicists have been grappling with this omission since the late 1960s, when string theory was born (Sect. 4.3.2). However, for string theory—and M-theory, its modern incarnation—to work, reality has to display some very particular properties (discussed below). Disappointingly, the “plain vanilla” Higgs particle that was discovered “threatens to close a chapter of 20th century physics without a hint of how to start writing the next page” (Cliff 2013). We are stuck with two spectacularly accurate fragments of isolated knowledge which simply won’t mesh. The standard model and general relativity (introduced below) are at insurmountable odds with each other and no experimental hint is in sight. We are left in the dark, knowing that the cone of light representing our knowledge is only illuminating a limited part of reality. Ignorance abounds.

The problem of quantum gravity is, however, only one of the failings that appear to be bringing modern theoretical physics to its knees. This chapter will illuminate this crisis in understanding. In light of these revelations, even the most sympathetic defenders of knowledge should acknowledge the feelings of gloom expressed in the last chapter. Namely, that certainty appears futile, explanations seem useless, and all knowledge is ultimately based on that which we cannot prove. Every answer we pry from nature is met by the appearance of a handful of deeper and harder questions (Sect. 9.2.3). Science never truly was the endeavor to unearth the “absolute truth,” but represents an incremental, approximate, and fallible approach to reality (Sects. 9.1.6 and 9.3). Indeed, science is a complex social human undertaking (Sects. 9.1.3 and 9.1.5), plagued by all the shortcomings affecting any human effort to organize and collaborate (Sect. 9.2.2). Finally, mathematics is inherently flawed, rendering it a questionable foundation for science (Sect. 9.4). The clouds on the horizon (Chap. 8) have become frighteningly dark skies.

Before addressing the challenge of quantum gravity and beyond, a selection of open questions in physics is presented. This should convey the scope and depth of the problems facing the human mind in its quest to comprehend the universe. Perhaps the following questions can never be answered:

  • Why do three spatial dimensions appear to exist?

  • Why does the nature of space and time depend on how it is observed from a reference frame (i.e., the malleable fabric of space-time)?

  • Why the quantum nature of the atomic realm?

  • Why is the quantum realm so utterly bizarre and alien to our conceptualization?

  • Why are the values of the fundamental constants what they are?

  • What principle lies behind the self-organizing structure formation seen at all scales?

  • Why the zoo of elementary particles? Indeed, when the plethora of new subatomic entities emerged, the Nobel laureate Isidor Rabi quipped, “Who ordered that?”

  • Do protons decay?

  • Why is there an arrow of time?

  • What is the nature of time?

  • What physics lies at the heart of the mathematical singularities, where the formalism is incapable of penetrating reality any further?

  • What happened at (or even before) the Big Bang?

  • Is our universe infinite or finite in extent?

  • Why does the universe appear to be left-handed, harboring left-handed life (i.e., the origins of chirality)?

  • Why all the cosmic coincidences (Sect. 8.1.3)?

More specifically and technically:

  • Why is the universe not made up of equal parts of matter and antimatter (Sakharov 1967), as the Big Bang produced an equal mixture of both? [Baryogenesis, baryon asymmetry]

  • Why does an anomaly in the cosmic microwave background radiation appear to give special significance to the location of Earth within the entire universe (Sect. 10.3.1)? [Unfortunately named the “axis of evil”]

  • Why does one of the oldest galaxies ever to be observed (EGSY8p7) appear to contradict the current cosmological narrative of the universe (Zitrin et al. 2015)?

  • What is the origin of the dark spot detected in the cosmic microwave background radiation (Cruz et al. 2005), which appears unexplained within the standard cosmological model (Mackenzie et al. 2017)?

  • Is the origin of gold and other heavy elements due to the collision of neutron stars (Perkins 2018)?

  • What is the nature of the energy density of empty space? [Dark energy; see Sect. 10.3.1]

  • What is the nature of the majority of the unknown matter content of the universe? [Dark matter; see Sect. 10.3.1]

  • What is the connection between information and black holes? [Black hole information paradox; see Sect. 13.4.1]

  • Why is gravity so weak? [Hierarchy problem; see below]

  • Why do we observe homogeneity of causally disconnected regions of space? [Horizon problem; see below]

As expected, many potential answers to these questions have been offered. For instance, the dimensionality of space could have a mathematical underpinning, related to the emergence of complexity. Distortionless wave propagation is only possible in an odd number of dimensions, and radially symmetric wave propagation can only occur in one- or three-dimensional space (Morley 1985). Furthermore, the strength of gravity in three dimensions falls off with the square of the distance between massive objects. In two dimensions, it falls off with the first power of the distance, whereas in four dimensions it falls off with the distance cubed (see the sketch below). In essence, in a two-dimensional world gravity would be too strong, and in four dimensions too weak, for the formation of complex structures in the universe. Or perhaps the dimensionality of space is constrained by the second law of thermodynamics and entropy (Gonzalez-Ayala et al. 2016). Even more intriguingly, the four-dimensional fabric of space-time has very special properties. Mathematically, it is described by a manifold. In general, equipping manifolds with so-called smooth structures allows for rigorous mathematical analysis on them. Space-time, i.e., the abstract 4-manifold representing it, allows for infinitely many (i.e., an uncountable number of) such smooth structures. In all other dimensions, there exists only a finite number. Moreover, the smooth Poincaré conjecture has been settled in all dimensions other than four. In a similar vein, expressed in the technical language of topology, for a four-dimensional cobordismFootnote 3 defined on 4-manifolds, it is unknown whether a specific theoremFootnote 4 holds. Does abstract mathematical richness translate into emergent physical complexity (Donaldson and Kronheimer 1990; Friedman and Morgan 1998; Scorpan 2005)?
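The dimensional scaling of gravity follows from a standard flux argument; a minimal sketch in d spatial dimensions: the gravitational flux through a surrounding sphere of radius r is conserved, while the sphere’s surface grows as \(r^{\,d-1}\), so

$$\begin{aligned} F(r) \, S_{d-1} \, r^{\,d-1} = \text {const.} \quad \Longrightarrow \quad F(r) \propto \frac{1}{r^{\,d-1}}, \end{aligned}$$

where \(S_{d-1}\) denotes the surface area of the unit sphere. For \(d = 2, 3, 4\) the force thus falls off as 1/r, \(1/r^2\), and \(1/r^3\), respectively.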

For the hierarchy problem, supersymmetry (Sect. 4.3.2) has been invoked. This new symmetry property of realityFootnote 5 is also a prerequisite for string theory. Indeed, many physicists had hoped that the LHC would produce some evidence of supersymmetry. Finally, the horizon problem in cosmology is addressed by what is known as inflation. This is a postulated exponential, but extremely brief, expansion of space in the early universe, around \(10^{-36}\) seconds after the Big Bang singularity (Guth 1981; Collins et al. 1989; Peacock 1999; Peebles 1993; Penrose 2004). A generic explanation for all the apparent coincidences and opaque aspects of existence is the Anthropic Principle. It simply states that all theories of the universe must be constrained by the necessity to allow human consciousness to emerge. For instance, in the words of Andrei Linde, known for his theories on cosmic inflation (quoted in Brockman 2015, p. 46):

There are many strange coincidences in our world. The mass of the electron is 2,000 times smaller than the mass of the proton. Why? The only “reason” is that if it were even a little different, life as we know it would be impossible. The masses of the proton and neutron almost coincide. Why? If the masses of either were even a little different, life as we know it would be impossible. The energy of empty space in our part of the universe is not zero, but a tiny number—more than a 100 orders of magnitude below the naïve theoretical expectations [zero-point energy]. Why? The only explanation we have is that we couldn’t live in a world with a larger vacuum energy.

This ludicrous discrepancy between the observed density of the vacuum and the calculated zero-point energy of quantum fields prompted the Nobel laureate Steven Weinberg to call it “the worst failure of an order-of-magnitude estimate in the history of science” (quoted in Jones and Lambourne 2004, p. 355). The core of this enigma relates to the failings of the human mind in conceiving a quantum theory of gravity, a drama unfolding on the main stage of theoretical physics for decades.

1 The Worst Prediction in Physics

Totally empty space is not empty at all. This is a consequence of one of the fundamental, and strange, laws of quantum mechanics. Heisenberg’s uncertainty principle describes this behavior, which is related to knowledge, information, and, evidently, certainty. The uncertainty principle states that there exists a fundamental limit to the precision with which certain pairs of physical properties of a particle can be known (Heisenberg 1927). This lack of information is, however, not due to any lack of human ability or ingenuity, but represents a fundamental limit to how much knowledge reality is willing to reveal. For instance, time and energy are two such complementary properties. The smaller the time window in which a particle is observed, the less certain we can be of its energy state during that time. This is mathematically codified as the time-energy uncertainty relation

$$\begin{aligned} \varDelta t \varDelta E \ge \frac{\hbar }{2}. \end{aligned}$$
(10.1)
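As a rough worked example of (10.1): conjuring up a virtual electron-positron pair requires borrowing an energy of \(\varDelta E \approx 2 m_e c^2 \approx 1.6 \times 10^{-13}\) J from the vacuum, which the uncertainty relation tolerates for at most

$$\begin{aligned} \varDelta t \lesssim \frac{\hbar }{2 \varDelta E} \approx 3 \times 10^{-22} \; \text {s}, \end{aligned}$$

an unimaginably brief flicker of existence.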

In a vacuum, all quantum fields are in their zero-energy state and hence no particles are manifested. However, the loophole of the uncertainty principle allows for the temporary manifestation of particles, which exist only so briefly as to not violate it. These vacuum fluctuations represent an inherent fuzziness in the amount of energy contained at every point in space: the quantum vacuum is a seething ocean of activity. As a result, the energy content of empty space—the vacuum energy density—and the lowest energy a quantum field can have—the zero-point energy—are both larger than zero. However, the theoretically computed and the empirically inferred values of this vacuum energy turned out to diverge by an extraordinarily large factor (Adler et al. 1995). In Weinberg’s view, this represented the worst failure of any scientific estimate.

The unexpected energy density of the vacuum has important consequences for general relativity (describing gravity) and cosmology. Indeed, it caught Albert Einstein off guard and led to one of the greatest mysteries of the cosmos, represented by the cosmological constant (and dark energy), discussed below. On the other hand, the huge divergence between the theoretical and empirical values of the vacuum energy exposed a glaring flaw in quantum field theory, one of the most successful theories known to the human mind. Today, this is one of the greatest unsolved problems in physics. In essence, we desperately need a theory of quantum gravity to resolve these enigmas.

1.1 The Quantum Field

In the formalism of quantum field theory, space itself is comprised of fundamental quantum fields, one for each type of existing elementary particle. Vibrations in these fields manifest themselves as physical entities. For instance, an observed electron is, in the language of this formalism, simply a localized oscillation in the corresponding quantum field. In Fig. 4.1 all the existing particles, described by quantum fields, are listed. The matter fields are classified as fermions due to their half-integer spin. These are the quarks (making up all composite matter, like neutrons or protons) and leptons (e.g., the electron). The three non-gravitational forces are associated with (gauge) bosons, carrying spin 1. Virtual photons (\(\gamma \)) mediate the electromagnetic force, virtual gluons (g) the strong nuclear force, and virtual Z and \(W^\pm \) bosons the weak force. The Higgs particle (h) is a (scalar) spin-0 boson, responsible for generating the mass of particles via the Higgs mechanism (Sect. 4.2.1).

To compute the energy density of the vacuum in quantum field theory, the following intuitive reasoning is used. An energy density is generally defined as the energy per volume. As every point in space represents a potential particle oscillation in quantum field theory, all such zero-point energy contributions need to be summed up. This can be analytically expressed utilizing the oscillation frequency \(\omega \) of all possible oscillators, yielding the energy density of the vacuum in quantum field theory to be

$$\begin{aligned} \rho _{\text {qft}} \propto \int _0^{\tilde{\omega }} \omega ^3 d\omega , \end{aligned}$$
(10.2)

where \(\rho _{\text {qft}}\) depends on a frequency cut-off \(\tilde{\omega }\) required to make the result finite. Frequency and energy are fundamentally related concepts and are linked via the Planck-Einstein relation \(E=\hbar \omega \). The Planck energy \(\bar{E}\) represents the energy scale at which elementary particles are also expected to be affected by general relativity. It is thus the likely threshold of quantum gravity. Inserting the associated frequency into (10.2) results in a vacuum energy density of

$$\begin{aligned} \bar{\rho }_{\text {qft}} \approx 10^{76} [\text {GeV}]^4 \approx 10^{114} \; [\text {erg}/\text {cm}^3]. \end{aligned}$$
(10.3)

See Rugh and Zinkernagel (2002) for details. Physicists initially acknowledged this huge value with skepticism. However, the true absurdity of the number only became apparent after it was possible to empirically estimate the vacuum energy. This turn of events astonished physicists. After all, quantum field theory had made one of the most accurate predictions in science: the Lamb shift (Lamb and Retherford 1947). Furthermore, in the history of quantum field theory, all the problems appearing in the formalism could always be reconciled in some way. Unfortunately, this was not the case for the vacuum energy.
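The order of magnitude in (10.3) can be verified in a few lines of code. A minimal Python sketch, assuming a sharp cutoff at the Planck energy and dropping all prefactors of order unity:

```python
# Order-of-magnitude estimate of the QFT vacuum energy density with a
# Planck-energy cutoff, reproducing the scales in (10.3).
import math

hbar = 1.054571817e-34      # reduced Planck constant [J s]
c = 2.99792458e8            # speed of light [m/s]
G = 6.67430e-11             # gravitational constant [m^3 kg^-1 s^-2]

# Planck energy, the assumed threshold of quantum gravity.
E_planck_J = math.sqrt(hbar * c**5 / G)            # ~1.96e9 J
E_planck_GeV = E_planck_J / 1.602176634e-10        # ~1.22e19 GeV

# In natural units, the integral (10.2) gives rho ~ cutoff^4.
rho_GeV4 = E_planck_GeV**4                         # ~2.2e76 GeV^4

# Convert GeV^4 to erg/cm^3 via (hbar*c) = 0.19733 GeV fm:
# 1 GeV = 1.602e-3 erg, 1 fm^3 = 1e-39 cm^3.
hbar_c = 0.1973269804                              # [GeV fm]
rho_erg = rho_GeV4 / hbar_c**3 * 1.602176634e-3 / 1e-39

print(f"rho_qft ~ {rho_GeV4:.1e} GeV^4 ~ {rho_erg:.1e} erg/cm^3")
# -> rho_qft ~ 2.2e+76 GeV^4 ~ 4.6e+114 erg/cm^3
```

The prefactor-free estimate lands squarely in the ballpark of (10.3); more careful treatments shift the exponent by a few units, which hardly matters at this level of absurdity.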

Indeed, quantum field theory has always been a messy affair. Vastly complex calculations emerged from its mathematical underbelly and often these led to meaningless infinities. The first attempts to tame the complexity came in the form of approximations. Perturbation theory allowed physicists to find solutions to problems by starting from the exact solution of a related, albeit simpler, problem. The exact value is approached by adding many small perturbations. However, infinities still plagued the formalism. In a next step, a trick was utilized to tame these as well. Renormalization is a collection of techniques which capture the infinite terms of quantum field theory in finite experimental numbers. However, for every infinity to be treated, laboratory measurements are required. For details on perturbation theory and renormalization, see, for instance, Peskin and Schroeder (1995). Finally, the breakthrough came from an unexpectedly simple, and somewhat strange, approach.

In 1942, a young Richard Feynman presented his thesis in which he offered a novel interpretation of quantum mechanics (Feynman 1942). This work laid the foundation for what became known as the path integral formulation (Feynman 1948; Feynman and Hibbs 1965). It is a description of quantum theory that generalizes the action principle of classical mechanics. This action is defined as an integral over time, taken along the path of the system’s evolution

$$\begin{aligned} \mathcal {S} = \int _{t_1}^{t_2} L dt, \end{aligned}$$
(10.4)

where L is the Lagrangian describing the system (Sect. 3.1.1). By minimizing the action the equations of motion can be derived.Footnote 6 In effect, Feynman’s quantum paths track all possible paths between two locations, where each path adds to the probability amplitude.Footnote 7 Of all the infinite potential paths a particle can take, most cancel out and only observable ones remain. Loosely stated, the path integral approach is like a modified double-slit experiment, where there are infinitely many slits on infinitely many screens.
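Concretely, and in a standard sketch: demanding that the action be stationary, \(\delta \mathcal {S} = 0\), yields the classical equations of motion, while Feynman’s prescription instead weighs every conceivable path by a phase

$$\begin{aligned} \frac{d}{dt} \frac{\partial L}{\partial \dot{x}} - \frac{\partial L}{\partial x} = 0, \qquad K(b,a) = \int \mathcal {D}[x(t)] \, e^{i \mathcal {S}[x] / \hbar }, \end{aligned}$$

where the propagator \(K(b,a)\) sums the amplitudes of all paths from a to b. Paths far from the classical one interfere destructively, which is how classical mechanics re-emerges in the limit.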

Inspired by this success, Feynman ventured on. If all potential paths need to be considered between two locations for the proper dynamics of quantum particles to emerge, why not consider all possible events unfolding between measurements to understand interactions? By postulating that all events that could occur between measurements will occur, the fundamental key to quantum field theory was found. The exact mathematical expressions corresponding to this somewhat hand-waving assertion are found in the famous Feynman diagrams (Feynman 1949; Veltman 1994). In essence, the elementary diagrams are shorthand for exact mathematical expressions. Now these compellingly simple diagrammatic rules guide the incredibly intricate mathematics of quantum field calculations. Moreover, Feynman diagrams plus renormalization solve the problem of the bothersome infinities and yield highly accurate calculations. A key ingredient in Feynman diagrams is the notion that a positron (the electron’s antiparticle) is understood as being an electron moving backwards in time.Footnote 8 Moreover, virtual particles, existing in a meta-reality below the threshold of the uncertainty principle, are the drivers of the interactions in quantum field theory. See Fig. 10.1 for an example of a Feynman diagram. It corresponds to the following contribution to the total probability of two electrons scattering

$$\begin{aligned} \mathcal {M}=\bar{u}_1 ie\gamma ^{\,\mu } u_1 \frac{-ig_{\mu \nu }}{p^2} \bar{u}_2 ie \gamma ^{\nu } u_2, \end{aligned}$$
(10.5)

where \(u_i\) represent the initial electron quantum states, \(\bar{u}_i\) the final ones, each vertex contributes an interaction term \(ie\gamma ^{\,\sigma }\), and \(-ig_{\mu \nu }/p^2\) describes the virtual photon (Peskin and Schroeder 1995). In the end, our interpretation of the entities and mechanisms appearing in the Feynman diagrams is irrelevant; only the topology of each diagram has physical relevance. In other words, every vertex and internal line contributes its factor to the probability amplitude.

Fig. 10.1 One possible Feynman diagram for two electrons \(e^-\) scattering by interacting via a virtual photon \(\gamma \). The corresponding mathematical expression is given in (10.5)

Out of this framework, modern quantum field theory emerged, the most predictive formulation of quantum mechanics (Kaku 1993; Peskin and Schroeder 1995; Ryder 1996). It was a fertile ground from which the quantum theory of electrodynamics sprang, describing all interactions involving electrically charged particles by means of the exchange of photons (Feynman 1985). Later, quantum chromodynamics blossomed, a theory describing the strong interaction between quarks and gluons, the fundamental particles that make up composite matter (hadrons) such as protons and neutrons (Greiner et al. 2007). The culmination of all non-gravitational forces in a single quantum field framework is the standard model of particle physics (Sect. 4.2, especially the Higgs mechanism seen in Sects. 4.2.1, 4.3 and 4.4). Yes, those were the days, when theoretical physics progressed like a puzzle being assembled, where every new piece neatly fit into the growing whole. After this spectacular success, no wonder physicists expected the quantum theory of gravity to be around the corner.

On a side note, quantum fields were encountered at different stages in the narrative of this book. For instance, they appeared in the contexts of Noether’s theorem (Sect. 3.1.4), the Lorentz group (Sect. 3.2.2.1), and the history of gauge theory (Sect. 4.2). An overview of the conceptual developments of field theory—from the field concept in general relativity to quantum and gauge fields—can be found in Cao (1998).

1.2 Einstein’s Biggest Blunder

How could physicists gauge how bad the calculation of the quantum field vacuum energy really was? In other words, what should \(\bar{\rho }_{\text {qft}}\) be compared to? The answer comes from cosmology, and it is associated with a telling story in the development of general relativity.

The theory of general relativity is perhaps the most aesthetically pleasing theory in physics (Einstein 1915; Misner et al. 1973). It expresses deep intuition about the workings of realityFootnote 9 in the language of differential geometry (Fig. 5.3). Next to quantum field theory, it is the most accurate and successful theory describing the universe. From a physical principle—the equivalence principle, Einstein’s “happiest thought of his life” (Sect. 4.1)—a mathematical formalism is developed, guided by the powers of symmetry (Chap. 3). In detail, the principle of covariance is invoked (Sect. 4.1). Einstein killed the classical force of gravity and resurrected it as the curvature in the four-dimensional space-time continuum.

The gravitational field equations are

$$\begin{aligned} G^{\mu \nu } = - \frac{8 \pi G}{c^4} T^{\mu \nu }, \end{aligned}$$
(10.6)

relating the Einstein tensor \(G^{\mu \nu }\) to the energy-momentum tensor \(T^{\mu \nu }\). The constant G is Newton’s gravitational constant and c is the speed of light in a vacuum. Usually, the energy-momentum tensor of a perfect fluid is employed in this context

$$\begin{aligned} T^{\mu \nu } = (\rho + \frac{p}{c^2}) u^\mu u^\nu - p g^{\mu \nu }, \end{aligned}$$
(10.7)

where \(\rho \) and p denote the density and pressure, respectively, of a fluid with 4-velocity \(u^\mu \). The Einstein tensor encodes the geometry of space-time

$$\begin{aligned} G^{\mu \nu } = R^{\mu \nu } - \frac{1}{2} g^{\mu \nu } R, \end{aligned}$$
(10.8)

utilizing the Ricci tensor \(R^{\mu \nu }\) and the curvature scalar R. The Ricci tensor itself is derived from the Riemann tensor \(R^\sigma _{\; \, \mu \nu \lambda }\)

$$\begin{aligned} R_{\mu \nu } = R^\lambda _{\; \, \mu \nu \lambda }, \end{aligned}$$
(10.9)

while the curvature scalar is a contraction of the Ricci tensor

$$\begin{aligned} R = g^{\mu \nu } R_{\mu \nu }. \end{aligned}$$
(10.10)

Finally, the Riemann tensor is a function of the Christoffel symbols \(\Gamma ^{\sigma }_{\; \, \mu \lambda }\) (Sects. 3.1.1 and 4.1) which themselves are defined through the metric \(g_{\mu \nu }\). In essence, the metric codifies all structural aspects of space-time, out of which the Einstein tensor draws its predictive power. See Misner et al. (1973), Peacock (1999), Peebles (1993).
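This chain of definitions, from the metric down to the Christoffel symbols, is mechanical enough to delegate to a computer algebra system. A minimal Python sketch using sympy; purely for brevity, the unit 2-sphere stands in for full space-time:

```python
# Compute Christoffel symbols from a metric via
#   Gamma^s_(m,l) = 1/2 g^(s,r) (d_m g_(r,l) + d_l g_(r,m) - d_r g_(m,l)).
import sympy as sp

theta, phi = sp.symbols('theta phi')
coords = [theta, phi]

# Metric of the unit 2-sphere: ds^2 = d(theta)^2 + sin(theta)^2 d(phi)^2.
g = sp.Matrix([[1, 0], [0, sp.sin(theta)**2]])
g_inv = g.inv()
n = len(coords)

Gamma = [[[sp.simplify(sum(
    g_inv[s, r] * (sp.diff(g[r, l], coords[m])
                   + sp.diff(g[r, m], coords[l])
                   - sp.diff(g[m, l], coords[r]))
    for r in range(n)) / 2)
    for l in range(n)] for m in range(n)] for s in range(n)]

for s in range(n):
    for m in range(n):
        for l in range(n):
            if Gamma[s][m][l] != 0:
                print(f"Gamma^{coords[s]}_({coords[m]},{coords[l]}) =",
                      Gamma[s][m][l])
# -> Gamma^theta_(phi,phi) = -sin(theta)*cos(theta)
#    Gamma^phi_(theta,phi) = Gamma^phi_(phi,theta) = cos(theta)/sin(theta)
```

From the Christoffel symbols, the Riemann and Ricci tensors and the curvature scalar follow by the same pattern of derivatives and contractions given in (10.8)–(10.10).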

However, what kind of cosmology can be derived from (10.6)? In 1917, Einstein idealized the universe as a 3-sphere uniformly filled with matter. The result of this calculation was that the radius of such a 3-sphere increases with time (Nussbaumer 2014). This was a momentous discovery, as the equations predicted the expansion of the universe. There, in the neat formal language of general relativity, the revelation of an origin to our universe was found. This shocked Einstein, as the prevailing philosophy in the Western world at the time was that “the heavens endure from everlasting to everlasting” (Misner et al. 1973, p. 409). The idea of a dynamical universe, spawning from a Big Bang, was preposterous.

In hindsight, it is a tragic footnote of history that, had Einstein been truly open-minded and radically trusted his theory, the prediction of the expansion of the universe would have ranked as one of the most amazing scientific discoveries. In another unfortunate turn of events, the Catholic priest and astronomer Georges Lemaître, analyzing Einstein’s equations in the context of recent observations in cosmology, postulated the expansion of the universe. He published this finding in a little-known Belgian scientific journal (Lemaître 1927). The discovery went unnoticed. In 1929, Edwin Hubble empirically observed that the light originating from remote galaxies was redshifted (Hubble 1929). In other words, the more distant the galaxies were, the more redshifted the light reaching us from them was. A straightforward interpretation was that all the galaxies are actually receding from Earth. Indeed, the observed redshift was precisely what Lemaître had predicted. So it was true, our universe had a beginning and was expanding at every point.

However, back in 1917 Einstein proposed an extension of general relativity which would remedy the problem of an expanding universe (Einstein 1917). From this modified version, a static and unchanging universe could emerge. Einstein introduced a scalar quantity \(\varLambda \), called the cosmological constant, into his field equationFootnote 10

$$\begin{aligned} G^{\mu \nu } + \varLambda g^{\mu \nu }= - \frac{8 \pi G}{c^4} T^{\mu \nu }. \end{aligned}$$
(10.11)

This simple tweaking of the formalism had deep consequences. For one, the left-hand side of the field equations is no longer zero in flat space-time, implying a curvature of empty space. When the expansion of the universe was experimentally established, Einstein repudiated the cosmological constant and called it “the biggest blunder of my life” (quoted in Freedman 2004, p. 10). However, the cosmological constant, like a genie let out of a bottle, refused to disappear.

Today, a modern interpretation of (10.11) is

$$\begin{aligned} G^{\mu \nu } = - \frac{8 \pi G}{c^4} \left( T^{\mu \nu } + \frac{c^4 \varLambda }{8 \pi G} g^{\mu \nu } \right) = - \frac{8 \pi G}{c^4} \left( T^{\mu \nu } + T_{\text {vac}}^{\mu \nu } \right) . \end{aligned}$$
(10.12)

By moving the cosmological constant to the right-hand side of the field equation, it can be reinterpreted as the energy-momentum tensor of the vacuum

$$\begin{aligned} T^{\mu \nu }_{\text {vac}} = \frac{c^4 \varLambda }{8 \pi G} g^{\mu \nu } = c^2 \rho _\varLambda g^{\mu \nu }. \end{aligned}$$
(10.13)

Seemingly out of nowhere, an energy density of the vacuum emerges, driven by the cosmological constant

$$\begin{aligned} \rho _\varLambda = \frac{c^2 \varLambda }{8 \pi G}. \end{aligned}$$
(10.14)

Associated with this energy density is a peculiar negative-pressure equation of state

$$\begin{aligned} p_\varLambda = - c^2 \rho _\varLambda . \end{aligned}$$
(10.15)

This implies that in the expanding universe this negative pressure performs work.Footnote 11 As a counterintuitive result, the energy density of the vacuum does not decrease as the universe expands, but remains constant. See Misner et al. (1973), Peacock (1999), Rugh and Zinkernagel (2002).
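A one-line consistency check, using the first law of thermodynamics: a comoving volume V holds the vacuum energy \(E = c^2 \rho _\varLambda V\), so the work done by the negative pressure as the volume grows,

$$\begin{aligned} dE = -p_\varLambda \, dV = c^2 \rho _\varLambda \, dV, \end{aligned}$$

is precisely the energy required to fill the newly created space at the unchanged density \(\rho _\varLambda \).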

This whole exercise may appear rather ad hoc and unpersuasive. However, a positive cosmological constant, tied to a non-zero vacuum energy, accelerates the expansion of the universe (Carroll 2001). The older the universe is, the faster its fabric is exploding. In 1998, this aspect of our universe was discovered (Perlmutter et al. 1998), leading to a Nobel prize being awarded in 2011. In a strange turn of events, a theory was modified to account for a belief, and this modification then unexpectedly led to one of the most profound predictions in cosmology. With the discovery of the accelerated expansion of the universe, an eighty-one-year-old chapter closes. Unfortunately, it is followed by a new chapter fraught with more puzzles (Sect. 10.3.1).

To end this section, it remains to be said that the vacuum energy can be calculated from (10.14) by employing the estimated value of the cosmological constant. Recently, the Planck Collaboration, a big-science undertaking, presented the latest estimates for the cosmological parameters (Planck Collaboration et al. 2016). They measured the Hubble “constant” to be

$$\begin{aligned} H_0 \approx 67.74 \;[\text {km} \, \text {s}^{-1} \, \text {Mpc}^{-1}]. \end{aligned}$$
(10.16)

The ratio between the vacuum energy and the critical densityFootnote 12 is found to be

$$\begin{aligned} \varOmega _\varLambda = \frac{\rho _\varLambda }{\rho _{\text {crit}}} \approx 0.6911. \end{aligned}$$
(10.17)

From these two values the cosmological constant can be computed as

$$\begin{aligned} \varLambda = \frac{3}{c^2} H_0^2 \varOmega _\varLambda \approx 1.11\times 10^{-52} \; [\text {m}^{-2}]. \end{aligned}$$
(10.18)

Putting this value into (10.14) uncovers the energy density of the vacuum

$$\begin{aligned} \rho _\varLambda \approx 5.95 \times 10^{-27} \; [\text {kg} /\text {m}^{3}] \approx 5.35 \times 10^{-9} \; [\text {erg} /\text {cm}^{3}]. \end{aligned}$$
(10.19)

The last approximation is retrieved by noting that, via \(E = mc^2\), a mass of \(1\; [\text {kg}]\) corresponds to approximately \(8.99 \times 10^{23} \;[\text {erg}]\). Comparing \(\bar{\rho }_{\text {qft}}\) from (10.3) with \(\rho _\varLambda \) reveals the true extent of the incompatibility—or better, the complete failure of quantum field theory to yield a sensible answer.

However, in defense of quantum field theory, making this misguided calculation appear even more puzzling, comes the Casimir effect (Casimir and Polder 1948). It was postulated that there should exist a bulk effect of the virtual particles on the vacuum. Specifically, the idea was that it should be possible to reduce the vacuum energy between two conducting plates brought very close together, resulting in a pressure difference which would exert a force. This quantum field theory effect could be measured, albeit decades later (Lamoreaux 1997). The status of zero-point energy in quantum field theory is thus highly ambiguous. Indeed (Peacock 1999, p. 184):

So, far from resolving the conceptual problems about vacuum energy, the Casimir effect merely muddies the waters. [...] In this respect, it illustrates well the general philosophy of quantum field theory, which has been to sweep the big conceptual difficulties under the carpet and get on with calculating things.

Recall the rallying cry “Shut up and calculate!” from Sect. 2.2.1.
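In that calculational spirit, the numbers of this section can be reproduced in a few lines. A minimal Python sketch of (10.16)–(10.19), assuming the rounded Planck-mission values quoted above, and ending with the notorious mismatch against (10.3):

```python
# Observed vacuum energy density from the Hubble constant and Omega_Lambda,
# following (10.16)-(10.19), then compared with the QFT estimate (10.3).
import math

c = 2.99792458e8            # speed of light [m/s]
G = 6.67430e-11             # gravitational constant [m^3 kg^-1 s^-2]
Mpc = 3.0857e22             # megaparsec [m]

H0 = 67.74 * 1000 / Mpc     # Hubble constant [1/s], from (10.16)
Omega_L = 0.6911            # vacuum-energy fraction, from (10.17)

Lam = 3 * H0**2 * Omega_L / c**2             # (10.18) [1/m^2]
rho_L = c**2 * Lam / (8 * math.pi * G)       # (10.14) [kg/m^3]
rho_L_erg = rho_L * c**2 * 10                # J/m^3 -> erg/cm^3 (factor 10)

print(f"Lambda ~ {Lam:.2e} 1/m^2")           # ~1.11e-52
print(f"rho_L  ~ {rho_L:.2e} kg/m^3 ~ {rho_L_erg:.2e} erg/cm^3")

rho_qft_erg = 1e114                          # Planck-cutoff estimate (10.3)
gap = math.log10(rho_qft_erg / rho_L_erg)
print(f"discrepancy: ~10^{gap:.0f}")         # roughly 122 orders of magnitude
```

The infamous “120 orders of magnitude” emerges in the last line; the exact figure depends on the chosen cutoff, not that this softens the blow.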

For further reading on the strange physics of nothingness, the vacuum, and voids, see, for instance, Genz (1999), Barrow (2000), Close (2009), Weatherall (2016).

2 Quantum Gravity: The Cutting-Edge of Theoretical Physics

At first, the irreconcilable tension between the force of gravity and the remaining three quantum forces was subtle. As so often in the history of physics, nature challenged the human mind with puzzles and paradoxes, only to ignite ingenuity and spark creativity. This time, however, the mind did not succeed in overcoming the obstacles. Nature was persistent and refused to reveal this most fundamental enigma. We appear to be stuck with two categorically incompatible theories of reality, describing the vast cosmos (general relativity, introduced above) and the very small (the quantum field theories discussed above, unified in the standard model, Sect. 4.4). Each theory represents an immensely powerful predictive mechanism, but both miss a fundamental ingredient. At their point of contact, they fail spectacularly, plunging theoretical physics into oblivion. In an effort to figure out what is going on, physicists have resorted to radical measures and have invoked extraordinary and exotic ontologies for reality. In summary (Callender and Huggett 2001, back cover):

The greatest challenge in fundamental physics is how quantum mechanics and general relativity can be reconciled in a theory of “quantum gravity”. The project suggests a profound revision of our notions of space, time and matter, and so has become a key topic of debate and collaboration between physicists and philosophers.

2.1 Simple Quantum Gravity

General relativity and quantum field theory tell two very different stories when it comes to gravity. In the abstract formalism Einstein revealed, gravity does not exist as a force anymore. It is simply an effect of the warping and twisting of the space-time continuum due to matter. In quantum field theory, the forces are mediated via virtual quantum particles (Fig. 4.1). An example is seen in the Feynman diagram in Fig. 10.1. As a consequence, if we want to quantize gravity, then there should exist a corresponding force-carrying gauge boson called the graviton. The heart of the conceptual problem is the following (Giulini et al. 2003, p. v):

On one side, quantum theory, in its usual formulation and orthodox interpretation, requires an ambient non-dynamical spacetime. On the other side, gravity, as described by general relativity, requires a dynamical geometry of spacetime which is coupled to all material processes within. This implies that at least one of these theories cannot be fundamentally correct.

How can a physical theory be spectacularly accurate in its predictions and, at the same time, be fundamentally incorrect?

Even more troubling, the role time plays in both theories is also incompatible. In quantum mechanics, time is an absolute external element, whereas in general relativity time is an elementary part of the dynamic space-time continuum. In technical words, quantum mechanics is background-dependent while general relativity is background-independent. The first attempt at a theory of quantum gravity resulted in the Wheeler-DeWitt equation (DeWitt 1967; Wheeler 1968). In essence, it is a wave function of space. Unfortunately, the equation was riddled with problems. Foremost, time appears to be lost. In detail, this quantum gravity equation is independent of the time parameter, as sketched below. But how then can the evolution of something happening in time be calculated? Indeed, time represents a deep problem lurking at the foundations of reality (see below).
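Schematically, and suppressing all technical structure, the Wheeler-DeWitt equation is a pure constraint,

$$\begin{aligned} \hat{\mathcal {H}} \, \varPsi [g] = 0, \end{aligned}$$

where \(\hat{\mathcal {H}}\) is the Hamiltonian constraint operator and \(\varPsi [g]\) a wave functional over spatial geometries. Nothing in it plays the role of the t of Schrödinger’s equation; time has simply vanished from the formalism.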

Tinkering with the equations of quantum gravity, many angles of attack have been proposed. For instance

  • Alain Connes’ noncommutative geometry (Connes 1994).

  • Roger Penrose’s twistor theory (Penrose and MacCallum 1973).

  • Topological quantum field theory (Smolin 1995b).

However, two main approaches stand out. One begins with quantum field theory and adds gravity.Footnote 13 The other starts with general relativity and then adds quantum properties. The former attempt has received a tremendous amount of publicity under the name of string theory. Indeed, in the theoretical physics community it was touted as the “only game in town.” The latter approach to quantum gravity is known as loop quantum gravity. Today, these two theories are the most promising hopes of merging general relativity with quantum mechanics (Smolin 2001). For a general overview of the history of quantum gravity, see Rovelli (1998, 2002).

2.2 String/M-Theory

The colorful, surprising, and sometimes haphazard history of string theory, ultimately culminating in M-theory, was described in Sect. 4.3.2. The accidental discovery of superstrings resulted in one of the most creative outbursts in theoretical physics. To illustrate, between 1999 and 2008, roughly 800–900 scientific papers were published on the subject each year, totaling over 8,000 contributions (Bradlyn 2009). However, string theory’s popularity can also be attributed to fashion rather than to inevitable necessity. This is in the spirit of the philosophers of science Thomas Kuhn (Sect. 9.1.3) and Paul Feyerabend (Sect. 9.1.6), who identified an element of irrationality in the evolution of science. The science writer Gary Taubes recalls an encounter with the theoretical physicist Alvaro de Rujula (quoted in Woit 2006, p. 222):

On August 4, 1985, I sat in the cantina at CERN drinking beer with Alvaro de Rujula. [...] De Rujula predicted that 90% of the theorists would work on superstrings [...] because it was fashionable.

As a result there was also a perceived lack of options for theorists. In the words of the Nobel laureate David Gross, one of the founders of string theory, in 1987 (quoted in Woit 2006, p. 221):

So I think the real reason why people have gotten attracted by it [string theory] is because there is no other game in town.

In the words of Joseph Polchinski, another string theory pioneer (quoted in Penrose 2004, p. 892):

[A]ll good ideas are part of string theory.

In the wake of this pursuit of quantum gravity, a lot of abstract mathematical machinery was conceived; see Hatfield (1992), Duff (1999), Kaku (2000), Polchinski (2005a, b), Green et al. (2012a, b), Rickles (2014). Indeed, string/M-theory is responsible for producing entirely new and esoteric branches of mathematics (Sect. 2.1.4). However, the mathematical machinery is constrained by some very specific requirements for it to be consistent. If these formal constraints are translated into reality, the universe we inhabit possesses some very remarkable properties. In other words, string/M-theory invokes a radically new ontology. Crucially, the formalism relies on the existence of

  • supersymmetry;

  • higher-dimensional space-time.

Supersymmetry is an elegant novel symmetry relating the matter particles (fermions) to the force-carrying particles (bosons). It is a powerful tool, unlocking many abstract abilities (Sect. 4.3.2). However, it comes with a hefty price, as it requires the number of existing particles to be doubled—each matter fermion and gauge boson must have its supersymmetric partner. In effect, supersymmetry conjures up a mirror world to the particles listed in Fig. 4.1. Higher-dimensional physics has a pre-string-theory origin (Sect. 4.3.1). In the context of M-theory, space is a colossal ten-dimensional structure, weaving an eleven-dimensional space-time fabric we supposedly inhabit. The extra dimensions we cannot observe are rendered invisible as they “wrap” upon themselves. In technical parlance, the additional spatial dimensions are compactified on special geometries called Calabi-Yau manifolds (Sect. 4.3.2). Alas, the LHC still refuses to produce any shred of experimental evidence for this new kind of physics.

But what about the predictive power of this abstract formalism? What novel physics is associated with this impressive mathematical behemoth? Returning to the notion of the vacuum, string/M-theory has much to say—too much. In a nutshell, the process of retrieving our four-dimensional universe from the eleven-dimensional M-theory template via compactification allows for a lot of freedom. Our universe, specifically the vacuum of our universe, is just one possible state in a vast landscape of possible vacua (Susskind 2007). Indeed, estimates suggest that there exist an inconceivable \(10^{500}\) such vacua (Douglas 2003; Tetteh-Lartey 2007). In comparison, there are an estimated \(10^{80}\) atoms in the entire universe. So one wonders (Woit 2006, p. 239):

The possible existence of, say, \(10^{500}\) consistent different vacuum states for superstring theory probably destroys the hope of using the theory to predict anything.

However, string theorists are not easily discouraged (Woit 2006, p. 239):

In recent years, [Leonard] Susskind, one of the codiscoverers of string theory, has begun to argue that this ability of the theory to be consistent with just about anything should actually be thought of as a virtue.

See Susskind (2006) for Susskind’s thoughts on this.

The theoretical physicist Woit, a staunch critic of string theory,Footnote 14 as can be guessed from the excerpts quoted above from his book on string theory called Not Even WrongFootnote 15 (Woit 2006), continues his negative assessment (Brockman 2015, p. 70f.):

For anyone currently thinking about fundamental physics, this latest Edge questionFootnote 16 is easy, with an obvious answer: string theory. The idea of unifying physics by positing strings moving in ten space-time dimensions as fundamental entities was born in 1974, and became the dominant paradigm for unification from 1984 on. After 40 years of research and literally tens of thousands of papers, what we’ve learned is that this is an empty idea. It predicts nothing about anything, since one can get pretty much any physics one wants by appropriately choosing how to make six of the ten dimensions invisible.

According to string theorists, we live in an obscure corner of a multiverse where anything goes, and this “anything goes” fits right in with string theory, so fundamental physics has reached its end-point.

The observation at the LHC of the Higgs, but no superpartners, has caused great consternation among theorists. Something has happened that should not have been possible according to the forty-year-old reasoning now well-embedded in textbooks.

Others chimed in as well, like the mathematical physicist Frank Tipler (Brockman 2015, p. 68):

As it was in the beginning of modern science, so it should be now. We should keep the fundamental requirement that experimental confirmation is the hallmark of true science. Since string theorists have failed to propose any way to confirm string theory experimentally, string theory should be retired, today, now.

Indeed, the attempts to justify string/M-theory based on non-empirical arguments, for instance, Dawid (2013), have been met with grave concerns (Ellis and Silk 2014; Rovelli 2016). The cosmologist Sean Carroll continues the skeptical assessment (quoted in Cole 2016):

Answering deep questions about quantum gravity has not really happened. They have all these hammers and they go looking for nails. That’s fine. But it isn’t fine if you forget that, ultimately, your goal is describing the real world.

Finally, the string pioneer Gross again (quoted in Cole 2016):

There was a hope. A moment. We even thought for a while in the mid-’80s that it [string theory] was a unique theory. After a certain point in the early ’90s, people gave up on trying to connect to the real world. The last 20 years have really been a great extension of theoretical tools, but [with] very little progress on understanding what’s actually out there.

Today, string theory has taken on a life of its own. In the words of the mathematical physicist Robbert Dijkgraaf, “things have gotten almost postmodern” (quoted in Cole 2016). Although it has not emerged as the promised theory of quantum gravity, string theory remains a useful formal tool in theoretical physics and mathematics.

There has been a lot of bitterness and rancor between the supporters and skeptics of string/M-theory. Counterbalancing the flood of publications is a growing body of literature questioning not only the validity of string/M-theory—and its inability to produce any foreseeable prediction—but also modern theoretical physics as a whole: Woit (2006), Smolin (2007), Baggott (2013), Unzicker and Jones (2013), Hossenfelder (2018). We are again reminded of the end of science (Sect. 9.2.2). Naturally, such criticism was met with fierce opposition. Woit, describing the reaction of two string theory graduates to some of his criticism, reports (Woit 2006, p. 223):

[They] were of the opinion that I was an incompetent idiot threatening to hold back the progress of science.

Perhaps the most vocal, unapologetic, and aggressive defender of string theory is Luboš Motl. Unknown and isolated, he was a young undergraduate physics student in the Czech Republic. In 1996, Motl uploaded a string theory paper to an online scientific archive for preprints, called the arXiv (Motl 1996). While submissions to the archive are not considered to be scientific publications, as they are not peer reviewed, the arXiv enjoys huge popularity. Motl’s submission impressed established string theorists and he ended up with a scholarship to Rutgers, where he graduated. The next step in this amazing career was an assistant professorship at Harvard University, starting in 2004. See Glanz (2001). In 2007, his stellar rise came to a premature end. He left Harvard, returned to the Czech Republic, and has not published a single piece of research since. He has, however, become a prolific blogger.Footnote 17 Motl’s blog, which he calls the “supersymmetric world from a conservative viewpoint,” is a platform for his political activism, climate change skepticism, and criticism of anything he perceives as anti-string theory. The following is an account by the theoretical physicist and quantum gravity researcher Sabine Hossenfelder, author of Hossenfelder (2018), relating her interactions with Motl in 2007. She writes on her blog:Footnote 18

Luboš has repeatedly insulted me, my husband and my friends. He has misquoted me, and used alleged quotations of mine to insult others. He has an incredible amount of times accused me of having said things I never said, only to then explain, based on this, that I am “stupid”, “silly”, and “a crackpot” with “crackpot friends”. He is in no way interested in understanding my opinion, or my point of view. He has proclaimed I should not have a Ph.D., that my “female brain” only “parrots nonsense” and all my papers are “bullshit”—the latter evidently without having read them. He has treated others the same way previously, and will probably proceed doing so.

As to present date he has made a habit out of producing distorted echos of my posts or comments at other people’s blogs. He never acknowledges discussions we have had earlier, which he usually ends with retreating to insults when he runs out of arguments. Luboš Motl either is indeed as unable to understand other people’s opinions as he pretends, or he chooses to do so deliberately.

Such animosity is not an isolated case. An example of a Motlesque online attack is the followingFootnote 19:

I must tell you that before 2006, everyone would agree that [Lee] SmolinFootnote 20 was a crank and Woit was an irrelevant grumpy guy outside whose importance for physics was exactly zero.

[...]

Lee Smolin, a far-left radical and a former (and current?) hippie, has also brought an extremely thick layer of politically correct victimism to the field.

Perhaps such antics were responsible for the unfortunate and abrupt end of Motl’s budding science career. On a side note, he defended the Bogdanov brothers in what is known as the Bogdanov affair (Sect. 9.1.4).

However, perhaps the most fruitful criticism of string/M-theory comes from the proponents of loop quantum gravity. After all, they are claiming to solve the conundrum of quantum gravity with very different tools.

2.3 Loop Quantum Gravity

In the history of quantum gravity, the formal approach known as loop quantum gravity played a subordinate role. Naturally so, as there was thought to be only “one game in town.” At its roots, loop quantum gravity extends the classical theory of general relativity. One crucial ingredient was supplied by the mathematical physicist and cosmologist Penrose in the 1970s, called spin networksFootnote 21 (Penrose 1971). These networks represent quantum states of particles and their interactions. More technically, a spin network is a graph carrying labels, related to representations of symmetry groups (Sect. 3.1.4), on its links and nodes. Twenty-four years later, this idea surprisingly re-emerged, as spin networks were found to represent the states of loop quantum gravity (Rovelli and Smolin 1995b; Baez 1995). During those years, a key insight came from Ashtekar (1987), building on Sen (1982). In essence, the foundation of this new theory of quantum gravity was laid, based on the notion of quantum geometry, i.e., quantum space-time (Rovelli and Smolin 1995a; Loll 1995).

This can be seen as the first fundamental proposition for a new ontology of reality. Space itself is now finite, composed of discrete, quantized “atoms.” In effect, there exists a lower limit to the resolution of the universe as there are no arbitrarily small chunks of space. Similarly to the way quantum theory constrained reality to be comprised of finite quanta of energy, loop quantum gravity posits the discrete nature of space itself. Mathematically, the area (volume) of a given physically defined surface (spatial region) is expressed as an operator which has a discrete spectrum of eigenvalues. However, in such a world, the origin of this finite structure of reality becomes a question. More generally, why aren’t space and energy states continuous and why is the speed of light finite? Recall the tension between the discrete (Sect. 5.3.2) and the continuous (Sect. 5.3.1) discussed in Chap. 5—in essence, the discrepancy between the finite and infinite in the formal thought systems of the mind. Indeed, in the categories of human knowledge generation, seen in Fig. 5.9, the spin networks of loop quantum gravity can be attributed to the fundamental-algorithmic demarcation, in contrast to the fundamental-analytical classification of the rest of the edifice of physics (Sect. 5.4.1). Perhaps this venture into the domain of formal discreteness has the power to unveil some desperately needed new insights.
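To make the discreteness tangible, the standard result for the spectrum of the area operator can be written, schematically, as

$$\begin{aligned} A = 8 \pi \gamma \, \ell _P^2 \sum _i \sqrt{j_i (j_i + 1)}, \end{aligned}$$

where \(\ell _P = \sqrt{\hbar G / c^3}\) is the Planck length, \(\gamma \) the so-called Immirzi parameter, and the \(j_i\) are half-integer spins labeling the links of the spin network puncturing the surface. Area thus comes in discrete steps of the order of \(\ell _P^2 \approx 10^{-70} \; \text {m}^2\); no continuum of values is allowed.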

The development of loop quantum gravity continued and many of the challenges were met (Thiemann 1996). The evolution of a spin network is described by what is called a spin foam and yields the dynamics of the theory (Reisenberger and Rovelli 1997; Barrett and Crane 1998). The Bekenstein-Hawking black hole entropy (Bekenstein 1973; Hawking 1974) was computed within loop quantum gravity (Smolin 1995a; Rovelli 1996) as well as within string theory (Strominger and Vafa 1996), at almost the same time. This is discussed in the context of the holographic principle in Sect. 13.4.1. In a nutshell, loop quantum gravity is a proposed theory of quantum gravity—carrying much less conceptual baggage compared to string theory—characterized as being non-perturbative, background-independent, and diffeomorphism-invariant. The last property is related to the principle of covariance in general relativity (Sect. 4.1). A well-defined version of the Wheeler-DeWitt equations was successfully found with loop quantum gravity (Jacobson and Smolin 1988).

In the simplest of terms, the reality in string theory—albeit being supersymmetric and higher-dimensional—is made of tiny vibrating strings, explaining all observable phenomena. In contrast, loop quantum gravity is concerned with the quantum properties of space-time itself, its structure being a fine fabric woven out of finite loops. Both approaches had long been thought to be incompatible with each other. Now some theorists are expressing doubts and are suggesting similarities (Gambini and Pullin 2014; Cartwright 2017). Indeed, loop quantum gravity has been expressed in higher dimensions incorporating supersymmetry (Bodendorfer et al. 2013). However, at the end of the day, any theory of quantum gravity needs to be empirically validated. Until then, we are left with the words of the mathematician Eric R. Weinstein (Brockman 2015, p. 60):

[I]t is hard to find a better candidate for an intellectual bubble than that which has formed around the quest for a consistent Theory of Everything physical, reinterpreted as if it were synonymous with “quantum gravity.” If nature were trying to send a polite message that there is other preliminary work to be done first before we quantize gravity, it is hard to see how she could send a clearer message than dashing the Nobel dreams for two successive generations of Bohr’s brilliant descendants.

For further reading—technical and non-technical—on loop quantum gravity, see, for instance, Smolin (2001), Baez (2000), Thiemann (2006, 2007), Rovelli (2008), Chiou (2015), Rovelli (2017). Finally, there is an insightful book by Smolin, arguing for an evolutionary angle of attack on cosmology and existence, called The Life of the Cosmos (Smolin 1997).

3 The Large and the Small

In the last sections, much of the focus of the discussion has been placed on the nature and structure of physical theories, from quantum field theory to quantum gravity. However, the question remains: What is the true nature of reality? What do we know about reality’s ontology? One way to address this issue is to analyze how the universe structures itself at very small and very large scales.

3.1 Cosmological Conundrums

Building on the field equations of general relativity (10.6), a lot of effort has been made to find exact solutions. These solutions tell us about the organizing principles of the cosmos. The Friedmann–Lemaître–Robertson–Walker metric is such an exact solution, describing a homogeneous, isotropic, and expanding (or contracting) universe (Friedman 1922; Lemaître 1927; Robertson 1935; Walker 1937). The result of inserting this specific metric into Einstein’s equations is a set of differential equations, called Friedmann’s equations. These equations reveal the astonishing fact that there exists a direct connection between the matter density of the universe and its global geometry. In detail, this is expressed by the critical density

$$\begin{aligned} \rho _{\text {crit}} = \frac{3 H^2}{8 \pi G}, \end{aligned}$$
(10.20)

where H is the Hubble parameter and G Newton’s gravitational constant. A universe with a matter density above this value will be spatially closed, while a lower-density universe will be spatially open. In a two-dimensional toy universe, a sphere is an example of a closed geometry, while a saddle-shaped surface represents an open one. At the critical density, this two-dimensional model universe would be a flat sheet. As a consequence, \(\rho _{\text {crit}}\) is the parameter which determines the ultimate fate of the universe. A larger matter density \(\rho _{\text {m}}\) will eventually lead to a collapsing universe, whereas a smaller value will result in a forever-expanding universe. To capture this behavior, the variable \(\varOmega _{\text {m}}\) is introduced, as the ratio between the matter density and the critical density

$$\begin{aligned} \varOmega _{\text {m}} = \frac{\rho _{\text {m}}}{\rho _{\text {crit}}} = \frac{8 \pi G \rho _{\text {m}}}{3 H^2}. \end{aligned}$$
(10.21)

Now \(\varOmega _{\text {m}} = 1\) represents a universe poised exactly at the critical density. However, from Sect. 10.1.2 we know that empty space also has an energy density. As a consequence, the total density of the universe is determined by two contributions, related to (10.21) and (10.17):

$$\begin{aligned} \rho _{\text {total}}=\rho _{\text {m}}+\rho _\varLambda . \end{aligned}$$
(10.22)

Recent measurements from Planck Collaboration et al. (2016) have established that

$$\begin{aligned} \varOmega _{\text {m}} = \frac{\rho _{\text {m}}}{\rho _{\text {c}}} = 0.3089 \pm 0.0062, \qquad \varOmega _\varLambda = \frac{\rho _\varLambda }{\rho _{\text {c}}} = 0.6911 \pm 0.0062. \end{aligned}$$
(10.23)

In other words, the content of our universe consists of \(30.89\%\) matter and \(69.11\%\) vacuum energy.Footnote 22
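
To get a feel for the scales involved, the critical density (10.20) can be evaluated directly. Below is a minimal Python sketch (not part of the original text; the constants are rounded, and the Hubble value of roughly 67.7 km/s/Mpc is assumed to match the Planck-era measurements cited above), showing that the critical density amounts to merely a handful of proton masses per cubic meter:

```python
import math

# Rounded physical constants (SI units), for illustration only
G = 6.674e-11      # Newton's gravitational constant [m^3 kg^-1 s^-2]
m_p = 1.673e-27    # proton mass [kg], used only for intuition

# Hubble parameter: roughly 67.7 km/s/Mpc (assumed Planck-era value)
Mpc = 3.086e22     # one megaparsec in meters
H = 67.7e3 / Mpc   # [1/s]

# Critical density, Eq. (10.20): rho_c = 3 H^2 / (8 pi G)
rho_c = 3 * H**2 / (8 * math.pi * G)

print(f"rho_c = {rho_c:.2e} kg/m^3")                       # ~8.6e-27 kg/m^3
print(f"      = {rho_c / m_p:.1f} proton masses per m^3")  # ~5
```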

Regrettably, no one knows the true origin and nature of this vacuum energy density. It is labeled dark energy and there exist many competing explanations for it. The simplest comes from introducing the cosmological constant (Sect. ). However, this runs straight into the fundamental struggle to construct a theory of quantum gravity (Sect. 10.1). Another proposed solution is called quintessence, where a time-varying Higgs-like fieldFootnote 23 is responsible for the emergence of dark energy (Caldwell et al. 1998). Others have argued that dark energy does not actually exist and that it is simply a measurement artifact (Mattsson 2010). Finally, recent observations suggesting that the universe’s accelerated expansion is even faster than assumed do not help (Castelvecchi 2016; Amit 2017). Once again, we are reminded of the boundaries of our knowledge. An anomaly persists, which no one knows how to address. Yet again, we are left in the dark when it comes to the ontology of reality. And the situation gets worse.

Perhaps one of the most pressing and fundamental challenges in cosmology is the following. There exist two possible methods by which we can observe the structure of the universe. One is related to measurements of the electromagnetic radiation reaching Earth; the other is a consequence of the effects of gravity itself. The problem is that (Peacock 1999, p. 353):

In an ideal world, these two routes [...] would coincide; in practice, the gravitational route is able to detect more mass by a factor of up to ten than can be detected in any other way.

This is a spectacular setback: our observations of the cosmos are incomplete or at odds with each other. This is where dark matter comes in (Zwicky 1933; Rubin et al. 1980). It is a theorized form of matter that is believed to account for this discrepancy. However, a crucial problem is that no one knows what this type of matter is made of. It cannot be ordinary matter, i.e., it must be non-baryonic matter.Footnote 24 Recent measurements have established that of the \(30.89\%\) matter content of the universe, only about \(4.86\%\) is ordinary matter, while \(26.03\%\) is due to dark matter (Planck Collaboration et al. 2016).Footnote 25 In other words, of all the matter in the universe, approximately \(84.26\%\) is unaccounted for.

In summary, the matter-energy content of the whole universe comprises ordinary (baryonic) matter \(\varOmega _{\text {b}}\), dark matter \(\varOmega _{\text {dm}}\), and dark energy \(\varOmega _{\text {de}}\)

$$\begin{aligned} \varOmega _{\text {b}} + \varOmega _{\text {dm}} + \varOmega _{\text {de}}&= 1, \end{aligned}$$
(10.24)
$$\begin{aligned} 0.0486 + 0.2603 + 0.6911&\approx 1. \end{aligned}$$
(10.25)

To conclude, a staggering \(95.14\%\) of all that exists in the universe is unknown to us. We can only detect indirect traces of it. From a philosophical perspective, this represents a cataclysmic turn of events. Everything the human mind has ever directly perceived is only a tiny slice of reality.
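
The percentages quoted in the last paragraphs follow from simple arithmetic on the Planck central values. A quick sanity check (a sketch; the last line anticipates the density ratio behind the coincidence problem discussed below):

```python
# Planck 2016 central values as quoted in the text
omega_b  = 0.0486   # ordinary (baryonic) matter
omega_dm = 0.2603   # dark matter
omega_de = 0.6911   # dark energy

print(omega_b + omega_dm)                # 0.3089: total matter, cf. Eq. (10.23)
print(omega_b + omega_dm + omega_de)     # 1.0: the full budget, Eq. (10.25)
print(omega_dm / (omega_b + omega_dm))   # ~0.8426: fraction of all matter that is dark
print(omega_dm + omega_de)               # 0.9514: fraction of everything that is dark
print(omega_de / omega_dm)               # ~2.66: of order one
```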

Despite this profound ignorance, and upping the ante, there are hints which speak of a privileged status of life on Earth. In the introduction to this chapter, the “axis of evil” was mentioned. This is an anomaly in the cosmic microwave background radiation which appears to give special significance to the location of Earth within the entire universe (Cho 2007). Naturally, most researchers understand this to be a statistical fluke. However, joining this spatial fluke is a temporal one. It is called the coincidence problem (Velten et al. 2014):

The observational fact that the present values of the densities of dark energy and dark matter are of the same order of magnitude, \(\rho _{\text {de}}/\rho _{\text {dm}} \sim \mathcal {O}(1)\), seems to indicate that we are currently living in a very special period of the cosmic history. Within the standard model, a density ratio of the order of one just at the present epoch can be seen as coincidental since it requires very special initial conditions in the early Universe. The corresponding “why now” question constitutes the cosmological “coincidence problem.”

Given such bizarre coincidences, it is very tempting to console oneself with the Anthropic Principle (Sect. 15.2). The universe happens to be perfectly fine-tuned in such a way that it not only allows for conscious life to emerge, but necessitates all the coincidences in the cosmic evolution that we can identify. After all, if this were not the case, no one would be wondering about them in the first place.

3.2 The Weird Quantum Realm of Reality

If the universe appears incomprehensible at large scales, then at small scales it truly transcends any meaning—all our human conceptuality threatens to fail. Our commonsense intuitions about reality, built on observing the world from a human perspective, are jeopardized. Even Feynman, despite his spectacular success in devising a mathematical formalism accurately describing quantum phenomena, confessed (Feynman 1967, p. 129):

I think I can safely say that nobody understands quantum mechanics.

He then goes on to say (Feynman 1967, p. 129):

I am going to tell you what nature behaves like. If you will simply admit that maybe she does behave like this, you will find her a delightful, entrancing thing. Do not keep saying to yourself, if you can possibly avoid it, “But how can it be like that?” because you will get “down the drain”, into a blind alley from which nobody has yet escaped. Nobody knows how it can be like that.

In 1901, Max Planck stumbled upon the quantum realm of reality by chance (Sect. 4.3.3). Indeed, his radical postulation of the existence of discrete quanta, giving birth to quantum physics, was an act of despair: “I was ready to sacrifice any of my previous convictions about physics” (quoted in Longair 2003, p. 339). Until that day in 1901, eminent physicists had begun to foresee the end of physics, as apparently everything about reality was understood (Sect. 9.2.2). This accidental discovery opened up Pandora’s box of philosophical conundrums. The philosopher Ernst von Glasersfeld, who coined the term radical constructivism,Footnote 26 observed (quoted in Schülein and Reitze 2002, p. 175, translation mine):

Modern physics has conquered domains that display an ontology that cannot be coherently captured or understood by human reasoning.

Even Niels Bohr, one of the founding fathers of quantum mechanics, admitted (quoted in Sundermeyer 2014, p. 168):

If quantum mechanics hasn’t profoundly shocked you, you haven’t understood it yet.

In a nutshell, quantum physics confronts us with epistemic and ontic enigmas:

  1. Reality, for the first time, revealed a discrete and finite structure.

  2. The foundations of reality are inherently probabilistic.

  3. The on-off dichotomy of binary logic is transcended.

  4. The act of measuring a quantum property affects the quantum property.

  5. There is a fundamental limit to the knowledge which nature is willing to reveal.

  6. At a fundamental level, the local realism of classical reality cannot be upheld.

However, once these weird properties are formalized and re-expressed mathematically, there is no stopping the success of quantum mechanics.Footnote 27 We can translate the above list into the language of physics and thus sidestep the philosophical interpretationsFootnote 28:

  1. Quanta—the smallest energy scale of particles (Feynman et al. 1965; Sakurai 1994; Messiah 2000).

  2. Probability amplitudes and Schrödinger’s wave equation (3.24).

  3. Wave-particle duality and the superposition of quantum states (Feynman et al. 1965; Sakurai 1994; Messiah 2000).

  4. The collapse of the wave function—if it collapses at all (Feynman et al. 1965; Sakurai 1994; Messiah 2000).

  5. Heisenberg’s uncertainty principle (Sect. ).

  6. Bell’s theorem and entanglement (the focus of this section).

Naturally, there exists a vast body of literature on quantum physics, including layman’s guides and a plethora of esoteric interpretations, grappling with these notions.

However, one of the most surprising properties of the quantum realm is perhaps the phenomenon of entanglement, related to the uncanny foundation of reality. In essence, quantum mechanics destroys the notion of local realism. This is the merger of two commonsensical and well-tried assumptions:

  1. Locality: No signal can travel faster than the speed of light, as postulated by special relativity (this is related to causality, as seen in Sect. 3.2.1), and objects are only directly influenced by their local surroundings.

  2. Realism: Nature exists independently of the human mind. Specifically, measurable properties of a physical system exist prior to their observation.

The rejection of local realism and its consequences opens a colorful chapter in the history of physics.

3.2.1 Entanglement: From Einstein to the Hippies

Einstein famously opposed quantum physics. He did not trust the probabilistic foundation of the theory. This is ironic, as he was instrumental in its creation (Sect. 4.3.4). Einstein, together with two junior colleagues, devised an ingenious thought experiment which would expose the inadequacy of quantum mechanics for all to see. The Einstein-Podolsky-Rosen (EPR) paradox was born. The argument purports to demonstrate that the formal tools utilized by the theory do not provide a complete description of physical reality. In detail, quantum mechanics appears to allow for the instantaneous transmission of information, potentially violating special relativity. Einstein disapprovingly called this “spooky action at a distance” (quoted in Kaiser 2011, p. 30). In effect, the team had inadvertently discovered the possibility of correlated quantum states, or entanglement (Einstein et al. 1935).

Twenty-nine years passed. Einstein, who died in 1955, spent the last two decades of his life obsessed with developing a unified field theory (Sect. 4.3.5). Then, in 1964, the physicist John Stewart Bell presented groundbreaking work on the EPR paradox. Bell’s theorem places a constraint on quantum mechanics (Bell 1964). By assuming local realism, Bell could derive and prove a set of inequalities. He then went on to demonstrate how specific cases thereof were violated by actual quantum mechanical predictions. In effect, Bell’s theorem proved that any physical theory which incorporates local realism cannot reproduce the observable predictions of quantum mechanics. Shockingly, entanglement appeared to be an actual property of the quantum realm. Unsurprisingly, the theorem emerged as the core of the controversy surrounding the interpretation of quantum mechanics (Kosso 1998; Maudlin 2011; Becker 2018).
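
The quantitative content of Bell’s result is easiest to see in the later CHSH formulation (the following is an illustrative sketch, not Bell’s original 1964 derivation): quantum mechanics predicts correlations \(E(a,b) = -\cos (a-b)\) for a spin singlet measured along directions a and b, and suitable angle choices push the CHSH combination to \(2\sqrt{2} \approx 2.83\), beyond the bound of 2 that any local-realistic theory must obey.

```python
import math

def E(a, b):
    """Quantum prediction for the correlation of a spin singlet,
    measured along directions at angles a and b."""
    return -math.cos(a - b)

# Angle choices that maximize the violation
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

# CHSH combination: any local-realistic theory obeys |S| <= 2
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))            # 2.828...: the quantum prediction
print(2 * math.sqrt(2))  # Tsirelson's bound, saturated by these angles
```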

When a group of particles share spatial proximity, it can happen that the quantum states describing the individual particles merge, and the whole system must then be described by a single quantum state. Each particle can no longer be described independently of the state of the other ones. This property is called entanglement and persists independently of the spatial distribution of the system’s particles. As a result, the measurements of physical properties are correlated. In effect, measuring such a property of an entangled particle will instantaneously affect its entangled cousins—even if they are at the other end of the universe. Some mysterious structural connectivity, which appears to transcend space and time, glues entangled particles together. Bell’s theorem, building on the EPR paradox designed to invalidate quantum mechanics, has been experimentally verified (Freedman and Clauser 1972; Aspect et al. 1981; Giustina et al. 2013; Gröblacher et al. 2007; Hensen et al. 2015; The BIG Bell Test Collaboration 2018). Entanglement has been experimentally observed over greater and greater distances (Aspelmeyer et al. 2003; Yin et al. 2012, 2017), from 600 m to 1,200 km. However, this whole matter has inadvertently escaped the secure grounding of physics and has ventured into philosophy. As can be expected, discussions abound and the implications are still being debated (Wiseman 2014).

Today, entanglement plays a central role in quantum information theory and quantum computation. Specifically, quantum encryption crucially depends on a fundamental insight, known as the no-cloning theorem. The discovery of this theorem was historically connected to the issues surrounding entanglement. The consequential no-cloning theorem could have, however, been lost to humanity’s collective mind, were it not for an eccentric group of physicists at Berkeley in the 1970s (Kaiser 2011, p. xxiiiff.):

The group of hippies who formed the Fundamental Fysiks Group saved physics in three ways. First concerned style and method. [...] More than most of their generation, they sought to recapture the big-picture search for meaning that had driven their heroes—Einstein, Bohr, Heisenberg, and Schrödinger [...] Second, members of the Fundamental Fysiks Group latched onto a topic, known as “Bell’s theorem,” and rescued it from a decade of unrelenting obscurity. [...] The hippie physicists’ concerted push on Bell’s theorem and quantum entanglement instigated major breakthroughs—the third way they saved physics.

Indeed, relating to the group’s first contribution, during those years “physicists who showed any interest in the foundations of quantum mechanics labored under a ‘stigma,’ as powerful and keenly felt as any wars on religion or McCarthy-like political purges” (Kaiser 2011, p. 46). Concerning the second contribution, at that time, one “could find few physicists who seemed to care” about Bell’s theorem from 1964. One of the charter members of the group, the Berkeley theoretical physicist Henry Stapp—a collaborator of Wolfgang Pauli, Werner Heisenberg, and John Wheeler—was “in all likelihood the first physicist in the United States to pay attention to Bell’s theorem” (Kaiser 2011, p. 55). However (Kaiser 2011, p. xxv):

The most important [contribution of the Fundamental Fysiks Group] became known as the “no-cloning theorem,” a new insight into quantum theory that emerged from spirited efforts to wrestle with hypothetical machines dreamed up by members of the Fundamental Fysiks Group. Akin to Heisenberg’s famous uncertainty principle, the no-cloning theorem stipulates that it is impossible to produce perfect copies (or “clones”) of an unknown or arbitrary quantum state. Efforts to copy the fragile quantum state necessarily alter it.
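
The reason behind the theorem can be made explicit in a few lines of linear algebra (an illustrative sketch, not the historical proofs): any device that faithfully copies the two basis states must, by the linearity of quantum mechanics, map a superposition onto an entangled state rather than onto two independent copies.

```python
import numpy as np

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

def cloner(psi):
    """A would-be cloning machine, defined on the basis states
    (|0> -> |00>, |1> -> |11>) and extended by linearity."""
    return psi[0] * np.kron(ket0, ket0) + psi[1] * np.kron(ket1, ket1)

# Try to clone the superposition |+> = (|0> + |1>)/sqrt(2)
plus = (ket0 + ket1) / np.sqrt(2)

by_linearity = cloner(plus)      # (|00> + |11>)/sqrt(2): an entangled pair
true_copy = np.kron(plus, plus)  # |+>|+>: what a perfect clone would produce

print(np.allclose(by_linearity, true_copy))  # False: perfect cloning is impossible
```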

Notably (Kaiser 2011, p. xxv):

Less well known is that the no-cloning theorem emerged directly from the Fundamental Fysiks Group’s tireless efforts—at once earnest and zany—to explore whether Bell’s theorem and quantum entanglement might unlock the secrets of mental telepathy and extrasensory perception, or even enable contact with spirits of the dead.

In a nutshell, members of the Fundamental Fysiks Group learned about Bell’s obscure theorem in 1967. Entranced by this vision of non-locality, John Clauser worked on devising an experiment to test the theorem. He later succeeded with a collaborator (Freedman and Clauser 1972). Now, with the certainty that entanglement exists, the group brainstormed about the implications. To them, a logical conclusion was the possibility of faster-than-light information transfer. A potential application was drafted, called the “superluminal telegraph” (Herbert 1975). A matured version appeared seven years later (Herbert 1982). When this proposed experiment, demonstrating superluminal effects, was published, many physicists believed “that it should work” (Kaiser 2011, p. 224). Others worked hard to discover a loophole in the argumentation. Indeed, this loophole unexpectedly turned out to be the no-cloning theorem (Wootters and Zurek 1982; Dieks 1982; Ghirardi and Weber 1983). In summary (Kaiser 2011, p. 196):

The all important no-cloning theorem was discovered at least three times, by physicists working independently of each other. But each discovery shared a common cause: one of Nick Herbert’s remarkable schemes for a superluminal telegraph.

The novel insight launched a major technological advance, as the no-cloning theorem lies “at the heart of today’s quantum encryption technology” (Kaiser 2011, p. 196). Indeed (Kaiser 2011, p. 196f.):

Little could [the members of the Fundamental Fysiks Group] and others know that their dogged pursuit of faster-than-light communication—and the subtle reason for its failure—would help launch a billion-dollar industry.

Remarkably (Kaiser 2011, p. xvii):

Despite the significance of quantum information science today, the Fundamental Fysiks Group’s contributions lie buried still, overlooked and forgotten in physicists’ collective consciousness. [...] Indeed, from today’s vantage point it may seem shocking that anything of lasting value could have come from the hothouse of psychedelic drugs, transcendental meditation, consciousness expansion, psychic mind-reading, and spiritualist séances in which several members dabbled with such evident glee. History can be funny that way.

Although, at the time, the hippie physicists did attract a lot of attention (Kaiser 2011, p. xxiif.):

The inherent tensions that historians have begun to identify within the hippie counterculture [...] help explain the wide range of followers whom the Fundamental Fysiks Group inspired. Their efforts attracted equally fervent support from stalwarts of the military-industrial complex as from storied cultivators of flower power [...].

For more on entanglement and the history of quantum mechanics, see Sect.  4.3.4.

3.2.2 The Interpretation of Quantum Mechanics

To this day, the interpretation of quantum mechanics is a hotly debated issue. In other words, there exists no consensus about the ontology this theory is telling us about. For most physicists, the interpretation of quantum mechanics clearly lies in the domain of philosophy and is thus irrelevant to the success of the mathematical formalism in decoding the workings of nature: “Shut up and calculate!” (Sect.  2.2.1). Perhaps this attitude is best captured by one of the founders of quantum field theory (Kaiser 2011, p. 111f.):

Despite his wide-ranging interests, Feynman had long been skeptical about philosophy. One of his many beloved anecdotes, told and retold later in life, centered on his frustration with a philosophy course through which he had suffered as an undergraduate. [...] the thorny matters of how to interpret the quantum formalism were all “in the nature of philosophical questions. They are not necessary for the further development of physics.”

Not everyone appears to agree. In the words of Einstein (quoted in Becker 2018, p. 288):

So many people today—and even professional scientists—seem to me like somebody who has seen thousands of trees but has never seen a forest. A knowledge of the historic and philosophical background gives that kind of independence from prejudices of his generation from which most scientists are suffering. This independence created by philosophical insight is—in my opinion—the mark of distinction between a mere artisan or specialist and a real seeker after truth.

In any case, the mathematical formalism of quantum mechanics evolved in an orderly fashion:

  • Planck introduces quanta to explain black-body radiation (Planck 1901).

  • Einstein interprets light as being made up of quantized particles, called photons, winning him the Nobel prize (Einstein 1905).

  • Bohr computes the quantized orbits of electrons in hydrogen atoms (Bohr 1913).

  • Louis de Broglie presents his thesis arguing that particles are simultaneously waves and vice versa (De Broglie 1924).

  • Heisenberg devises the first mathematical description of quantum mechanics, called matrix mechanics (Heisenberg 1925).

  • Schrödinger rewrites de Broglie’s wave-particle duality in terms of probability amplitudes, called wave functions, and derives their wave equation (Schrödinger 1926a, b, c, d).

  • Paul Dirac introduces infinite-dimensional Hilbert spaces in which operators represent physical observables, uniting matrix mechanics with the mechanics of the wave functions (Dirac 1930).

In contrast, the conceptual understanding of the mathematical formalism and the assumptions about the true nature of quantum reality have remained highly controversial—to this day. The phenomena of quantum physics are very reluctant to fit into any coherent ontological framework. It is, however, very clear that not everything in our classical worldview can be right. Furthermore, is the weirdness encountered in quantum physics epistemic or ontic? The main themes of the philosophical challenges presented by the quantum world relate to:

  • the tension between causal and probabilistic laws;

  • the status of determinism;

  • the interpretation of unobserved entities;

  • the issue of local realism.

As discussed above, the status of local realism—a world in which reality is independent of observation and no faster-than-light signals exist—has taken a heavy hit. Indeed, it is not even a matter of choosing which attribute to believe in (Gröblacher et al. 2007):

Our result suggests that giving up the concept of locality is not sufficient to be consistent with quantum experiments, unless certain intuitive features of realism are abandoned.

To make matters worse, the exact level of entanglement appears fine-tuned (Clark 2017):

There’s nothing stopping the quantum world having different levels of underlying correlation—largely uncorrelated worlds are possible within the broad sweep of the theory, as are ones that are far more connected. But only a universe with the exact level of weirdness that corresponds to entanglement produces the rich tapestry of phenomena, including life, that ours does.

Finally, the Kochen-Specker theorem highlights another subtlety of the quantum world (Bell 1966; Kochen and Specker 1967). It is related to Bell’s famous theorem. In effect, quantum mechanics logically forces one to renounce one of the three following assumptions (Held 2018):

  1. All observables defined for a [quantum mechanical] system have definite values at all times.

  2. If a [quantum mechanical] system possesses a property (value of an observable), then it does so independently of any measurement context, i.e. independently of how that value is eventually measured.

  3. There is a one-one correspondence between properties of a quantum system and projection operators on the system’s Hilbert space.

The last assumption is, of course, the cornerstone of the mathematical formalism of quantum mechanics.

In 2011, a survey was taken at a conference on Quantum Physics and the Nature of Reality in Austria. Thirty-three participants—all experts on the matter—answered various questions (Schlosshauer et al. 2013). “Do you believe that physical objects have their properties well defined prior to and independent of measurement?” resulted in roughly a 50/50 split of opinions. 64% believed that randomness is a fundamental concept in nature. Relating to the measurement problem, opinions diverged. Some believed it was a pseudoproblem, others thought it was solved, while again others perceived it as a threat to quantum mechanics. To the question “What is your favorite interpretation of quantum mechanics?” 42% of respondents answered with “the Copenhagen interpretation,” representing the most popular choice. This was the first attempt at an orthodox interpretation, going back to Bohr in the late 1920s. One hallmark is that realism is abandoned. When unobserved, reality exists in a state of indeterminacy—things exist in a spooky superposition of possible states. It is as if reality is composed of ghost worlds interacting with each other. By observing reality, i.e., by measuring properties, this possibility space collapses into a single reality we can observe. Mathematically speaking, the probabilistic wave function, encoding the superposition of states, collapses and a definite reality is observed, in accordance with our classical world (Omnès 1994; Torretti 1999). This marks the transition between the quantum and the classical realms of reality. Puzzled by these notions, Einstein asked a fellow physicist whether he really believed that the moon exists only when he looks at it (Pais 1979). In time, other interpretations have been put forward. For instance, ranked by popularity in the 2011 survey:

  • Information-based/information-theoretical (see Sect.  13.2).

  • Everett (many worlds and/or many minds).

Interestingly (Becker 2018, p. 287):

Every interpretation has its critics (though the proponents of basically every non-Copenhagen interpretation are usually agreed that Copenhagen is the worst of the lot).

Quantum mechanics has turned some physicists into tea-leaf readers. David Deutsch interprets the interference pattern appearing in the double-slit experiment—a consequence of the bizarre fact that light and matter both behave like waves and particles—as conclusive proof of a new ontology. A breathtakingly vast new ontology, where reality is mind-numbingly bigger but most of it is invisible. In his words (Deutsch 1998, p. 46):

Single-particle interference experiments such as I have been describing show us that the multiverse exists [...].

Not everyone believes that the shadows in those experiments prove that the universe we inhabit is part of an unimaginable ensemble of universes, called the multiverse. The idea goes back to the thesis of Hugh Everett, who developed the many-worlds interpretation of quantum mechanics (Everett III 1957; DeWitt and Graham 1973). He denied that the wave function collapses at all and replaced this conceptual cornerstone with the radical new concept of reality splitting into branches, or worlds. In essence, there are infinitely many realities “out there,” existing in “parallel” to ours. “Every possible outcome of every possible quantum choice is realized in one world or another” (Gribbin 1999, p. 272). As an example, in Schrödinger’s cat thought experiment—aimed at illustrating the absurdity of the Copenhagen interpretation when applied to macroscopic objects, like cats (Schrödinger 1935)—the cat is in a superposition of states, meaning it is simultaneously alive and dead. Once an observer opens the box, the universe branches and two realities emerge, one where the cat is dead and one where the cat is alive. Despite the hefty ontological price one pays to resolve some of the quantum puzzles, the notion of the multiverse is very popular among cosmologists and string theorists (Susskind 2006; Carr 2007). Indeed, one of the last publications of the eminent cosmologist Stephen Hawking, appearing posthumously, argues for a multiverse (Hawking and Hertog 2018). But not everyone is convinced. For instance, the cosmologist Paul Steinhardt (Brockman 2015, p. 56ff.):

A pervasive idea in fundamental physics and cosmology that should be retired: the notion that we live in a multiverse in which the laws of physics and the properties of the cosmos vary randomly from one patch of space to another. According to this view, the laws and properties within our observable universe cannot be explained or predicted because they are set by chance. [...] Over the entire multiverse, there are infinitely many distinct patches. Among these patches, in the words of Alan Guth, “anything that can happen will happen—and it will happen infinitely many times”. Hence, I refer to this concept as a Theory of Anything.

Any observation or combination of observations is consistent with a Theory of Anything. No observation or combination of observations can disprove it. Proponents seem to revel in the fact that the theory cannot be falsified. The rest of the scientific community should be up in arms since an unfalsifiable idea lies beyond the bounds of normal science.

Why, then, consider a Theory of Anything, that allows any possibility, including complicated ones? The motivation is the failure of two favorite theoretical ideas—inflationary cosmology and string theory. Both were thought to produce a unique outcome.

Despite laudable efforts by many theorists to save the theory [inflation], there is no solid reason known today why inflation should cause our observable universe to be in a pocket with the smoothness and other very simple properties we observe.

Instead of predicting a unique possibility for the vacuum state of the universe and particles and fields that inhabit it, our current understanding of string theory is that there is a complex landscape of vacuum states corresponding to exponentially different kinds of particles and different physical laws. The set of vacuum space contains so many possibilities that, surely, it is claimed, one will include the right amount of vacuum energy and the right kinds of particles and fields.

I suspect that the theories would never have gained the acceptance they have if these problems had been broadly recognized at the outset. Historically, if a theory failed to achieve its goals, it was improved or retired. In this case, though, the commitment to the theories has become so strong that some prominent proponents have seriously advocated moving the goalposts.

I draw the line there. Science is useful insofar as it explains and predicts why things are the way they are and not some other way. [...] A Theory of Anything is useless because it does not rule out any possibility and worthless because it submits to no do-or-die tests.

Because an unfalsifiable Theory of Anything creates unfair competition for real scientific theories, leaders in the field can play an important role by speaking out—making it clear that Anything is not acceptable—to encourage talented young scientists to rise up and meet the challenge.

Taking the many-worlds interpretation to the next level is the many-minds interpretation (Zeh 1970). It proposes that the distinction between the worlds should be made at the level of the mind of an individual observer. In this version, the human minds branch into infinity. At the end of the day, every interpretation is exactly that: an interpretation. They all account for the status quo without offering any testable prediction or new tangible insight. There is no way of knowing what is actually going on at the quantum level of reality. The amount of intellectual effort—scientific and philosophical—going into this debate is as astounding as it is inconclusive (Bohm and Stapp 1993; Omnès 1994, 1999; Reichenbach 1998; Kosso 1998; Torretti 1999; Maudlin 2007, 2011; Jaeger 2009; Gisin 2014; Lewis 2016; Rickles 2016; d’Espagnat and Zwirn 2017; Becker 2018). The range of ideas is impressive, incorporating mystic notions, for instance, the preferred status of consciousness (Kosso 1998; Stapp 2011) or the concept of holism (Lewis 2016).

Making matters worse is a batch of troubling quantum experiments. The epitome of such a mind-bending experiment is Wheeler’s delayed-choice experiment (Wheeler 1978). In essence, a choice made now by an observer can change or edit the past of a photon. Indeed, a choice made now can, in principle, affect the past at arbitrarily distant times. In the words of Wheeler (quoted in Jacques et al. 2007):

[W]e have a strange inversion of the normal order of time. We, now, by moving the mirror in or out [in the experimental setup] have an unavoidable effect on what we have a right to say about the already past history of that photon.

Again, the experiments show that quantum mechanics is correct (Hellmuth et al. 1987; Lawson-Daku et al. 1996; Jacques et al. 2007; Manning et al. 2015) and, again, the interpretations of the strange reality they tell us about are inconclusive (Becker 2018). In a modified version of the delayed-choice experiment, the authors conclude (Ma et al. 2012):

If one views the quantum state as a real physical object, one could get the seemingly paradoxical situation that future actions appear as having an influence on past and already irrevocably recorded events.

Immediately, they also offer their own interpretation proposing a solution to the infuriating enigma:

However, there is never a paradox if the quantum state is viewed as to be no more than a “catalogue of our knowledge.”

Other researchers have tried to combine the effects of quantum mechanics with special relativity. They conclude (Stefanov et al. 2002):

This [...] stresses the oddness of quantum correlations. Not only are they independent of the distance, but also it seems impossible to cast them in any real time ordering. [...] Hence one can’t maintain any causal explanation in which an earlier event influences a later one by arbitrarily fast communication. In this sense, quantum correlations are a basic (i.e. primary) concept, not a secondary concept reducible to that of causality between events: Quantum correlations are directly caused by the quantum state in such a way that one event cannot be considered the “cause” and the other the “effect”.

Finally, the bizarre quantum effects have been brought closer to our classical world by coaxing comparably large objects into quantum states, like buckyballs (Nairz et al. 2003), and by entangling millimeter-sized diamonds (Lee et al. 2011).

This is a truly unexpected turn of events. By stumbling upon the quantum realm, all intuition and common sense are threatened. Determinism, causality, the arrow of time, a mind-independent reality, and spatial separation all appear at odds with the quantum reality we can so accurately measure. In the words of the philosopher of science and mathematician Tim Maudlin (Maudlin 2011, p. 223):

One way or another, God has played us a nasty trick. The voice of Nature has always been faint, but in this case it speaks in riddles and mumbles as well. Quantum theory and Relativity seem not to directly contradict one another, but neither can they be easily reconciled. Something has to give: either Relativity or some foundational element of our world-picture must be modified. Physicists may glory in the challenge of developing radically new theories in which non-locality and relativistic space-time structure can more happily co-exist. Metaphysicians may delight in the prospect of fundamentally new ontologies, and in the consequent testing and stretching of conceptual boundaries. But the real challenge falls to the theologians of physics, who must justify the ways of a Deity who is, if not evil, at least extremely mischievous.

However, there is a glimmer on the horizon. In the survey of Schlosshauer et al. (2013), 76% of respondents identified quantum information as “a breath of fresh air for quantum foundations.” See Chap.  13 for more details on an information-theoretic reality and Sect. 13.2.1 for the implications for quantum mechanics. See also Sect.  14.4.1 for the idea of QBism.

A final contentious issue in the interpretation of quantum mechanics is the notion of free will. For a detailed discussion in the context of quantum mechanics and neuroscience, see Sect. 11.4.1.

4 The Nature of Reality

The analysis of the structure of reality at small and large scales has unearthed a dramatic fact: the nature of reality is unknown to the human mind. The insights of millennia about the nature of reality have been discredited. We are left with glimpses of incompatible fragments of reality floating in a void of the unknown. The very notion of materialism now appears misguided. Our worldview has shattered. The current paradigm shift we are witnessing is momentous. To summarize (Davies and Gribbin 2007):

It is fitting that physics—the science that gave us materialism —should also signal the demise of materialism. During this century the new physics has blown apart the central tenets of materialist doctrine in a sequence of stunning developments. First came the theory of relativity, which demolished Newton’s assumptions about space and time—assumptions that still hold sway in our everyday “common-sense” view of the world. The very arena in which the clockwork Universe acted out its drama was now exposed as subject to shifting and warping. Then came quantum theory, which totally transformed our image of matter. The old assumption that the microscopic world of atoms was simply a scaled-down version of the everyday world had to be abandoned. Newton’s deterministic machine was replaced by a shadowy and paradoxical conjunction of waves and particles, governed by the laws of chance rather than the rigid rules of causality. An extension of the quantum theory, known as quantum field theory, goes beyond even this; it paints a picture in which solid matter dissolves away, to be replaced by weird excitations and vibrations of invisible field energy. In this theory, little distinction remains between material substance and apparent empty space, which itself seethes with ephemeral quantum activity.

It is then perhaps no wonder that more sympathetic physicists, open to the conceptual and philosophical challenges at hand, have developed rather ambiguous relationships with reality. The documentary film Das Netz,Footnote 29 by Lutz Dammbeck, chronicles the emergence of the Internet and highlights potential ties to art and culture. To that end, the filmmaker interviewed various artists, counterculture figures, psychonauts, scientists, and the infamous neo-luddite known as the Unabomber. The physicist and philosopher Heinz von Foerster, known for his foundational work on second-order cybernetics (Von Foerster 2003) and his radical constructivism,Footnote 30 was also featured. He was ninety years old at the time of the interview in 2002. Following is the transcript (beginning approximately at one hour and twelve minutes, translation mine):

Von Foerster alleges that there is no foundation to science and that all theories are correct, as they are just stories which are deduced from other stories.

Dammbeck: What will this all lead to? How will things proceed?

Von Foerster: With eternal deduction.

Dammbeck: But there have to be limits somewhere?

Von Foerster: Precisely not. That’s the beauty of it. You can always proceed.

Dammbeck: In logic.

Von Foerster: Yes, precisely.

Dammbeck: But in reality?

Von Foerster: [clearly agitated] Where is this reality? Where do you find it?

This would be his last interview. Von Foerster died in October of that year. Others have been calmer in their assertions. Anton Zeilinger, one of the pioneers of quantum information, states (Zeilinger 2010, p. 266):

So in general, we have to conclude that while some commonsense pictures of the world are not tangible anymore in view of quantum physics, it is not really clear how a new view of the world would work. One point is clear. The predictions of quantum mechanics are so precisely confirmed in all experiments that it is very unlikely, to say the least, that quantum mechanics is an incorrect description of nature.

Even string theory took an unexpected turn with the discovery of the amplituhedron (Arkani-Hamed and Trnka 2014). This is a geometric structure encoding the probability of particle interactions. In detail, the scattering amplitude of particles corresponds to the volume of this object. A simple conception is that amplituhedrons are “Feynman diagrams on steroids.” The only problem is that one has to give up the entire notion of space-time.

4.1 Does Matter Exist?

Perhaps the most obvious trick that reality plays is the illusion of the solidity of objects. The tangible aspect of material objects, the very sensation of the physical, is based on cloaking nothingness. Consider a hydrogen atom in its ground state. This is simply a proton orbited by an electron. Note that a proton is made up of other elementary particles (quarks), while the electron is itself an elementary particle. Moreover, approximately 99.95% of the atom’s mass is due to the presence of the heavy proton. The radius of a hydrogen atom is given by the Bohr radius, and the newest measurements of the radius of the proton can be found in Pohl et al. (2010). By calculating the corresponding volumes of the hydrogen atom and the proton, utilizing \(V = 4/3 \pi r^3\), the following is revealed: 99.9999999999996% of a hydrogen atom is empty space. Solidity is not a result of an actual matter content, but a property resulting from the interactions of electrons. In fact, loosely stated, chemistry can be considered the science of studying how the mysterious sharing of atoms’ electrons results in tangible molecular structures. Furthermore, we are not even sure about the actual size of protons. The proton radius puzzle is a result of the discrepancy between two methods of measurement (Pohl et al. 2010, 2016). Yet again, we are to conclude (Stajic 2016):

This independent discrepancy points to [an] experimental or theoretical error or even to physics beyond the standard model.
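
The emptiness figure follows from the ratio of two sphere volumes. A minimal sketch of the arithmetic (assuming the Bohr radius for the atom and the muonic-hydrogen proton radius of roughly 0.84 fm reported by Pohl et al. 2010):

```python
import math

a0  = 5.29e-11   # Bohr radius [m]
r_p = 0.84e-15   # proton charge radius [m], roughly the Pohl et al. (2010) value

def volume(r):
    """Volume of a sphere, V = 4/3 * pi * r^3."""
    return 4.0 / 3.0 * math.pi * r**3

filled = volume(r_p) / volume(a0)  # fraction of the atom occupied by the proton
print(f"{(1.0 - filled) * 100:.13f}% empty")  # ~99.9999999999996% empty space
```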

Philosophers of science, confronted with the emerging puzzles related to a solid foundation of reality, but not necessarily invested in the current scientific paradigm, have begun to raise questions (Davies and Gregersen 2014):

[O]ne begins to wonder whether there is something fundamentally flawed in the idea of a world built up out of matter [p. 50].

One has the sense that, at the end of the day, the speculation of the philosophers and the data from the scientists are pointing in the same surprising direction. At the root of all physical reality is not “primary matter” [p. 72].

Some physicists have also begun to doubt.

The renowned physicist Hans-Peter Dürr was first a student, then an assistant and collaborator, and finally a friend of Heisenberg. In 1978, Dürr became his successor as the director of the Max Planck Institute for Physics in Munich. Next to his professional work on quantum mechanics, Dürr’s interests included the philosophical implications of quantum physics (Dürr 1986) and environmental issues. Towards the end of his life, he espoused a mystical view of reality and made the bold claim that matter does not exist (Dürr 2012, p. 44f., translation mine):

As a physicist I have spent fifty years—my entire research career—asking myself what exactly underlies matter. The final outcome is simple: Matter does not exist! Therefore, I have worked for fifty years on a notion that is nonexistent. This was an extraordinary experience: learning that something, whose reality everyone is convinced of, in the end, does not exist.

[...]

These crises [in the interpretation of quantum mechanics] are all related to the fact that we have an absolutely incorrect understanding of the world. We have let ourselves get squeezed into a tight conception of reality which does not possess any solutions.

[...]

At the core of our reality there is no foundation, but a source, something alive.

In more cautious words (Davies and Gregersen 2014, p. 72):

One has the sense that, at the end of the day, the speculation of the philosophers and the data of scientists are pointing in the same surprising direction. At the root of all physical reality is not “primary matter” or little atoms of “stuff.”

From a philosophical point of view, there have also been propositions to abandon the notion of tangible elementary particles. The idea is that physics is forcing us to resort to fictitious concepts in describing fundamental properties of reality. Indeed (Davies and Gribbin 2007, p. 21):

Generally, the more science moves away from common sense, the harder it is to decide what constitutes a mere model and what is supposed to be a faithful description of the real world.

In particular, quantum fields do not yield a satisfactory ontology of the physical world (Kuhlmann 2010). They are a wonderful mathematical tool, but lack any intrinsic reality. However, with what should we then replace the notion of particles? The answer comes from structural realism (see also Sects. 2.2.1 and 6.2.2). The only information we can pry from nature is how things are related to one another. The true nature of the things themselves is always hidden, but the network of relations can be known and is real. This strong version of structuralism is called ontic structural realism (Kuhlmann 2010; Esfeld and Lam 2010; Morganti 2011). Recall the success of networks in describing complex phenomena (Sect. 5.2.3). Perhaps they can also be utilized in the description of the fundamental realm of reality (in the spirit of the fundamental-algorithmic knowledge generation, seen in Fig. 5.9 of Sect. 5.4.1). This is reminiscent of the spin networks introduced in Sect. . Indeed, “ontic structural realism has become the most fashionable ontological framework for modern physics” (Kuhlmann 2015). Again, in the words of the philosopher of science Meinard Kuhlmann (Kuhlmann 2013):

You may find it is strange that there could be relations without relata—without any objects that stand in that relation. It sounds like having a marriage without spouses. [...] All in all, structural realism is a provocative idea but needs to be developed further before we will know whether it can rescue us from our interpretive trouble.

The problematic notion of matter, its history and philosophy, is also discussed in detail in Davies and Gregersen (2014) in the context of information. This theme will reemerge in Chap. 13.

4.2 Is Time an Illusion?

If the notion of matter appeared thorny, then the idea of time is truly vexing. Yet again, the human mind is confronted with a deep and upsetting paradox. Time is a concept so familiar and immediate, so fundamental to existence, yet it emerges as inherently incomprehensible, transcending any formal understanding. Indeed (Du Sautoy 2016, p. 241):

Most attempts to define time very quickly run into difficulties that become quite circular. [...] The fourth-century theologian St. Augustine summed up the difficulty in his Confessions: “What then is time? If no one asks me, I know: if I wish to explain it to one that asketh, I know not.”

In the context of science (Cham and Whiteson 2017, p. 140):

[Q]uestions about the nature of time are very deep, and the answers have the potential to shake the very foundations of modern physics. [...] This topic is so out there that very few scientists are working on it directly. It is mostly the province of emeritus professors and a few dedicated younger researchers willing to wade into such risky territory.

Without time, nothing can happen. Yet, what is “now?” And why does it appear to be eternally locked in the delicate and ephemeral transition from the future to the past? Indeed, what aligns the arrow of time in the first place? The nature of time represents the final crisis in the exploration of the true nature of reality. It has challenged physicists and philosophers alike (Falk 2008, p. 272f.):

And yet some of the most basic questions about the nature of time remain unanswered. To begin with, there’s that pesky issue of time “flowing.” Does time truly “pass by” in some tangible way? It is an ancient question, one that begins in earnest with the conflicting views of Parmenides and Heraclitus; one that has troubled the greatest minds from Augustine to Newton, from Kant to Einstein. Is time nothing more than change? Or is it more fundamental—is it the mysterious entity that makes change possible, a kind of foundation on which the universe is built? Or is it just the opposite: as much as we like to speak of the “river of time,” could the river be dry, its flow an illusion?

The laws of physics have always been at odds with time. For instance, the immutable direction of time, flowing from the future into the past, finds no correspondence in the laws of physics. Our formal mathematical representations of reality are all agnostic to the direction of the flow of time. Technically, they are symmetric under time reversal.Footnote 31 In detail (Zeh 2007, p. 1):

The asymmetry of Nature under a “reversal of time” (that is, a reversal of motion and change) appears only too obvious, as it deeply affects our own form of existence. If physics is to justify the hypothesis that its laws control everything that happens in Nature, it should be able to explain (or consistently describe) this fundamental asymmetry which defines what may be called a direction in time or even—as will have to be discussed—a direction of time. Surprisingly, the very laws of Nature are in pronounced contrast to this fundamental asymmetry: they are essentially symmetric under time reversal. It is this discrepancy that defines the enigma of the direction of time [...].

The existence of the arrow of time is usually explained as follows. The universe started its existence after the Big Bang in a state of extremely low entropy—characterized by perfect order. Ever since, the second law of thermodynamics has been relentlessly driving the universe to higher levels of entropy and disorder. In effect, the arrow of time emerges from the special initial conditions at the birth of the universe (Reichenbach 1999; Zeh 2007). However, in this explanatory framework one has to account for this low-entropy beginning. Some theorists have argued that this, in fact, leads to two arrows of time, where there is also a backwards flow of time from the past into the future. The universe is basically time-symmetric (Carroll and Chen 2004). Recall from above that Feynman took very seriously the interpretation that a positron is an electron flowing backwards in time (Sect. ). The notion of time flowing backwards has also appeared in another attempt to understand the nature of time. By putting space and time on equal footing, in other words, by restoring the symmetry between space and time, forward and backward arrows of time appear (Vaccaro 2016).

Perhaps the most devastating blow to the concept of time came from Einstein. Special relativity posits that time is a local event for every observer. Depending on the speed and gravitational exposure of a reference frame, the flow of time will be altered in comparison to other reference frames. In other words, the notion of simultaneity becomes arbitrary. Different observers will always argue about what is happening “now.” This implies that an observer’s potential future can already have unfolded in another observer’s past. Furthermore, the space-time continuum is now an atemporal, static block universe. There exists no “now,” or, equivalently, all “nows” are equal (Sect. 3.2.1). Einstein believed in his theory. Two weeks before his death, he wrote: “For those of us who believe in physics, the distinction between past, present and future is only a stubbornly persistent illusion” (Wuppuluri and Ghirardi 2017, p. 469). In a nutshell (Slezak 2013):

We might think of time flowing from a real past into a not-yet-real future,Footnote 32 but our current theories of space and time teach us that past, present and future are all equally real—and fundamentally indistinguishable. Any sense that our “now” is somehow special, or that time flows past it, is an illusion we create in our heads.

Naturally, not everyone agrees; see, for instance, Ellis and Rothman (2010). Einstein’s time legacy does not stop with special relativity. A particular solution to the equations of general relativity was discovered, allowing for closed time-like curves (Gödel 1949). Essentially, these valid solutions imply the possibility of time travel, backwards in time. Then, the strange quantum experiments described above (Sect. ) have also strongly indicated that a causal time-ordering is hard to uphold. The specter of retrocausality raises its head. Moreover, what we know about time from quantum gravity is also troubling. Recall that the Wheeler-DeWitt equations (Sect. )—combining general relativity and quantum mechanics and reappearing in loop quantum gravity—leave out time altogether. The theory predicts a static state of the universe. Modern theoretical results, based on merging quantum mechanics and general relativity, have also not been helpful. By entangling quantum clocks with gravity, researchers discovered an inherent fuzziness of time. Any clock that is used to measure time will inadvertently “blur” the flow of time in its surrounding space (Ruiz et al. 2017). Going further, some physicists have argued that time is not a fundamental property of the universe: it is an emergent feature of reality. Specifically, time is a side-effect of quantum entanglement (Page and Wootters 1983; Moreva et al. 2014). Others, again, have simply denied the reality of time.

The philosophy of time offers two basic approaches to time (McTaggart 1927). The A-theory of time simply claims that time flows from the future, through the present, into the past. This is what our naive intuition and perception tell us. In contrast, the B-theory of time speaks of a tenseless time. The flow of time is an illusion, and the past, the present, and the future are all equally real. In this sense, Einstein is alive. This counterintuitive view of time is what physics appears to be telling us. The physicist Julian Barbour is a staunch advocate, defending time’s illusory nature (Barbour 1999, 2001). In his words (quoted in Steele 2013):

The flow of time is an illusion, and I don’t know very many scientists and philosophers who would disagree with that, to be perfectly honest. The reason that it is an illusion is when you stop to think, what does it even mean that time is flowing? When we say something flows like a river, what you mean is an element of the river at one moment is in a different place of an earlier moment. In other words, it moves with respect to time. But time can’t move with respect to time—time is time. A lot of people make the mistake of thinking that the claim that time does not flow means that there is no time, that time does not exist. That’s nonsense. Time of course exists. We measure it with clocks. Clocks don’t measure the flow of time, they measure intervals of time. Of course there are intervals of time between different events, that’s what clocks measure.

Time and space are the framework in which we formulate all of our current theories of the universe, but there is some question as to whether these might be emergent or secondary qualities of the universe. It could be that fundamentally the laws of the universe are formulated in terms of some sort of pre-space and time, and that space-time comes out of something more fundamental.

This dichotomy between space-time being emergent, a secondary quality—that something comes out of something more primitive, or something that is at the rock bottom of our description of nature—has been floating around since before my career. John Wheeler believed in and wrote about this in the 1950s.

The problem is that we don’t have any sort of experimental hands on that. You can dream up mathematical models that do this for you, but testing them looks to be pretty hopeless.

Barbour’s ideas are not mainstream. However, his theorizing about time also flows into speculations about quantum gravity (Falk 2008, p. 149):

Part of the problem with “time,” [Barbour] explains, is that our two best theories—general relativity and quantum theory—treat it very differently. “It’s like two children sort of quarreling over a toy they want,” he says. “But the trouble is, each wants something different.” He believes the only solution [for a theory of quantum gravity] is to remove the toy. We have to abandon the notion of time.

His views have influenced some proponents of loop quantum gravity (Falk 2008, p. 149):

[Smolin] has said in the past that Barbour, in particular, has been a “philosophical guru” for him. He especially admires Barbour’s approach to quantum gravity; many who have tackled the issue have displayed “sloppy thinking,” Smolin says, while Barbour has “really thought it through.”

Another pioneer of loop quantum gravity, Carlo Rovelli, agrees with Barbour’s assessment about the reality of time (Callender and Huggett 2001, p. 114):

At the fundamental level, we should, simply, forget time.

A claim Rovelli is also defending in his recent book (Rovelli 2018). In contrast, Smolin is still trying to save time (Smolin 2013; Unger and Smolin 2015). He argues (Brockman 2009, p. 149):

It is becoming clear to me that the mystery of the nature of time is connected with other fundamental questions such as the nature of truth in mathematics and whether there must be timeless laws of nature. Rather than being an illusion, time may be the only aspect of our present understanding of nature that is not temporary and emergent.

Another angle of attack on time can be found in Muller (2016). There, time itself is expanding, and the edge of newly created time is what we experience as “now.”

In the end, the mystery of time breaks down at the border between objectivity and subjectivity. Indeed, this boundary appears as a major fault line in the human narrative of the world. In the words of Schülein and Reitze (2002, p. 174, translation mine):

Objectivity is the illusion that observations are made without an observer.

The objective description of time tells a fundamentally different story from the one our senses tell us. This has led some thinkers to relabel the problem of time as a problem of consciousness (Falk 2008, p. 273):

Perhaps millions of years of biological evolution, coupled with thousands of years of cultural and linguistic evolution, have shaped our minds in such a way that we imagine such a flow [of time] where none exists.

[...]

Is the passage of time something our brains assemble out of a swirl of sensory data and then present to us as though it were real? Is the process so efficient, perhaps, that we imagine that the finished product was “out there” all along? For some thinkers, the “self” itself is such a construction—in which case time might simply be one facet of a much richer cognitive assembly.

The “now” is the crux. Our consciousness appears to be inevitably embedded in the flow of time—indeed, the “now” is the only arena consciousness can act in—whereas reality itself seems less restrictive on the causal ordering of events. This insight can be experimentally and theoretically substantiated. The enigma of consciousness is the topic of the next chapter.