New Papers

Time, space and matter in the primordial universe

Time, space, and matter are categories of our reasoning, whose properties appear to be fundamental. These categories nonetheless demand scrutiny: in the extreme regime of the primordial universe they exhibit quantum properties. What does it mean for time to be quantum? What does it mean for space? Are space and time disappearing, or is it merely the categories we have been using to understand them that disappear? Concepts such as the superposition of causal structures or the quantum granularity of space require our attention and must be clarified if we are to understand the physics of the primordial universe. The novelty this brings requires us to reflect on matter as well: How can matter be defined on a granular space? Is quantum gravity pointing us toward new types of matter? The answers to these questions, which touch the foundations of physics and the very concepts with which we organize our understanding of reality, ultimately require confrontation with empirical data. And for that, the universe itself provides us with the best of possible laboratories.

Metriplectic geometry for gravitational subsystems

In general relativity, it is difficult to localise observables such as energy, angular momentum, or centre of mass in a bounded region. The difficulty is that there is dissipation. A self-gravitating system, confined by its own gravity to a bounded region, radiates some of the charges away into the environment. At a formal level, dissipation implies that some diffeomorphisms are not Hamiltonian. In fact, there is no Hamiltonian on phase space that would move the region relative to the fields. Recently, an extension of the covariant phase space has been introduced to resolve the issue. On the extended phase space, the Komar charges are Hamiltonian. They are generators of dressed diffeomorphisms. While the construction is sound, the physical significance is unclear. We provide a critical review before developing a geometric approach that takes into account dissipation in a novel way. Our approach is based on metriplectic geometry, a framework used in the description of dissipative systems. Instead of the Poisson bracket, we introduce a Leibniz bracket – a sum of a skew-symmetric and a symmetric bracket. The symmetric term accounts for the loss of charge due to radiation. On the metriplectic space, the charges are Hamiltonian, yet they are not conserved under their own flow.
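
Schematically, the metriplectic structure replaces Hamilton's equation $\dot F = \{F, H\}$ with evolution under a Leibniz bracket,

$$ (F, G)_L = \{F, G\} + (F, G)_+, \qquad \dot F = (F, H)_L, $$

where $\{\cdot,\cdot\}$ is skew-symmetric and $(\cdot,\cdot)_+$ is symmetric. A charge $Q$ then still generates its flow, yet fails to be conserved under it: the skew part drops out of $\dot Q = (Q, Q)_L = (Q, Q)_+$, and the symmetric remainder models the charge radiated into the environment. (A sketch of the general structure described in the abstract, not the paper's full construction.)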

Radiative corrections to the Lorentzian EPRL spin foam propagator

We numerically estimate the divergence of several two-vertex diagrams that contribute to the radiative corrections for the Lorentzian EPRL spin foam propagator. We compute the amplitudes as functions of a homogeneous cutoff over the bulk quantum numbers, with fixed boundary data and different Immirzi parameters, and find that, within the class of two-vertex diagrams considered, those with fewer than six internal faces are convergent. The calculations are done with the numerical framework sl2cfoam-next.
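
In such studies the degree of divergence is typically estimated by evaluating the amplitude at a sequence of cutoffs $\Lambda$ and fitting a power law $A(\Lambda) \sim \Lambda^{p}$: a positive fitted exponent signals divergence, a negative one convergence. A minimal sketch of that fitting step in Python, with a hypothetical placeholder amplitude standing in for the actual sl2cfoam-next computation:

    import numpy as np

    # Hypothetical placeholder: in the actual calculation this value would
    # come from evaluating the spin foam amplitude with sl2cfoam-next at a
    # homogeneous cutoff on the bulk quantum numbers.
    def amplitude(cutoff, degree=2.0):
        return cutoff ** degree

    cutoffs = np.array([10.0, 20.0, 40.0, 80.0, 160.0])
    values = np.array([amplitude(c) for c in cutoffs])

    # If A(Lambda) ~ Lambda^p, the slope of log A against log Lambda
    # estimates p: p > 0 indicates divergence, p < 0 convergence.
    p, _ = np.polyfit(np.log(cutoffs), np.log(values), 1)
    print(f"estimated degree of divergence: p = {p:.2f}")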

Tabletop Experiments for Quantum Gravity Are Also Tests of the Interpretation of Quantum Mechanics

Recently there has been a great deal of interest in tabletop experiments intended to exhibit the quantum nature of gravity by demonstrating that it can induce entanglement. We argue that these experiments also provide new information about the interpretation of quantum mechanics: under appropriate assumptions, $\psi$-complete interpretations will generally predict that these experiments will have a positive result, $\psi$-nonphysical interpretations predict that these experiments will not have a positive result, and for $\psi$-supplemented models there may be arguments for either outcome. We suggest that a positive outcome to these experiments would rule out a class of quantum gravity models that we refer to as $\psi$-incomplete quantum gravity (PIQG) – i.e. models of the interaction between quantum mechanics and gravity in which gravity is coupled to non-quantum beables rather than quantum beables. We review some existing PIQG models and consider what more needs to be done to make these sorts of approaches more appealing, and finally we discuss a cosmological phenomenon which could be regarded as providing evidence for PIQG models.
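
For context, the experiments at issue (in the mould of the Bose et al. and Marletto-Vedral proposals) place two masses in spatial superposition and let gravity imprint branch-dependent phases of order

$$ \phi_{ij} \sim \frac{G m_1 m_2\, t}{\hbar\, d_{ij}}, $$

where $d_{ij}$ is the separation between branch $i$ of one mass and branch $j$ of the other; entanglement develops whenever these phases fail to factorize into single-mass contributions. (This is the standard back-of-the-envelope estimate for this class of experiments, not an analysis from the paper itself.)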

The nonequilibrium cost of accurate information processing

Accurate information processing is crucial both in technology and in nature. To achieve it, any information processing system needs an initial supply of resources away from thermal equilibrium. Here we establish a fundamental limit on the accuracy achievable with a given amount of nonequilibrium resources. The limit applies to arbitrary information processing tasks and arbitrary information processing systems subject to the laws of quantum mechanics. It is easily computable and is expressed in terms of an entropic quantity, which we name reverse entropy, associated with a time reversal of the information processing task under consideration. The limit is achievable for all deterministic classical computations and for all their quantum extensions. As an application, we establish the optimal tradeoff between nonequilibrium and accuracy for the fundamental tasks of storing, transmitting, cloning, and erasing information. Our results set a target for the design of new devices approaching the ultimate efficiency limit, and provide a framework for demonstrating thermodynamic advantages of quantum devices over their classical counterparts.
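
A familiar special case anchors this tradeoff (a textbook illustration, not the paper's general bound): by Landauer's principle, erasing one bit in contact with a bath at temperature $T$ costs work

$$ W \geq k_B T \ln 2, $$

so a finite supply of nonequilibrium resources bounds the number of bits that can be erased perfectly, and any further erasure must trade resources against accuracy.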

Watching the Clocks: Interpreting the Page-Wootters Formalism and the Internal Quantum Reference Frame Programme

We discuss some difficulties that arise in attempting to interpret the Page-Wootters and Internal Quantum Reference Frames formalisms, then use a ‘final measurement’ approach to demonstrate that there is a workable single-world realist interpretation for these formalisms. We note that it is necessary to adopt some interpretation before we can determine if the ‘reference frames’ invoked in these approaches are operationally meaningful, and we argue that without a clear operational interpretation, such reference frames might not be suitable to define an equivalence principle. We argue that the notion of superposition should take into account the way in which an instantaneous state is embedded in ongoing dynamical evolution, and this leads to a more nuanced way of thinking about the relativity of superposition in these approaches. We conclude that typically the operational content of these approaches appears only in the limit as the size of at least one reference system becomes large, and therefore these formalisms have an important role to play in showing how our macroscopic reference frames can emerge out of wholly relational facts.

Two Roads to Retrocausality

In recent years the quantum foundations community has seen increasing interest in the possibility of using retrocausality as a route to rejecting the conclusions of Bell’s theorem and restoring locality to quantum physics. On the other hand, it has also been argued that accepting nonlocality leads to a form of retrocausality. In this article we seek to elucidate the relationship between retrocausality and locality. We begin by providing a brief schema of the various ways in which violations of Bell’s inequalities might lead us to consider some form of retrocausality. We then consider some possible motivations for using retrocausality to rescue locality, arguing that none of these motivations is adequate and that therefore there is no clear reason why we should prefer local retrocausal models to nonlocal retrocausal models. Next, we examine several different conceptions of retrocausality, concluding that ‘all-at-once’ retrocausality is more coherent than the alternative dynamical picture. We then argue that since the ‘all-at-once’ approach requires probabilities to be assigned to entire histories or mosaics, locality is somewhat redundant within this picture. Thus we conclude that using retrocausality as a way to rescue locality may not be the right route to retrocausality. Finally, we demonstrate that accepting the existence of nonlocality and insisting on the nonexistence of preferred reference frames leads naturally to the acceptance of a form of retrocausality, albeit one which is not mediated by physical systems travelling backwards in time. We argue that this is the more natural way to motivate retrocausal models of quantum mechanics.

Entanglement-asymmetry correspondence for internal quantum reference frames

In the quantization of gauge theories and quantum gravity, it is crucial to treat reference frames such as rods or clocks not as idealized external classical relata, but as internal quantum subsystems. In the Page-Wootters formalism, for example, evolution of a quantum system S is described by a stationary joint state of S and a quantum clock, where time-dependence of S arises from conditioning on the value of the clock. Here, we consider (possibly imperfect) internal quantum reference frames R for arbitrary compact symmetry groups, and show that there is an exact quantitative correspondence between the amount of entanglement in the invariant state on RS and the amount of asymmetry in the corresponding conditional state on S. Surprisingly, this duality holds exactly regardless of the choice of coherent state system used to condition on the reference frame. Averaging asymmetry over all conditional states, we obtain a simple representation-theoretic expression that admits the study of the quality of imperfect quantum reference frames, quantum speed limits for imperfect clocks, and typicality of asymmetry in a unified way. Our results shed light on the role of entanglement for establishing asymmetry in a fully symmetric quantum world.
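
For orientation, the Page-Wootters formalism referenced here imposes a constraint on the joint clock-system state and recovers dynamics by conditioning (the standard construction, stated schematically):

$$ (\hat H_C + \hat H_S)\,|\Psi\rangle = 0, \qquad |\psi_S(t)\rangle \propto \big(\langle t|_C \otimes \mathbb{1}_S\big)\,|\Psi\rangle, $$

so that the conditional state $|\psi_S(t)\rangle$ obeys the Schrödinger equation in the clock reading $t$. The paper's correspondence relates the entanglement of the invariant state $|\Psi\rangle$ to the asymmetry of such conditional states.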

Controlling wave-particle duality with quantum entanglement

Wave-particle duality and entanglement are two fundamental characteristics of quantum mechanics. All previous experimental investigations of the wave-particle properties of single photons (or single particles in general) show that a well-defined interferometer setting determines a well-defined property of single photons. Here we take a conceptual step forward and control the wave-particle property of single photons with quantum entanglement. By doing so, we experimentally test the complementarity principle in a scenario in which the setting of the interferometer is not defined at any instant of the experiment, not even in principle. To achieve this goal, we send the photon of interest (S) into a quantum Mach-Zehnder interferometer (MZI), in which the output beam splitter of the MZI is controlled by the quantum state of a second photon (C), which is entangled with a third photon (A). Therefore, the individual quantum state of photon C is undefined, which implements an undefined setting of the MZI for photon S. This is realized by using three cascaded phase-stable interferometers for three photons. There is then typically no well-defined setting of the MZI, and thus the very formulation of well-defined wave-particle properties for photon S becomes internally inconsistent.
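
Schematically, the setup builds on the quantum delayed-choice idea of Ionicioiu and Terno: when the output beam splitter is controlled by an ancilla C, the joint state takes the form

$$ |\Psi\rangle = \cos\alpha\,|\mathrm{wave}\rangle_S\,|1\rangle_C + \sin\alpha\,|\mathrm{particle}\rangle_S\,|0\rangle_C, $$

with the convention that $|1\rangle_C$ marks the beam splitter present and $|0\rangle_C$ absent. Entangling C further with photon A, as in this experiment, leaves even this control state undefined. (A sketch of the underlying scheme, not the paper's full three-photon state.)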

Determinism Beyond Time Evolution

Physicists are increasingly taking seriously the possibility of laws outside the traditional time-evolution paradigm; yet our understanding of determinism is still predicated on a forwards time-evolution picture, making it manifestly unsuited to the diverse range of research programmes in modern physics. In this article, we use a constraint-based framework to set out a generalization of determinism which does not presuppose temporal directedness, distinguishing between strong, weak and delocalised holistic determinism. We discuss some interesting consequences of these generalized notions of determinism, and we show that this approach sheds new light on the long-standing debate surrounding the nature of objective chance.