Papers New

How causation is rooted into thermodynamics

The notions of cause and effect are widely employed in science. I discuss why and how they are rooted in thermodynamics. The entropy gradient (i) explains in which sense interventions affect the future rather than the past, and (ii) underpins the time orientation of the subject of knowledge as a physical system. Via these two distinct paths, this gradient, and only this gradient, is the source of the time orientation of causation, namely the fact that a cause comes before its effects.

Markov Chain Monte Carlo methods for graph refinement in Spinfoam Cosmology

We study the behaviour of the Lorentzian Engle-Pereira-Rovelli-Livine spinfoam amplitude with homogeneous boundary data, under a graph refinement going from five to twenty boundary tetrahedra. This can be interpreted as a wave function of the universe, for which we compute boundary geometrical operators, correlation functions and entanglement entropy. The numerical calculation is made possible by adapting the Metropolis-Hastings algorithm, along with recently developed computational methods appropriate for the deep quantum regime. We confirm that the transition amplitudes are stable against such refinement. We find that the average boundary geometry does not change, but the new degrees of freedom correct the quantum fluctuations of the boundary and the correlations between spatial patches. The expectation values are compatible with their geometrical interpretation and the correlations between neighbouring patches decay when computed across different spinfoam vertices.
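The Metropolis-Hastings algorithm mentioned above can be illustrated with a minimal sketch. This is not the authors' spinfoam implementation (their target is the Lorentzian EPRL amplitude over boundary spin configurations); here a standard 1D Gaussian target and a symmetric random-walk proposal stand in as assumed placeholders, purely to show the accept/reject structure of the method:

```python
import math
import random

def metropolis_hastings(log_target, x0, n_steps, step_size=1.0, seed=0):
    """Draw correlated samples from a 1D distribution given only its
    unnormalised log-density, via a symmetric random-walk proposal."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for _ in range(n_steps):
        # Propose a symmetric Gaussian step (so the proposal ratio cancels).
        x_new = x + rng.gauss(0.0, step_size)
        # Accept with probability min(1, p(x_new) / p(x)).
        log_alpha = log_target(x_new) - log_target(x)
        if math.log(rng.random()) < log_alpha:
            x = x_new
        samples.append(x)
    return samples

# Placeholder target: standard normal, log p(x) = -x^2/2 up to a constant.
samples = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_steps=50000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

In practice the step size is tuned so the acceptance rate stays moderate (often quoted around 20-50%); too-small steps explore slowly, too-large steps are rarely accepted. Expectation values of boundary operators are then estimated as averages over the chain, as in the abstract's correlation-function computations.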

Gravitational time dilation as a resource in quantum sensing

Atomic clock interferometers are a valuable tool for testing the interface between quantum theory and gravity, in particular via the measurement of gravitational time dilation in the quantum regime. Here, we investigate whether gravitational time dilation may also be used as a resource in quantum information theory. In particular, we show that for a freely falling interferometer and for a Mach-Zehnder interferometer, gravitational time dilation may enhance the precision in estimating the gravitational acceleration for long interferometric times. To this end, the interferometric measurements should be performed on both the path and the clock degrees of freedom.

Locally mediated entanglement through gravity from first principles

Observing entanglement generation mediated by a local field certifies that the field cannot be classical. This information-theoretic argument is at the heart of the race to observe gravity-mediated entanglement in a 'table-top' experiment. Previous derivations of the effect assume the locality of interactions, yet use an instantaneous interaction to derive the effect. We correct this by giving a first-principles derivation of mediated entanglement using linearised gravity. The framework is Lorentz covariant, and thus local, and yields Lorentz- and gauge-invariant expressions for the relevant quantum observables. For completeness, we also cover the electromagnetic case. An experimental consequence of our analysis is the possibility of observing *retarded* mediated entanglement, which avoids the need to take relativistic locality as an assumption. This is a difficult experiment for gravity, but could be feasible for electromagnetism. Our results confirm that the entanglement is dynamically mediated by the gravitational field.

The nonequilibrium cost of accurate information processing

Accurate information processing is crucial both in technology and in nature. To achieve it, any information processing system needs an initial supply of resources away from thermal equilibrium. Here we establish a fundamental limit on the accuracy achievable with a given amount of nonequilibrium resources. The limit applies to arbitrary information processing tasks and arbitrary information processing systems subject to the laws of quantum mechanics. It is easily computable and is expressed in terms of an entropic quantity, which we name reverse entropy, associated with a time reversal of the information processing task under consideration. The limit is achievable for all deterministic classical computations and for all their quantum extensions. As an application, we establish the optimal tradeoff between nonequilibrium and accuracy for the fundamental tasks of storing, transmitting, cloning, and erasing information. Our results set a target for the design of new devices approaching the ultimate efficiency limit, and provide a framework for demonstrating thermodynamic advantages of quantum devices over their classical counterparts.

Watching the Clocks: Interpreting the Page-Wootters Formalism and the Internal Quantum Reference Frame Programme

We discuss some difficulties that arise in attempting to interpret the Page-Wootters and Internal Quantum Reference Frames formalisms, then use a 'final measurement' approach to demonstrate that there is a workable single-world realist interpretation for these formalisms. We note that it is necessary to adopt some interpretation before we can determine whether the 'reference frames' invoked in these approaches are operationally meaningful, and we argue that without a clear operational interpretation, such reference frames might not be suitable to define an equivalence principle. We argue that the notion of superposition should take into account the way in which an instantaneous state is embedded in ongoing dynamical evolution, and this leads to a more nuanced way of thinking about the relativity of superposition in these approaches. We conclude that typically the operational content of these approaches appears only in the limit as the size of at least one reference system becomes large, and therefore these formalisms have an important role to play in showing how our macroscopic reference frames can emerge out of wholly relational facts.