New Papers

Existence of processes violating causal inequalities on time-delocalised subsystems

It has been shown that quantum and classical processes can exist in which the operations performed by separate parties do not occur in a well-defined causal order. A central question is whether and how such processes can be realised in practice. To provide a rigorous argument that certain such processes have a realisation in standard quantum theory, the concept of time-delocalised quantum subsystems has been introduced. In this paper, we show that realisations on time-delocalised subsystems exist for all unitary extensions of tripartite processes. Remarkably, this class contains processes that violate causal inequalities, i.e., that can generate correlations witnessing the incompatibility with a definite causal order in a device-independent manner. We consider a known striking example of such a tripartite classical process with a unitary extension, and study its realisation on time-delocalised subsystems. We then discuss what a violation of causal inequalities implies in this setting, and argue that it is indeed a meaningful concept for showing the absence of a definite causal order between the variables of interest.

Controlling wave-particle duality with quantum entanglement

Wave-particle duality and entanglement are two fundamental characteristics of quantum mechanics. All previous experimental investigations of the wave-particle properties of single photons (or single particles in general) show that a well-defined interferometer setting determines a well-defined property of single photons. Here we take a conceptual step forward and control the wave-particle property of single photons with quantum entanglement. By doing so, we experimentally test the complementarity principle in a scenario in which the setting of the interferometer is not defined at any instance of the experiment, not even in principle. To achieve this goal, we send the photon of interest (S) into a quantum Mach-Zehnder interferometer (MZI), in which the output beam splitter of the MZI is controlled by the quantum state of a second photon (C), which is entangled with a third photon (A). The individual quantum state of photon C is therefore undefined, which implements undefined settings of the MZI for photon S. This is realized by using three cascaded phase-stable interferometers for three photons. There is generally no well-defined setting of the MZI, and thus the very formulation of the wave-particle properties becomes internally inconsistent.

Two Roads to Retrocausality

In recent years the quantum foundations community has seen increasing interest in the possibility of using retrocausality as a route to rejecting the conclusions of Bell's theorem and restoring locality to quantum physics. On the other hand, it has also been argued that accepting nonlocality leads to a form of retrocausality. In this article we seek to elucidate the relationship between retrocausality and locality. We begin by providing a brief schema of the various ways in which violations of Bell's inequalities might lead us to consider some form of retrocausality. We then consider some possible motivations for using retrocausality to rescue locality, arguing that none of these motivations is adequate and that therefore there is no clear reason why we should prefer local retrocausal models to nonlocal retrocausal models. Next, we examine several different conceptions of retrocausality, concluding that 'all-at-once' retrocausality is more coherent than the alternative dynamical picture. We then argue that since the 'all-at-once' approach requires probabilities to be assigned to entire histories or mosaics, locality is somewhat redundant within this picture. Thus we conclude that using retrocausality as a way to rescue locality may not be the right route to retrocausality. Finally, we demonstrate that accepting the existence of nonlocality and insisting on the nonexistence of preferred reference frames leads naturally to the acceptance of a form of retrocausality, albeit one which is not mediated by physical systems travelling backwards in time. We argue that this is the more natural way to motivate retrocausal models of quantum mechanics.

Tabletop Experiments for Quantum Gravity Are Also Tests of the Interpretation of Quantum Mechanics

Recently there has been a great deal of interest in tabletop experiments intended to exhibit the quantum nature of gravity by demonstrating that it can induce entanglement. We argue that these experiments also provide new information about the interpretation of quantum mechanics: under appropriate assumptions, $\psi$-complete interpretations will generally predict that these experiments will have a positive result, $\psi$-nonphysical interpretations predict that these experiments will not have a positive result, and for $\psi$-supplemented models there may be arguments for either outcome. We suggest that a positive outcome to these experiments would rule out a class of quantum gravity models that we refer to as $\psi$-incomplete quantum gravity (PIQG), i.e., models of the interaction between quantum mechanics and gravity in which gravity is coupled to non-quantum beables rather than quantum beables. We review some existing PIQG models and consider what more needs to be done to make these sorts of approaches more appealing, and finally we discuss a cosmological phenomenon which could be regarded as providing evidence for PIQG models.

Refining embeddings with fill-tuning: data-efficient generalised performance improvements for materials foundation models

Pretrained foundation models learn embeddings that can be used for a wide range of downstream tasks. These embeddings optimise general performance, and if a model is insufficiently accurate at a specific task, it can be fine-tuned to improve performance. For all current methodologies, this operation necessarily degrades performance on all out-of-distribution tasks. In this work we present 'fill-tuning', a novel methodology to generate datasets for continued pretraining of foundation models that are not tailored to a particular downstream task, but instead aim to correct poor regions of the embedding. We present the application of roughness analysis to latent space topologies and illustrate how it can be used to propose the data that will be most valuable for improving the embedding. We apply fill-tuning to a set of state-of-the-art materials foundation models trained on $O(10^9)$ data points and show a model improvement of almost 1% across all downstream tasks with the addition of only 100 data points. This method provides a route to the general improvement of foundation models at the computational cost of fine-tuning.

Geometry from quantum temporal correlations

In this work, we show how Euclidean 3-space uniquely emerges from the structure of quantum temporal correlations associated with sequential measurements of Pauli observables on a single qubit. Quite remarkably, the quantum temporal correlations which give rise to geometry are independent of the initial state of the qubit, which, as we show, enables an observer to extract geometric data from sequential measurements without any knowledge of the initial conditions. These results suggest that space itself may plausibly emerge from quantum temporal correlations, and we formulate a toy model of such a hypothetical phenomenon.
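The state-independence claimed here can be checked directly for the simplest case: under sequential projective (Lüders) measurements of Pauli observables $\sigma_i$ then $\sigma_j$ on a qubit, the two-time correlator equals $\delta_{ij}$ for any initial density matrix. The following is an illustrative numerical sketch of that standard fact, not the authors' construction:

```python
import numpy as np

# Pauli matrices
I2 = np.eye(2, dtype=complex)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]])
sz = np.array([[1, 0], [0, -1]], dtype=complex)
paulis = [sx, sy, sz]

def two_time_correlator(rho, i, j):
    """E_ij = sum_{a,b} a*b*P(a,b) for a projective measurement of
    sigma_i (with Lueders collapse) followed by one of sigma_j."""
    E = 0.0
    for a in (+1, -1):
        Pa = (I2 + a * paulis[i]) / 2        # projector onto outcome a
        post = Pa @ rho @ Pa                  # unnormalised post-measurement state
        for b in (+1, -1):
            Pb = (I2 + b * paulis[j]) / 2
            E += a * b * np.trace(Pb @ post).real
    return E

def random_density_matrix(rng):
    A = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    rho = A @ A.conj().T
    return rho / np.trace(rho)

rng = np.random.default_rng(0)
rho = random_density_matrix(rng)
E = np.array([[two_time_correlator(rho, i, j) for j in range(3)]
              for i in range(3)])
print(np.round(E, 10))  # -> 3x3 identity, regardless of rho
```

Analytically, $E_{ij} = \tfrac{1}{2}\mathrm{Tr}(\{\sigma_i,\sigma_j\}\rho) = \delta_{ij}$, which is why the dependence on the initial state drops out.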

Computing the graph-changing dynamics of loop quantum gravity

In loop quantum gravity (LQG), quantum states of the gravitational field are represented by labelled graphs called spin networks. Their dynamics can be described by a Hamiltonian constraint, which modifies the spin network graphs. Fixed-graph approximations of the dynamics have been extensively studied, but the full graph-changing action has so far remained elusive. The latter, alongside the solutions of the constraint, is arguably the missing ingredient for accessing physically correct quantum-relativistic phenomenology from canonical LQG. Here, we introduce the first numerical tool that implements graph-changing dynamics via the Hamiltonian constraint. We find new solutions to this constraint and show that some quantum-geometrical observables behave differently than in the graph-preserving truncation. This work aims at fostering a new era of numerical simulations in canonical LQG that, crucially, embrace the graph-changing aspects of its dynamics, setting aside debated approximations.

Taming Thiemann’s Hamiltonian constraint in canonical loop quantum gravity: reversibility, eigenstates and graph-change analysis

The Hamiltonian constraint remains an elusive object in loop quantum gravity because its action on spin networks leads to changes in their corresponding graphs. As a result, calculations in loop quantum gravity are often considered impractical, and neither the eigenstates of the Hamiltonian constraint, which form the physical space of states, nor the concrete effect of its graph-changing character on observables are entirely known. Worse still, there is no reference value against which to judge whether the commonly adopted graph-preserving approximations lead to results anywhere close to the non-approximated dynamics. Our work sheds light on many of these issues by devising a new numerical tool that allows us to implement the action of the Hamiltonian constraint without the need for approximations and to calculate expectation values for geometric observables. To achieve this, we fill the theoretical gap left in the derivations of the action of the Hamiltonian constraint on spin networks: we provide the first complete derivation of this action for 4-valent spin networks, while updating the corresponding derivation for 3-valent spin networks. Our derivations also include the action of the volume operator. By proposing a new approach to encode spin networks into functions of lists and the derived formulas into functionals, we implement both the Hamiltonian constraint and the volume operator numerically. We are able to transform spin networks with graph-changing dynamics perturbatively and verify that volume expectation values behave rather differently from the approximated, graph-preserving results. Furthermore, using our tool we find a family of potentially relevant solutions of the Hamiltonian constraint. Our work paves the way to a new generation of calculations in loop quantum gravity, in which graph-changing results and their phenomenology can finally be accounted for and understood.

Average mutual information for random fermionic Gaussian quantum states

Studying the typical entanglement entropy of a bipartite system when averaging over different ensembles of pure quantum states has been instrumental in different areas of physics, ranging from many-body quantum chaos to black hole evaporation. We extend such analysis to open quantum systems and mixed states, computing the typical mutual information of a bipartite system averaged over the ensemble of mixed Gaussian states with a fixed spectrum. Tools from random matrix theory and determinantal point processes allow us to compute arbitrary $k$-point correlation functions of the singular values of the corresponding complex structure in a subsystem for a given spectrum in the full system. In particular, we evaluate the average von Neumann entropy in a subsystem based on the level density, as well as the average mutual information. These results are given for finite system size as well as in the thermodynamic limit.
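The pure-state averages that this work generalises have a well-known closed form: for Haar-random pure states on a bipartition of dimensions $d_A \le d_B$, Page's formula gives the average subsystem entropy $\langle S_A\rangle = \sum_{k=d_B+1}^{d_A d_B} 1/k - (d_A-1)/(2d_B)$. The sketch below checks that baseline by Monte Carlo; it is an illustration of the simpler pure-state case, not the fermionic Gaussian mixed-state computation of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
dA, dB = 2, 8  # bipartition dimensions, dA <= dB

def haar_state(d, rng):
    """Haar-random pure state: normalised complex Gaussian vector."""
    v = rng.normal(size=d) + 1j * rng.normal(size=d)
    return v / np.linalg.norm(v)

def entanglement_entropy(psi, dA, dB):
    """von Neumann entropy (in nats) of subsystem A for a pure state."""
    s = np.linalg.svd(psi.reshape(dA, dB), compute_uv=False)
    p = s**2                     # Schmidt coefficients = eigenvalues of rho_A
    p = p[p > 1e-15]
    return -np.sum(p * np.log(p))

samples = [entanglement_entropy(haar_state(dA * dB, rng), dA, dB)
           for _ in range(4000)]
mc_avg = np.mean(samples)

# Page's exact average for Haar-random pure states
page = sum(1.0 / k for k in range(dB + 1, dA * dB + 1)) - (dA - 1) / (2 * dB)
print(mc_avg, page)  # Monte Carlo estimate vs exact Page value
```

The mixed-state ensembles of the paper replace this Haar average with an average over Gaussian states of fixed spectrum, where the singular values of the complex structure play the role of the Schmidt coefficients here.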

Dissipation-induced Quantum Homogenization for Temporal Information Processing

Quantum reservoirs have great potential, as they utilize the complex real-time dissipative dynamics of quantum systems for information processing and target time-series generation without precise control or fine-tuning of the Hamiltonian parameters. Nonetheless, their realization is challenging, as it requires quantum hardware with appropriate dynamics, robustness to noise, and the ability to produce target steady states. To that end, we propose the disordered quantum homogenizer as an alternative platform, and prove that it satisfies the necessary and sufficient conditions, stability and contractivity, of the reservoir dynamics needed for solving machine learning tasks with time-series input data streams. The results indicate that the quantum homogenization protocol, physically implementable as either a nuclear magnetic resonance ensemble or a photonic system, can potentially function as a reservoir computer.
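The contractivity at the heart of this proposal can be seen in the standard partial-swap collision model of quantum homogenization: a system qubit repeatedly interacts with fresh reservoir qubits in a fixed state $\xi$ via $U = \cos\eta\, I + i\sin\eta\, \mathrm{SWAP}$, and converges to $\xi$ regardless of its initial state. The following is an illustrative sketch of that textbook protocol, not the authors' disordered-homogenizer implementation:

```python
import numpy as np

# Partial-swap collision model for quantum homogenization.
SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]], dtype=complex)
eta = 0.3
U = np.cos(eta) * np.eye(4) + 1j * np.sin(eta) * SWAP  # partial-swap unitary

def collide(rho_sys, xi):
    """One collision: joint unitary on system x reservoir qubit,
    then partial trace over the (discarded) reservoir qubit."""
    joint = np.kron(rho_sys, xi)
    joint = U @ joint @ U.conj().T
    # reshape to indices (s, r, s', r') and trace out r = r'
    return joint.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)

xi = np.array([[0.8, 0.1], [0.1, 0.2]], dtype=complex)   # reservoir qubit state
rho = np.array([[0.0, 0.0], [0.0, 1.0]], dtype=complex)  # arbitrary initial state

for _ in range(200):
    rho = collide(rho, xi)  # each collision contracts rho towards xi

print(np.round(rho, 4))  # converges to xi: homogenization
```

The deviation from $\xi$ shrinks by a factor strictly below one at every collision, which is the contractive behaviour that makes the dynamics forget its inputs at a controllable rate, a prerequisite for reservoir computing with fading memory.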