QISS

Black Hole Entropy and Planckian Discreteness

A brief overview of the discovery that macroscopic black holes are thermodynamic systems is presented. They satisfy the laws of thermodynamics and are associated with a temperature and an entropy equal to one quarter of their horizon area in Planck units. They emit black-body radiation and slowly evaporate as a consequence of Heisenberg's uncertainty principle. The problems of understanding the microscopic source of their large entropy and the nature of their final fate after evaporation are discussed from the perspective of approaches to quantum gravity that predict discreteness at the Planck scale. We review encouraging first steps in computing black hole entropy and briefly discuss their implications for the black hole information puzzle.
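The relations quoted above (entropy equal to one quarter of the horizon area in Planck units, plus the associated Hawking temperature) can be evaluated numerically. A minimal sketch for a solar-mass black hole, using the standard formulas $S/k_B = A/4\ell_P^2$ and $T_H = \hbar c^3/(8\pi G M k_B)$ (the constants and the solar-mass example are illustrative, not taken from the paper):

```python
import math

# Physical constants (SI units, approximate CODATA values)
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
hbar = 1.055e-34   # reduced Planck constant, J s
k_B = 1.381e-23    # Boltzmann constant, J/K
M_sun = 1.989e30   # solar mass, kg

def schwarzschild_radius(M):
    """Horizon radius r_s = 2GM/c^2 of a mass-M black hole."""
    return 2 * G * M / c**2

def hawking_temperature(M):
    """Hawking temperature T_H = hbar c^3 / (8 pi G M k_B), in kelvin."""
    return hbar * c**3 / (8 * math.pi * G * M * k_B)

def bekenstein_hawking_entropy(M):
    """Entropy in units of k_B: S/k_B = A / (4 l_P^2), with A = 4 pi r_s^2."""
    A = 4 * math.pi * schwarzschild_radius(M)**2
    l_P_sq = hbar * G / c**3   # Planck length squared
    return A / (4 * l_P_sq)

print(f"T_H(M_sun)   = {hawking_temperature(M_sun):.2e} K")
print(f"S/k_B(M_sun) = {bekenstein_hawking_entropy(M_sun):.2e}")
```

For a solar mass this gives a temperature of order 10^-8 K and an entropy of order 10^77 in units of k_B, illustrating just how large the entropy to be explained microscopically is.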

Quantum teleportation of a genuine vacuum-one-photon qubit generated via a quantum dot source

Quantum state teleportation is a pillar of quantum information and a milestone on the roadmap towards quantum networks with a large number of nodes. Successful photonic demonstrations of this protocol have been carried out employing different qubit encodings. However, demonstrations in the Fock-basis encoding are challenging, owing to the impossibility of creating a coherent superposition of vacuum and one-photon states on a single mode with linear optics. Previous realizations using such an encoding relied heavily on ancillary modes of the electromagnetic field, which only allowed the teleportation of subsystems of entangled states. Here, we enable quantum teleportation of genuine vacuum-one-photon states without ancillary modes, by exploiting coherent control of a resonantly excited semiconductor quantum dot in a micro-cavity. Within our setup, we can teleport vacuum-one-photon qubits and perform entanglement swapping in such an encoding. Our results may open new possibilities for quantum dot single-photon sources in quantum information applications.

Flexible Error Mitigation of Quantum Processes with Data Augmentation Empowered Neural Model

Neural networks have shown their effectiveness in various tasks in the realm of quantum computing. However, their application to quantum error mitigation, a crucial step towards practical quantum advantage, has been restricted by the reliance on noise-free statistics. To tackle this critical challenge, we propose a data augmentation empowered neural model for error mitigation (DAEM). Our model does not require any prior knowledge about the specific noise type or measurement settings, and can estimate noise-free statistics solely from the noisy measurement results of the target quantum process, rendering it highly suitable for practical implementation. In numerical experiments, we show the model's superior performance in mitigating various types of noise, including Markovian and non-Markovian noise, compared with previous error mitigation methods. We further demonstrate its versatility by employing the model to mitigate errors in diverse types of quantum processes, including those involving large-scale quantum systems and continuous-variable quantum states. This powerful data-augmentation-empowered neural model for error mitigation establishes a solid foundation for realizing more reliable and robust quantum technologies in practical applications.
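As a point of contrast with conventional techniques, the standard zero-noise-extrapolation baseline estimates noise-free statistics by measuring at artificially amplified noise levels and extrapolating back to zero noise, which does require a model of how the noise scales. A minimal sketch on synthetic data (the exponential-decay noise model and all numbers are illustrative assumptions; this is the conventional baseline, not the DAEM model):

```python
import numpy as np

# Synthetic noisy expectation values of an observable O, measured at
# artificially scaled noise levels lam (assumed noise model for
# illustration: exponential decay, <O>(lam) = O_ideal * exp(-gamma*lam)).
O_ideal, gamma = 0.75, 0.4
lams = np.array([1.0, 1.5, 2.0, 2.5])   # noise scale factors
noisy = O_ideal * np.exp(-gamma * lams)

# Fit a line in log-space and extrapolate back to lam = 0
# (valid because the assumed decay is exponential).
coeffs = np.polyfit(lams, np.log(noisy), 1)
estimate = np.exp(np.polyval(coeffs, 0.0))
print(estimate)  # recovers the noise-free value 0.75
```

The weakness the abstract points to is visible here: the extrapolation only works because the noise-scaling law was assumed known, whereas the proposed model aims to avoid such prior knowledge.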

Quantum Nonlocality: Multi-copy Resource Inter-convertibility & Their Asymptotic Inequivalence

Quantum nonlocality, pioneered in Bell's seminal work and subsequently verified through a series of experiments, has drawn substantial attention due to its practical applications in various protocols. Evaluating and comparing the extent of nonlocality within distinct quantum correlations holds significant practical relevance. Within the resource-theoretic framework, this can be achieved by assessing the inter-conversion rate among different nonlocal correlations under free local operations and shared randomness. In this study, however, we present instances of quantum nonlocal correlations that are incomparable in the strongest sense. Specifically, when starting with arbitrarily many copies of one nonlocal correlation, it becomes impossible to obtain even a single copy of the other correlation, and this incomparability holds in both directions. Remarkably, these incomparable quantum correlations can be obtained even in the simplest Bell scenario, which involves two parties, each having two dichotomic measurement setups. Notably, there exists an uncountable number of such incomparable correlations. Our result challenges the notion of a 'unique gold coin', often referred to as the 'maximally resourceful state', within the framework of the resource theory of quantum nonlocality, which has nontrivial implications in the study of nonlocality distillation.
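In the simplest Bell scenario mentioned above (two parties, two dichotomic measurements each), the strength of a correlation is commonly probed through its CHSH value. A minimal sketch computing the Tsirelson-optimal value for a maximally entangled state (standard textbook settings; this illustrates the scenario, not the incomparable correlations constructed in the paper):

```python
import numpy as np

# Pauli observables (eigenvalues +/-1, i.e. dichotomic measurements)
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
X = np.array([[0.0, 1.0], [1.0, 0.0]])

# Maximally entangled state |Phi+> = (|00> + |11>) / sqrt(2)
phi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

def correlator(A, B, psi):
    """Expectation value <psi| A (x) B |psi>."""
    return psi @ np.kron(A, B) @ psi

# Settings achieving the Tsirelson bound 2*sqrt(2)
A0, A1 = Z, X
B0 = (Z + X) / np.sqrt(2)
B1 = (Z - X) / np.sqrt(2)

S = (correlator(A0, B0, phi) + correlator(A0, B1, phi)
     + correlator(A1, B0, phi) - correlator(A1, B1, phi))
print(S)  # 2*sqrt(2) ~ 2.828, above the local bound of 2
```

Any value above 2 certifies nonlocality; comparing two correlations, as the paper does, requires far more than this single number, which is precisely why a 'unique gold coin' need not exist.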

Algebras and Hilbert spaces from gravitational path integrals: Understanding Ryu-Takayanagi/HRT as entropy without invoking holography

Recent works by Chandrasekaran, Penington, and Witten have shown in various special contexts that the quantum-corrected Ryu-Takayanagi (RT) entropy (or its covariant Hubeny-Rangamani-Takayanagi (HRT) generalization) can be understood as computing an entropy on an algebra of bulk observables. These arguments do not rely on the existence of a holographic dual field theory. We show that analogous but stronger results hold in any UV-completion of asymptotically anti-de Sitter quantum gravity with a Euclidean path integral satisfying a simple and familiar set of axioms. We consider a quantum context in which a standard Lorentz-signature classical bulk limit would have Cauchy slices with asymptotic boundaries $B_L \sqcup B_R$ where both $B_L$ and $B_R$ are compact manifolds without boundary. Our main result is then that (the UV-completion of) the quantum gravity path integral defines type I von Neumann algebras ${\cal A}^{B_L}_L$, ${\cal A}^{B_R}_{R}$ of observables acting respectively at $B_L$, $B_R$ such that ${\cal A}^{B_L}_L$, ${\cal A}^{B_R}_{R}$ are commutants. The path integral also defines entropies on ${\cal A}^{B_L}_L$, ${\cal A}^{B_R}_R$. Positivity of the Hilbert space inner product then turns out to require the entropy of any projection operator to be quantized in the form $\ln N$ for some $N \in \mathbb{Z}^+$ (unless it is infinite). As a result, our entropies can be written in terms of standard density matrices and standard Hilbert space traces. Furthermore, in appropriate semiclassical limits our entropies are computed by the RT formula with quantum corrections. Our work thus provides a Hilbert space interpretation of the RT entropy. Since our axioms do not severely constrain UV bulk structures, they may be expected to hold equally well for successful formulations of string field theory, spin-foam models, or any other approach to constructing a UV-complete theory of gravity.
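The quantization of projection entropies in the form $\ln N$ matches the familiar finite-dimensional pattern: a rank-$N$ projection, normalized into a density matrix, is maximally mixed on an $N$-dimensional subspace. A minimal sketch of this standard computation (the elementary Hilbert-space fact the result reduces to, not the paper's axiomatic argument):

```latex
\rho_P \;=\; \frac{P}{\operatorname{Tr} P} \;=\; \frac{P}{N},
\qquad
S(\rho_P) \;=\; -\operatorname{Tr}\,\rho_P \ln \rho_P
\;=\; -\,N \cdot \frac{1}{N}\,\ln\frac{1}{N}
\;=\; \ln N ,
```

since $\rho_P$ has exactly $N$ nonzero eigenvalues, each equal to $1/N$. The nontrivial content of the paper is that the path-integral axioms force this same quantization, which is what allows the abstract entropies to be rewritten as standard Hilbert-space traces.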

Observable Thermalization: Theory, Numerical and Analytical Evidence

Predicting whether an observable will dynamically evolve to thermal equilibrium in an isolated quantum system is an important open problem, as it determines the applicability of thermodynamics and statistical mechanics. The Observable Thermalization framework has been proposed as a solution, characterizing observables that thermalize using an observable-specific maximum entropy principle. In this paper, we achieve three results. First, we confirm the dynamical relaxation of local observables towards maximum entropy in a 1D Ising chain. Second, we provide the most general solution to the maximization problem and numerically verify some general predictions about equilibrium behavior in the same model. Third, we explore the emergence and physical meaning of an observable-specific notion of energy. Our results mark significant progress towards a fully predictive theory of thermalization in isolated quantum systems and open interesting questions about observable-specific thermodynamic quantities.

Comparing coherent and incoherent models for quantum homogenization

Here we investigate the role of quantum interference in the quantum homogenizer, whose convergence properties model a thermalization process. In the original quantum homogenizer protocol, a system qubit converges to the state of identical reservoir qubits through partial-swap interactions that allow interference between reservoir qubits. We design an alternative, incoherent quantum homogenizer, where each system-reservoir interaction is moderated by a control qubit using a controlled-swap interaction. We show that our incoherent homogenizer satisfies the essential conditions for homogenization: it can transform a qubit from any state to any other state to arbitrary accuracy, with negligible impact on the reservoir qubits' states. Our results show that the convergence properties of homogenization machines that are important for modelling thermalization do not depend on coherence between qubits in the homogenization protocol. We then derive bounds on the resources required to re-use the homogenizers for performing state transformations. This demonstrates that both homogenizers are universal for any number of homogenizations, at an increased resource cost.
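The partial-swap interaction at the heart of the original protocol is simple enough to simulate directly with density matrices. A minimal sketch, assuming the standard form U = cos(eta) 1 + i sin(eta) SWAP (the coupling strength 0.3 and the number of reservoir interactions are illustrative choices, not values from the paper):

```python
import numpy as np

# SWAP gate on two qubits
SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]], dtype=complex)
I4 = np.eye(4, dtype=complex)

def partial_swap(eta):
    """U = cos(eta) I + i sin(eta) SWAP; unitary since SWAP^2 = I."""
    return np.cos(eta) * I4 + 1j * np.sin(eta) * SWAP

def trace_out_reservoir(rho4):
    """Partial trace over the second (reservoir) qubit of a 4x4 state."""
    r = rho4.reshape(2, 2, 2, 2)          # indices [i, k, j, l]
    return np.einsum('ikjk->ij', r)       # sum over the reservoir index

# System starts in |0><0|; each fresh reservoir qubit is in |1><1|
rho_S = np.diag([1.0, 0.0]).astype(complex)
xi = np.diag([0.0, 1.0]).astype(complex)

U = partial_swap(0.3)
for _ in range(50):                       # one fresh reservoir qubit per step
    joint = U @ np.kron(rho_S, xi) @ U.conj().T
    rho_S = trace_out_reservoir(joint)

print(rho_S.real)  # close to diag(0, 1): the system has homogenized to xi
```

Each interaction mixes a fraction sin^2(eta) of the reservoir state into the system, so the system converges exponentially to the reservoir state, which is the convergence property the paper's incoherent variant must reproduce.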

Gravity Mediated Entanglement between Oscillators as Quantum Superposition of Geometries

Protocols for observing gravity-induced entanglement typically comprise the interaction of two particles prepared either in a superposition of two discrete paths, or in a continuously delocalized (harmonic oscillator) state of motion. An important open question has been whether these two approaches allow one to draw the same conclusions about the quantum nature of gravity. To answer this question, we analyse, using the path-integral approach, a setup that contains both features: a superposition of two highly delocalized centre-of-mass states. We conclude that the two usual protocols are of similar epistemological relevance. In both cases the appearance of entanglement, within linearised quantum gravity, is due to gravity being in a highly non-classical state: a superposition of distinct geometries.

Taxonomy for Physics Beyond Quantum Mechanics

We propose terminology to classify interpretations of quantum mechanics and models that modify or complete quantum mechanics. Our focus is on models which have previously been referred to as superdeterministic (strong or weak), retrocausal (with or without signalling, dynamical or non-dynamical), future-input-dependent, atemporal and all-at-once, not always with the same meaning or context. Sometimes these models are assumed to be deterministic, sometimes not; the word 'deterministic' has been given different meanings, and different notions of causality have been used when classifying them. This has created much confusion in the literature, and we hope that the terms proposed here will help to clarify the nomenclature. The general model framework that we propose may also be useful for classifying other interpretations and modifications of quantum mechanics. This document grew out of the discussions at the 2022 Bonn Workshop on Superdeterminism and Retrocausality.

Macroscopic quantum entanglement between an optomechanical cavity and a continuous field in presence of non-Markovian noise

Probing quantum entanglement with macroscopic objects allows one to test quantum mechanics in new regimes. One way to realize such behavior is to couple a macroscopic mechanical oscillator to a continuous light field via radiation pressure. To this end, we discuss a system comprising an optomechanical cavity driven by a coherent optical field in the unresolved-sideband regime, where we assume Gaussian states and dynamics. We develop a framework to quantify the amount of entanglement in the system numerically. In contrast to previous work, we treat non-Markovian noise and take into account both the continuous optical field and the cavity mode. We apply our framework to the case of the Advanced Laser Interferometer Gravitational-Wave Observatory (Advanced LIGO) and discuss the parameter regimes where entanglement exists, even in the presence of quantum and classical noise.
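For Gaussian states, as assumed in this setup, entanglement between two modes is commonly quantified by the logarithmic negativity, computed from the smallest symplectic eigenvalue of the partially transposed covariance matrix. A minimal sketch for the textbook two-mode squeezed vacuum (a stand-in for the optomechanical state; the formula is standard, the squeezing value is illustrative, and conventions set the vacuum covariance matrix to the identity):

```python
import numpy as np

def tmsv_covariance(r):
    """Covariance matrix of a two-mode squeezed vacuum with squeezing r."""
    ch, sh = np.cosh(2 * r), np.sinh(2 * r)
    A = ch * np.eye(2)                 # mode-1 block
    B = ch * np.eye(2)                 # mode-2 block
    C = sh * np.diag([1.0, -1.0])      # cross-correlation block
    return np.block([[A, C], [C.T, B]])

def log_negativity(sigma):
    """Logarithmic negativity of a two-mode Gaussian state from its 4x4 CM.

    Uses the smallest symplectic eigenvalue nu_- of the partially
    transposed state: E_N = max(0, -ln nu_-).
    """
    A, B, C = sigma[:2, :2], sigma[2:, 2:], sigma[:2, 2:]
    delta = np.linalg.det(A) + np.linalg.det(B) - 2 * np.linalg.det(C)
    nu_minus = np.sqrt((delta - np.sqrt(delta**2 - 4 * np.linalg.det(sigma))) / 2)
    return max(0.0, -np.log(nu_minus))

r = 0.8
print(log_negativity(tmsv_covariance(r)))  # equals 2r for this state
```

For the two-mode squeezed vacuum the result is exactly 2r, growing with the squeezing strength, while a product of vacua gives zero; a numerical framework like the one in the paper evaluates the same quantity on covariance matrices obtained from the noisy optomechanical dynamics.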