In the histories formulation of quantum theory, sets of coarse-grained histories that are called consistent obey classical probability rules. It has been argued that these sets can describe the semi-classical behaviour of closed quantum systems. Most physical scenarios admit multiple different consistent sets, and one can view each consistent set as a separate context. Using propositions from different consistent sets to make inferences leads to paradoxes, such as the contrary inferences first noted by Kent [Physical Review Letters, 78(15):2874, 1997]. Proponents of the consistent histories formulation argue that one should not mix propositions coming from different consistent sets in making logical arguments, and that paradoxes such as the aforementioned contrary inferences are nothing other than the usual microscopic paradoxes of quantum contextuality, as first demonstrated by the Kochen–Specker theorem. In this contribution we use the consistent histories formalism to describe a macroscopic (semi-classical) system and show that paradoxes involving contextuality (mixing different consistent sets) persist even in the semi-classical limit. This is distinctly different from the contextuality of standard quantum theory, where the contextuality paradoxes do not persist in the semi-classical limit. Specifically, we consider different consistent sets for the arrival time of a semi-classical wave packet in an infinite square well. Surprisingly, we obtain consistent sets that disagree on whether a semi-classical system that started within a subregion ever left that subregion. Our results point to the need for constraints, additional to the consistency condition, to recover the correct semi-classical limit in this formalism, and lead to the motto `all consistent sets are equal', but `some consistent sets are more equal than others'.
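The consistency condition mentioned above is, in its standard (Gell-Mann–Hartle) form, the vanishing of the off-diagonal decoherence functional; a sketch in notation of our own choosing, not taken from this abstract:

```latex
% Class operators for a history \alpha built from chains of projectors,
% and the decoherence functional on an initial state \rho:
C_{\alpha} = P^{(n)}_{\alpha_n}(t_n) \cdots P^{(1)}_{\alpha_1}(t_1),
\qquad
D(\alpha, \alpha') = \mathrm{Tr}\!\left[ C_{\alpha}\, \rho\, C_{\alpha'}^{\dagger} \right].
% A set of histories is consistent when D(\alpha,\alpha') \approx 0
% for all \alpha \neq \alpha'; probabilities are then p(\alpha) = D(\alpha,\alpha),
% and these obey the classical sum rules.
```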
Erasure is fundamental for information processing. It is also key in connecting information theory and thermodynamics, as it is a logically irreversible task. We provide a new angle on this connection, noting that there may be an additional cost to erasure that is not captured by standard results such as Landauer's principle. To make this point we use a model of irreversibility based on Constructor Theory, a recently proposed generalization of the quantum theory of computation. The model uses a machine called the “quantum homogenizer”, which has the ability to approximately realise the transformation of a qubit from any state to any other state, while itself remaining approximately unchanged, through overall entirely unitary interactions. We argue that when performing erasure via quantum homogenization there is an additional cost to performing the erasure step of Szilard's engine, because it is more difficult to reliably produce pure states in a cycle than to produce mixed states. We also discuss the implications of this result for the cost of erasure in more general terms.
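For reference, the standard result that the abstract contrasts with is Landauer's bound on the work that must be dissipated when erasing information in an environment at temperature $T$:

```latex
% Landauer's principle: erasing one bit of information costs at least
W_{\mathrm{erase}} \;\geq\; k_{B} T \ln 2 \quad \text{per bit,}
% with equality only in the quasi-static (reversible) limit.
```

The claim of the abstract is that homogenization-based erasure can incur a cost beyond this bound, tied to the difficulty of reliably producing pure states.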
This work provides a relativistic, digital quantum simulation scheme for both $2+1$ and $3+1$ dimensional quantum electrodynamics (QED), based on a discrete spacetime formulation of the theory. It takes the form of a quantum circuit, infinitely repeating across space and time, parametrised by the discretization step $\Delta_t=\Delta_x$. Strict causality at each step is ensured, as circuit wires coincide with the lightlike worldlines of QED; simulation time under decoherence is optimized. The construction replays the logic that leads to the QED Lagrangian. Namely, it starts from the Dirac quantum walk, well known to converge towards free relativistic fermions. It then extends the quantum walk into a multi-particle-sector quantum cellular automaton in a way which respects the fermionic anti-commutation relations and the discrete gauge invariance symmetry. Both requirements can only be achieved at the cost of introducing the gauge field. Lastly, the gauge field is given its own electromagnetic dynamics, which can be formulated as a quantum walk at each plaquette.
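To illustrate the starting point of the construction, here is a minimal sketch of a single-particle Dirac quantum walk in $1+1$ dimensions (the paper itself works in $2+1$ and $3+1$ dimensions); the lattice size, mass, and step values are illustrative choices of ours, not taken from the abstract. Each chiral component is shifted along a lightlike worldline, then a mass-dependent "coin" mixes the two components at every site:

```python
import numpy as np

# Hypothetical parameters (not from the paper): step eps plays the role
# of Delta_t = Delta_x, m is the fermion mass, N the number of sites.
eps = 0.1
m = 1.0
N = 64

# Coin operator exp(-i * m * eps * sigma_x), mixing the two spinor components.
C = np.array([[np.cos(m * eps), -1j * np.sin(m * eps)],
              [-1j * np.sin(m * eps), np.cos(m * eps)]])

def step(psi):
    """One walk step: shift each chiral component one site along its
    lightlike worldline (periodic boundaries), then apply the mass coin."""
    shifted = np.empty_like(psi)
    shifted[:, 0] = np.roll(psi[:, 0], 1)    # right-mover
    shifted[:, 1] = np.roll(psi[:, 1], -1)   # left-mover
    return shifted @ C.T                     # coin acts site-wise on spinors

# Initial state: a spinor localized at the centre, normalized.
psi = np.zeros((N, 2), dtype=complex)
psi[N // 2, 0] = 1.0

for _ in range(100):
    psi = step(psi)

# Shifts are permutations and C is unitary, so the norm is preserved.
print(np.linalg.norm(psi))
```

In the continuum limit $\epsilon \to 0$ this kind of walk is known to converge to the free Dirac equation, which is the sense in which the circuit "starts from free relativistic fermions".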