THE SIMULATION HYPOTHESIS
In 2003, Nick Bostrom used a probabilistic argument to introduce what became known as the simulation hypothesis: either almost all civilizations go extinct before reaching technological maturity, technologically mature civilizations almost never run ancestor simulations, or we are almost certainly living in a simulation.(1)
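The force of the argument is easiest to see with a little arithmetic. Here is a minimal Python sketch (the function and every parameter value are my own illustrative assumptions, not figures from Bostrom's paper): unless one of the first two fractions is driven toward zero, simulated observers swamp unsimulated ones.

```python
# Toy bookkeeping behind Bostrom's trilemma. All numbers below are
# invented for illustration; they are not estimates from the paper.

def simulated_fraction(f_mature, f_sim, sims_per_civ):
    """Fraction of all human-like observers who are simulated.

    f_mature:     fraction of civilizations reaching technological maturity
    f_sim:        fraction of mature civilizations that run ancestor simulations
    sims_per_civ: simulated populations per simulating civilization, each
                  assumed comparable in size to the real ancestral population
    """
    simulated = f_mature * f_sim * sims_per_civ
    real = 1.0  # the one unsimulated ancestral population, as the unit
    return simulated / (simulated + real)

# Even deliberately pessimistic fractions still yield ~0.99:
print(simulated_fraction(f_mature=0.01, f_sim=0.01, sims_per_civ=1_000_000))
# Unless one of the first two factors is nearly zero, the third horn
# of the trilemma follows.
```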
FINE-TUNING AS SIMULATION EVIDENCE
Since then, public discourse has suggested that the universe's fine-tuning (the precise constants and conditions enabling life) supports the idea that we are in a simulation. If there is only a "one in billions" chance of such fine-tuning occurring at random, then fine-tuning may hint at the existence of a fine-tuner, such as a programmer/simulator.
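As a rough illustration of that inference (with an invented prior, and the "one in billions" figure taken at face value purely for demonstration), a one-line Bayesian update shows why an observation that is wildly improbable under chance can shift credence sharply toward a fine-tuner:

```python
# Toy Bayesian update for the fine-tuning intuition. The prior and the
# likelihood under chance are illustrative assumptions, not measurements.

def posterior_fine_tuner(prior, p_obs_given_chance, p_obs_given_tuner=1.0):
    """Posterior probability of a fine-tuner given observed fine-tuning."""
    p_obs = prior * p_obs_given_tuner + (1 - prior) * p_obs_given_chance
    return prior * p_obs_given_tuner / p_obs

# Even from a skeptical one-in-a-million prior, a one-in-a-billion
# chance likelihood drives the posterior near certainty:
print(posterior_fine_tuner(prior=1e-6, p_obs_given_chance=1e-9))  # ~0.999
# The multiverse response below attacks exactly this step, by inflating
# the effective probability of fine-tuning being observed somewhere.
```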
DISMISSAL OF FINE-TUNING AS SIMULATION EVIDENCE
Most philosophers and cosmologists appeal to multiverse hypotheses to argue against the idea that fine-tuning implies a fine-tuner: if billions of universes exist or have existed, a life-conducive outlier like ours is unsurprising, since observers can only find themselves in such a universe.(2) Some, notably Ian Hacking and Roger White, counter that this multiverse reasoning commits logical fallacies such as the Inverse Gambler's Fallacy.(3) Yet most academics continue to dismiss fine-tuning by appeal to multiverse hypotheses. So is there any other physical evidence even mildly supportive of the hypothesis that we may be in a simulation?
QUANTUM ENTANGLEMENT AS SIMULATION EVIDENCE
During the double-slit experiment, photons (or electrons) exhibit wave-like interference when unobserved, but act as particles when their path is measured. More specifically, a particle begins in a superposition of two Gaussian wavepackets exiting two slits. Without measurement, these wavepackets spread and overlap, producing an interference pattern on a detection screen. However, introducing a measurement apparatus to determine which slit the particle passes through disrupts this pattern.
In the article "Measurement-induced Decoherence and Information in Double-Slit Interference," the authors explore how measurement precision affects this outcome, using a tunable apparatus that interacts with the particle.(4) When the apparatus perfectly distinguishes the particle's path, it entangles with the system, decohering the superposition into a state in which interference disappears. As precision decreases, partial interference reemerges, reflecting less information transfer to the apparatus. When precision reaches zero (no measurement), no path information is gained, and interference persists.
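A toy numerical model makes the trade-off concrete. The sketch below is my own far-field illustration with made-up parameters, not the authors' calculation; it uses the standard decoherence bookkeeping, in which the interference term is weighted by the overlap gamma of the two apparatus pointer states, so fringe visibility falls smoothly from 1 (no measurement) to 0 (perfect which-path measurement).

```python
import numpy as np

# Far-field double-slit intensity with a which-path apparatus of tunable
# precision. Parameters (wavelength, geometry) are illustrative only.
x = np.linspace(-10, 10, 2000)    # position on the detection screen (mm)
wavelength = 0.5e-3               # 500 nm, in mm
k = 2 * np.pi / wavelength
d = 0.1                           # slit separation (mm)
L = 1000.0                        # slit-to-screen distance (mm)

envelope = np.exp(-x**2 / (2 * 5.0**2))  # broad single-slit envelope

def slit_amplitude(s):
    """Fraunhofer amplitude on the screen from a slit at position s."""
    return envelope * np.exp(1j * k * s * x / L)

psi1, psi2 = slit_amplitude(-d / 2), slit_amplitude(+d / 2)

def screen_intensity(gamma):
    """Detection pattern when the apparatus holds which-path information.

    gamma = |<A1|A2>|, the overlap of the two apparatus pointer states:
    gamma = 1.0 means no path record (full interference),
    gamma = 0.0 means a perfect path record (interference erased).
    """
    return (np.abs(psi1)**2 + np.abs(psi2)**2
            + 2 * gamma * np.real(np.conj(psi1) * psi2))

for gamma in (1.0, 0.5, 0.0):
    I = screen_intensity(gamma) / (2 * envelope**2)  # divide out envelope
    v = (I.max() - I.min()) / (I.max() + I.min())    # fringe visibility
    print(f"gamma = {gamma:.1f} -> visibility = {v:.2f}")
# Visibility equals gamma: interference fades exactly as fast as the
# apparatus acquires path information.
```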
The authors argue that measurement-induced decoherence drives the quantum-to-classical transition: when the apparatus or environment gains "which-path" information, it entangles with the particle, collapsing the superposition and erasing interference. They further suggest this reflects a broader principle: information acquisition by an observer or apparatus alters the quantum state.
This aligns with ideas like quantum Darwinism, in which classical reality emerges from the redundant storage of information in the environment; interference remains observable only while path information is inaccessible.
Now that we have reviewed the quantum entanglement and quantum Darwinism interpretation of the double-slit experiment, does it support or detract from the hypothesis that we may be in a simulation?
ASSESSING QUANTUM ENTANGLEMENT AS SIMULATION EVIDENCE
We were able to question fine-tuning as evidence for the simulation hypothesis by invoking multiverse hypotheses, because of observer selection. As for dismissing that multiverse dismissal, a reasonable case can be made via the Inverse Gambler's Fallacy and related objections, but there is no consensus. With the quantum entanglement interpretation of the double-slit experiment, however, the question is different: could we still live in a universe without this behavior? It seems reasonable that we could, so we cannot dismiss it as a selection effect in a multiverse.
Along similar lines, there would be no clear Darwinian advantage to an entire universe keeping track of what we know or measure, simply to be able to collapse the superposition of a photon's or electron's path on the exceedingly rare occasions when intelligent life actually measures that path. In fact, watching continuously for such measurements would seem to be extremely labor-intensive from an information-processing perspective, rather akin to having one's phone always listening for "Hey Siri."

On the other hand, a simulated world made to resemble a universe would already need to keep track of everything, including what we observe, simply to maintain the simulation. "Throwing in" a "bonus" collapse of the superposition would require negligible additional processing, and might even be seen as something of a winking gesture. Once the superposition is collapsed, one could argue the program saves something in terms of "rendering," and although such savings would be negligible, they would not add to processing expenses either.
A further sign of quantum entanglement support for the simulation idea is that a programmer/simulator would know what it means to know (to observe or measure), whereas a universe simply evolving at random to track this, with no clear advantage for doing so, seems unlikely. Humans evolved to know what it means to know through natural selection acting on random mutations, but there was a tremendous advantage in doing so; as noted, there is no conceivable advantage for a universe.
Is any of this firm physical evidence that we are in a simulation? No, but it may tip the scale slightly, and that may be worth knowing. And now we have to ask, if there is a simulator, do they know we know?
CITATIONS
1. Nick Bostrom, "Are You Living in a Computer Simulation?," Philosophical Quarterly 53, no. 211 (2003): 243–255.
2. Anthony Aguirre and Max Tegmark, "Multiple Universes, Cosmic Coincidences, and Other Dark Matters," Journal of Cosmology and Astroparticle Physics 2005, no. 04 (April 2005): 001, doi:10.1088/1475-7516/2005/04/001.
3. Roger White, "Fine-Tuning and Multiple Universes," British Journal for the Philosophy of Science 51, no. 2 (June 2000): 246–257.
4. Joshua Kincaid, Kyle McLelland, and Michael Zwolak, "Measurement-induced Decoherence and Information in Double-Slit Interference" (unpublished manuscript, n.d.), Department of Physics, Oregon State University, Corvallis, OR, and Center for Nanoscale Science and Technology, National Institute of Standards and Technology, Gaithersburg, MD.