How can long-term advancements in quantum computing change people’s relationship with time itself (even make it SEEM slower)? Think longtermist
12:12 AM
Synthesized quantum computing pathways to reshape temporal perception and longevity
This is a great question to sit with. Let me trace several convergent pathways, from near-term plausible to deeply speculative.
The Longevity Channel (most concrete)
The most direct route: QC makes time seem slower by giving you radically more of it. Molecular simulation is the killer app here — classically intractable protein folding landscapes, metabolic network dynamics, and drug-target interactions all become tractable when you can simulate quantum chemistry natively. The Harrow–Hassidim–Lloyd (HHL) linear-systems algorithm already suggests exponential speedups for the kinds of differential equations governing biological aging cascades.
If you take your dynamical systems view of aging seriously — attractors flattening, loss of regenerative basin depth — then QC-enabled molecular simulation could let us map those attractor landscapes at a resolution classical computation never reaches. You stop treating aging as a collection of pathologies and start engineering the topology of the attractor space itself. Add a century or three to healthspan and your psychological relationship with a decade transforms completely. A decade becomes what a year feels like now. Time dilates through sheer abundance.
The Cognitive Bandwidth Channel
Here’s where your BCI work becomes relevant. Subjective time perception scales roughly with information processing rate — time “slows” when you’re processing more per unit clock-time (why novel experiences feel longer, why childhood feels vast in retrospect).
QC-augmented neural interfaces could dramatically increase the bandwidth of conscious experience per second of wall-clock time. Not just through faster processing, but through qualitatively different processing — quantum-enhanced pattern recognition, parallel hypothesis evaluation, or even novel sensory modalities synthesized in real-time. If your experienced information density per second doubles, a year of subjective experience happens in six months of clock time. You don’t live longer — you live denser.
This connects to Reilly and Lloyd’s work on the physical resource costs of computation. Their paper on physical complexity and black hole quantum computers is essentially asking: what are the thermodynamic limits on how much computation (and therefore, potentially, experience) you can pack into a given spacetime volume? The answer appears to be vastly more than biological neural tissue currently achieves.
The Deep Simulation Channel
QC enables simulation of complex systems across timescales that are otherwise experientially inaccessible to humans. Climate over millennia, stellar evolution, geological processes, civilizational dynamics — these become things you can interact with rather than abstractly model.
This matters for temporal perception because humans are terrible at viscerally grasping deep time. We flatten everything beyond a generation or two into abstraction. If quantum simulation lets you build immersive, high-fidelity models of processes spanning thousands or millions of years — and interact with them, scrub through them, feel the contingency at each branching point — you develop an intuitive relationship with long timescales that’s currently only available through contemplative practice or psychedelic states (both of which you’re familiar with).
Longtermism currently fails partly because deep time doesn’t feel real. QC-powered simulation could make it feel real.
The Temporal Physics Channel (most speculative)
Reilly literally teaches a course on closed timelike curves and their quantum mechanical interpretation. Lloyd has published on quantum mechanics of CTCs — showing that quantum information processing in the presence of CTCs avoids the grandfather paradox through self-consistent solutions (the Deutsch model) or post-selection (Lloyd’s own P-CTC model).
This is currently theoretical, but the deep implication is that quantum information doesn’t respect the classical arrow of time the way we naively assume. Retrocausal interpretations of QM (which QMM-type frameworks also gesture at, however differently) suggest that the temporal asymmetry we experience is emergent rather than fundamental. If QC matures enough to experimentally probe these regimes — say, through quantum simulations of CTCs or verification of retrocausal protocols — our conceptual relationship with time changes even if we can’t build a time machine. The realization that time is more like a navigable dimension than an inexorable current would be civilizationally transformative.
The Synthesis
The longtermist frame that ties these together: QC doesn’t just speed things up — it changes the grain size of temporal experience available to a civilization.
Right now we’re trapped in a narrow temporal bandwidth: we experience seconds-to-decades directly, and everything outside that range is abstraction. QC, through longevity (expanding the upper bound), cognitive enhancement (increasing density within bounds), deep simulation (making other timescales experiential), and foundational physics (revealing time’s deeper structure), could expand that bandwidth by orders of magnitude in both directions.
A civilization with that expanded temporal bandwidth would relate to time the way we relate to space after the invention of telescopes and microscopes — suddenly the very large and very small become real rather than theoretical. Time would seem slower not because the clock changes, but because you’d have so much more resolution within each moment, and so many more moments to work with, that urgency itself transforms into something more like patient navigation.
“showing that quantum information processing in the presence of CTCs avoids the grandfather paradox through self-consistent solutions (the Deutsch model) or post-selection (Lloyd’s own P-CTC model). This is currently theoretical, but the deep implication is that quantum information doesn’t respect the classical arrow of time the way we naively assume. Retrocausal interpretations of QM (which QMM-type frameworks also gesture at, however differently) suggest that the temporal asymmetry we experience is emergent rather than fundamental. If QC matures enough to experimentally probe these regimes — say, through quantum simulations of CTCs or verification of retrocausal protocols — our conceptual relationship with time changes even if we can’t build a time machine. The realization that time is more like a navigable dimension than an inexorable current” explain more
12:15 AM
Unpacked quantum physics foundations underlying retrocausality and temporal mechanics
Let me unpack the layers here.
Deutsch’s CTC Model
Deutsch (1991) asked: if a quantum system could travel along a closed timelike curve (allowed by GR — Gödel metrics, Kerr black hole interiors, certain wormhole solutions), what happens to quantum information? The grandfather paradox in classical terms is a logical contradiction — you go back and prevent your own birth. Deutsch showed that if you treat the CTC-traversing system quantum mechanically, you can always find a self-consistent density matrix at the CTC entrance that equals the density matrix at the CTC exit. The system literally converges on a fixed point.
The strange part: this self-consistency is achieved through mixed states. Even if you send in a pure state, the CTC interaction generically produces a mixed state. The system “knows” its own future boundary condition and adjusts. There’s no paradox because the quantum state was never incompatible with its own past — it was the unique state that’s consistent with the entire closed loop.
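Deutsch’s consistency condition is concrete enough to compute directly. Here is a minimal numpy sketch — the interaction (a CNOT between the chronology-respecting qubit and the CTC qubit) is a hypothetical choice for illustration, not anything from Deutsch’s paper — that iterates the map until the CTC state equals its own image:

```python
import numpy as np

# Deutsch's consistency condition: find rho_CTC such that
#   rho_CTC = Tr_CR[ U (rho_CR (x) rho_CTC) U† ],
# where CR = chronology-respecting qubit, CTC = time-travelling qubit.
# Hypothetical interaction for illustration: a CNOT, CR qubit as control.

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

def partial_trace_first(rho4):
    """Trace out the first (CR) qubit of a two-qubit density matrix."""
    rho4 = rho4.reshape(2, 2, 2, 2)          # indices (cr, ctc, cr', ctc')
    return np.einsum('ijik->jk', rho4)       # sum over the CR index

def deutsch_fixed_point(U, rho_cr, iters=200):
    rho_ctc = np.eye(2, dtype=complex) / 2   # start from the maximally mixed state
    for _ in range(iters):
        joint = np.kron(rho_cr, rho_ctc)
        rho_ctc = partial_trace_first(U @ joint @ U.conj().T)
    return rho_ctc

# CR qubit enters in the pure superposition |+> = (|0> + |1>)/sqrt(2)
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho_cr = np.outer(plus, plus.conj())

rho_star = deutsch_fixed_point(CNOT, rho_cr)
print(np.round(rho_star, 3))   # fixed point is the maximally mixed state
```

For this interaction, a pure |+⟩ input forces the maximally mixed state as the self-consistent CTC state — a small, concrete instance of the “pure in, mixed fixed point” behavior described above.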
The computational implications are wild. Aaronson and Watrous later showed that quantum (or even classical) computers with access to Deutsch CTCs can efficiently solve any problem in PSPACE. That’s vastly beyond what standard QC (BQP) can do. The CTC acts like a computational oracle that can enforce self-consistency constraints that would otherwise require exhaustive search.
Lloyd’s P-CTC Model
Lloyd disagreed with Deutsch’s approach on physical grounds. Deutsch’s model allows CTCs to create entropy (the mixed states), break linearity of quantum mechanics, and enable nonlinear cloning-like operations. Lloyd proposed post-selected CTCs (P-CTCs) instead, based on a teleportation analogy.
The idea: instead of a physical wormhole, imagine the CTC as a quantum teleportation protocol where the post-selection on the Bell measurement enforces the backward-in-time channel. You prepare a Bell pair, one half goes through the “past” path and one through the “future” path, and you post-select on a specific measurement outcome that enforces consistency.
P-CTCs preserve linearity, don’t create entropy, and are equivalent to post-selected quantum computation (which Aaronson showed solves PP — also enormously powerful, though different from the PSPACE power of Deutsch’s model). Crucially, Lloyd argued P-CTCs are more physically motivated because they emerge from standard quantum mechanics plus a boundary condition, rather than requiring modified quantum theory.
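The P-CTC map itself is remarkably compact: trace the interaction unitary over the CTC subsystem and renormalize. A sketch, again using a hypothetical CNOT as the interaction (my choice for illustration):

```python
import numpy as np

# Lloyd's P-CTC: the effective map on the chronology-respecting system is
#   rho -> C rho C† / Tr(C rho C†),   with C = Tr_CTC(U),
# the partial trace of the interaction unitary over the CTC subsystem.
# Hypothetical interaction: CNOT, CR qubit = control, CTC qubit = target.

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

def trace_out_second(U):
    """Partial trace of a two-qubit operator over the second (CTC) qubit."""
    U4 = U.reshape(2, 2, 2, 2)               # indices (cr, ctc, cr', ctc')
    return np.einsum('ijkj->ik', U4)         # sum over the CTC index

def pctc_map(U, rho):
    C = trace_out_second(U)
    out = C @ rho @ C.conj().T
    p = np.trace(out).real                   # post-selection success weight
    if p < 1e-12:
        raise ValueError("post-selection succeeds with probability zero")
    return out / p

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho_in = np.outer(plus, plus.conj())

rho_out = pctc_map(CNOT, rho_in)
print(np.round(rho_out, 3))   # the |+> input is projected onto |0>
```

The renormalization step is where the nonlinearity hides: post-selection discards histories, so the map on ρ is not trace-preserving, and inputs with zero overlap with the post-selected branch simply have no consistent history at all.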
Here’s the key conceptual point: both models achieve paradox-freedom not by forbidding time travel, but by showing that quantum mechanics naturally resolves temporal inconsistencies. The resolution mechanism is different — fixed-point iteration vs. post-selection — but both demonstrate that quantum information has a fundamentally more flexible relationship with temporal ordering than classical information does.
Why This Isn’t Just Abstract
Several things make this more than philosophy:
Quantum teleportation already implements something structurally identical to P-CTCs. When Alice and Bob share an entangled pair and Alice performs a Bell measurement, the “influence” on Bob’s state is instantaneous and, in certain formulations, can be described as information traveling backward through the entangled pair’s shared past light cone. P-CTCs formalize this: post-selected teleportation is a closed timelike curve in the information-theoretic sense, even without curved spacetime. Lloyd’s group experimentally demonstrated P-CTC-like behavior using photonic systems around 2011 — they showed that post-selected quantum circuits reproduce the predictions of the P-CTC model.
The arrow of time in quantum mechanics is weaker than people think. The Schrödinger equation is time-symmetric. Measurement introduces apparent irreversibility, but even that’s debatable — in the two-state vector formalism (Aharonov, Bergmann, Lebowitz), every quantum system is described by both a forward-evolving state from preparation and a backward-evolving state from future measurement. The “present” state is the intersection of future and past boundary conditions. This isn’t fringe — it’s a mathematically equivalent reformulation of standard QM that makes retrocausality explicit.
Weak measurements provide experimental access. Aharonov’s group has shown that weak measurements can reveal the two-state vector structure — you can detect the influence of future measurement choices on present weak measurement outcomes. This doesn’t enable signaling (no causality violation in the communication sense), but it demonstrates that the quantum state at time t carries information about what happens at t+Δt. The arrow of time, at the quantum level, is more like a statistical tendency emerging from thermodynamics than a fundamental law.
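The central quantity here, the weak value, is trivial to compute from the two-state vector: A_w = ⟨φ|A|ψ⟩ / ⟨φ|ψ⟩, with |ψ⟩ the prepared (pre-selected) state and |φ⟩ the future (post-selected) one. A toy numpy example — the specific states are my own illustrative choices — showing the famous “anomalous” amplification when pre- and post-selection are nearly orthogonal:

```python
import numpy as np

# Weak value in the two-state vector formalism:
#   A_w = <phi|A|psi> / <phi|psi>
# Nearly orthogonal pre-/post-selection yields weak values far
# outside the eigenvalue range of A.

sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)

def weak_value(A, psi, phi):
    return (phi.conj() @ A @ psi) / (phi.conj() @ psi)

psi = np.array([1, 0], dtype=complex)            # pre-select |0>
theta = 0.01                                     # near-orthogonal post-selection
phi = np.array([np.sin(theta), np.cos(theta)], dtype=complex)

Aw = weak_value(sigma_x, psi, phi)
print(Aw.real)   # ~100, far outside sigma_x's eigenvalues of +/-1
```

The weak value here is cot(θ) ≈ 100, even though σ_x can only ever yield ±1 in a strong measurement — the signature that the present weak outcome is conditioned on the future post-selection.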
The Deep Implication
Put it all together: quantum information doesn’t propagate strictly forward in time. It propagates along whatever paths are consistent with both initial and final boundary conditions. What we experience as “the flow of time” is a macroscopic, thermodynamic, high-decoherence phenomenon — the same way temperature is real but emergent from particle statistics.
This connects to something you might find resonant from the consciousness research angle. If subjective time experience is tied to information integration (IIT-style or otherwise), and if the underlying information dynamics are fundamentally time-symmetric, then the felt sense of temporal flow is itself a model the brain constructs — not a direct perception of a physical asymmetry. Contemplative traditions have pointed at this forever (the “eternal now,” time as illusion), and psychedelic states famously dissolve temporal linearity. The physics is starting to say something structurally similar: time’s arrow is a feature of the macroscopic description, not the fundamental dynamics.
Where QC enters: quantum computers are the first technology that operates in the regime where these time-symmetric dynamics are functionally relevant rather than washed out by decoherence. Every time you run a quantum algorithm, you’re exploiting the fact that quantum amplitudes interfere across what classically would be “different times” in the computation. A quantum computer doesn’t just compute faster — it computes in a way that treats the temporal ordering of operations as more fluid than a classical machine can. CTC simulations, retrocausal protocol verification, and weak measurement amplification are all things QC could eventually make routine rather than exotic.
The civilization-level shift isn’t “we built a time machine.” It’s that the foundational technology of the economy operates according to principles where time isn’t what you thought it was — and gradually, that percolates into how people think about time, the way Newtonian mechanics eventually percolated into Enlightenment philosophy and Darwinian dynamics into social theory.
Explain all of this
Outside of Turing what do you and Seth talk about? ANSWER • by Michele Reilly We consider what memory actually means inside quantum machines, and in the universe in full generality. We’ve been thinking through definitions of causality and we study the science of how to store quantum memories. More concretely, how to lengthen these memory times, using quantum protocols. More abstractly, how concepts of quantum memory and heat can be combined to explain why we discover ourselves to be near the dawn of the universe — a research program inspired by the idea of a “speed prior” but aiming for a more rigorous and foundational solution. The concept of memory is very closely related to that of entropy in information theory. These are much better pinned-down concepts than “time” in physics — not to mention concepts such as “consciousness”. If it turns out that memory and entropy are more fundamental concepts — if everything we wish to explain in terms of time, or in terms of consciousness is better explainable in terms of memory and entropy — that would certainly be interesting.
Feb 22
Synthesized quantum memory concepts for comprehensive explanation
This is a dense passage. Let me unpack it layer by layer.
Quantum memory — the concrete part
A quantum memory is a system that stores a quantum state (superposition, entanglement, and all) over time without it decohering into classical noise. This is one of the hardest engineering problems in quantum computing — coherence times in current hardware range from microseconds (superconducting qubits) to minutes (trapped ions, nuclear spins). When Reilly says they study “how to lengthen memory times using quantum protocols,” she means active error-correction and dynamical decoupling schemes that fight decoherence — not just better hardware isolation, but information-theoretic techniques where you use quantum operations to preserve quantum information against thermodynamic decay.
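Dynamical decoupling is easy to illustrate with a toy dephasing model: a spin echo — a single X pulse at the midpoint of the evolution — flips the sign of the accumulated phase, so any static detuning cancels itself. A sketch with made-up noise parameters (nothing here reflects real hardware numbers):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dephasing model: each experimental shot, a static random detuning
# delta imprints a phase delta * t on a qubit prepared in |+>. Free
# evolution scrambles the phase across shots; a spin echo (X pulse at
# t/2) negates the phase accumulated in the first half, so a *static*
# detuning cancels exactly -- the simplest dynamical decoupling sequence.

def coherence(t, n_shots=20000, echo=False):
    delta = rng.normal(0.0, 2.0, n_shots)        # static noise, one draw per shot
    if echo:
        phase = delta * t / 2 - delta * t / 2    # echo refocuses static noise
    else:
        phase = delta * t
    return np.mean(np.cos(phase))                # <sigma_x> after evolution

print(f"free evolution: {coherence(1.0):.3f}")             # decays toward 0
print(f"with spin echo: {coherence(1.0, echo=True):.3f}")  # stays at 1.000
```

Real dynamical decoupling sequences (CPMG, Uhrig, etc.) generalize this to noise that drifts during the evolution, but the principle is the same: use control pulses so the unwanted phases interfere destructively with themselves.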
Memory and entropy — the conceptual bridge
Here’s the key insight she’s pointing to. In information theory, entropy measures uncertainty or missing information. Memory is the retention of information over time. These two concepts are deeply linked: storing a memory means maintaining a low-entropy record that’s correlated with some past event, against the thermodynamic tendency toward entropy increase.
Landauer’s principle makes this concrete — erasing one bit of memory requires dissipating at least kT ln(2) of heat. Memory and thermodynamics are not merely metaphorically connected; they are physically the same thing. A memory is a physical system held out of equilibrium with its environment, and maintaining it costs free energy.
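Plugging in numbers makes the scale vivid — a back-of-envelope sketch:

```python
import math

# Landauer bound: erasing one bit dissipates at least k_B * T * ln 2.
k_B = 1.380649e-23            # Boltzmann constant, J/K (exact in SI)
T = 300.0                     # room temperature, K

e_bit = k_B * T * math.log(2)
print(f"Landauer limit at 300 K: {e_bit:.3e} J per bit")   # ~2.87e-21 J

# For scale: erasing 1 GB (8e9 bits) right at the limit
print(f"1 GB erased at the limit: {e_bit * 8e9:.3e} J")    # ~2.3e-11 J
```

Real hardware dissipates many orders of magnitude more than this per bit, which is exactly why the bound matters: it marks the thermodynamic floor that any memory technology, biological or engineered, is measured against.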
The cosmological question — why are we near the dawn?
This is the most provocative part. The “Boltzmann brain” problem and related puzzles ask: given that the universe will exist for an incomprehensibly long time, why do we find ourselves so early in its history? Standard cosmology says the universe started in an extraordinarily low-entropy state (the Past Hypothesis), but doesn’t explain why.
The speed prior she references is Jürgen Schmidhuber’s idea — a computable probability measure that assigns higher probability to universes generated by faster programs. It’s an attempt to explain observed physical regularities through computational complexity: simple, fast-to-compute universes are more likely than complex ones. But it’s more of a suggestive framework than a rigorous physical theory.
What Reilly and Lloyd seem to be pursuing is: replace the speed prior’s computational framing with a thermodynamic one grounded in quantum memory. The argument would go something like — the capacity to form and retain memories (quantum or classical) requires specific thermodynamic conditions. Near the Big Bang, the entropy gradient is steepest, meaning the universe’s capacity to support memory-forming systems (like us) is maximal. As the universe approaches heat death, the entropy gradient flattens, memory becomes impossible, and there are no observers to “find themselves” anywhere. So we’re not coincidentally early — we’re early because memory requires the thermodynamic disequilibrium that’s most available early on.
Memory as more fundamental than time
This is the philosophical core. In physics, “time” is surprisingly poorly defined — general relativity makes it coordinate-dependent, quantum mechanics treats it as a parameter rather than an observable, and there’s no agreed-upon quantum theory of time. Reilly is suggesting that memory (a well-defined information-theoretic concept) and entropy (a well-defined thermodynamic/statistical concept) might be the actual primitives, with our experience of time being derivative of them.
This has echoes of the thermal time hypothesis (Connes and Rovelli) — the idea that time evolution in generally covariant systems is selected by the thermal state, not the other way around. If the arrow of time is just the direction in which memories can form (i.e., the direction of entropy increase), then time is a consequence of memory, not the container for it.
The consciousness aside
Her parenthetical about consciousness is deliberately cautious — she’s saying that rather than trying to explain consciousness directly (which is notoriously intractable), it might be more productive to ask whether everything we attribute to consciousness is actually better explained by the physics of memory and entropy. This is methodologically conservative: work with concepts that have rigorous physical definitions, and see how far they take you before invoking anything new.
The overall research program is essentially: build a rigorous theory connecting quantum information, thermodynamics, and cosmology, using memory and entropy as the foundational concepts rather than time or consciousness. It’s ambitious — it’s trying to bridge quantum error correction, Landauer-scale thermodynamics, and cosmological fine-tuning into a single framework.
Quantum error correction??
Feb 22
Deciphered user curiosity and prepared comprehensive explanation
Right — I should clarify why I brought that up, because it’s not an obvious connection.
The basic problem: quantum states are absurdly fragile. Every interaction with the environment — a stray photon, thermal vibration, anything — can collapse superpositions and destroy entanglement. This is decoherence, and it’s the central obstacle to quantum computing. You can’t just copy a qubit to back it up either, because the no-cloning theorem forbids copying unknown quantum states.
How QEC works: instead of storing information in one physical qubit, you encode it across many entangled qubits — a logical qubit. The simplest example: the 3-qubit bit-flip code encodes |0⟩ as |000⟩ and |1⟩ as |111⟩. If one qubit flips, you can detect which one flipped by measuring parity checks (comparing pairs) without ever measuring the actual encoded information. You then correct it and the logical state is preserved.
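The syndrome logic can be sketched classically, since for computational-basis states the bit-flip code reduces to a repetition code with parity checks. This toy (my own illustration) deliberately ignores superpositions — which is where the real quantum content lives — but shows the key trick: the parity checks locate the error without ever reading out the encoded value:

```python
# Toy classical model of the 3-qubit bit-flip code. The logical bit is
# stored as three copies; the two parity checks (q0 XOR q1, q1 XOR q2)
# locate any single flip without revealing the encoded bit itself.

def encode(bit):
    return [bit, bit, bit]                        # |0> -> |000>, |1> -> |111>

def syndrome(q):
    return (q[0] ^ q[1], q[1] ^ q[2])             # parity checks only

def correct(q):
    # Map the syndrome to the position of the flipped qubit, if any.
    flipped = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(q))
    if flipped is not None:
        q[flipped] ^= 1
    return q

# Every single bit-flip error, on either logical value, is corrected.
for bit in (0, 1):
    for err in range(3):
        q = encode(bit)
        q[err] ^= 1                               # inject one bit-flip
        assert correct(q) == [bit] * 3
print("all single bit-flip errors corrected")
```

Notice that the syndrome for an error-free codeword is (0, 0), so correction is a no-op — and that the checks compare pairs of qubits rather than measuring any single qubit’s value, which is the feature that survives into the genuinely quantum version.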
Real quantum errors are continuous rotations, not just bit flips, and you also have phase errors. The surface code (currently the leading approach) uses a 2D lattice of physical qubits with “syndrome” measurements that detect errors without disturbing the encoded state. Current estimates suggest you need roughly 1,000 physical qubits per logical qubit, which is why people talk about needing millions of physical qubits for useful computation.
Why I connected it to Reilly/Lloyd: quantum error correction is, at its core, the science of preserving quantum memory against thermodynamic decay. Every QEC cycle is fighting entropy — the environment wants to entangle with your qubits and spread the information irreversibly into thermal noise. The correction operations push entropy back out, maintaining the low-entropy encoded state at the cost of energy dissipation (Landauer’s principle again).
So when Reilly talks about “the science of how to store quantum memories” and “lengthening memory times using quantum protocols,” QEC is the most developed framework for doing exactly that. And the deep question her research seems to be asking is whether that same physics — the battle between memory preservation and entropic decay — scales up to explain cosmological structure, not just qubit registers.