A Computational View on the Asymmetry of Time

Why does time move forward? I have a perspective which I think holds under both Copenhagen-style views and the many-worlds interpretation of quantum mechanics. My view is more computational in nature. Specifically, when we think of the universe as a computational system that applies physical laws to its current state to produce the next, with specific constraints, we can see that the direction of time emerges.

Defining the Laws of Physics as a Stateless Computer

To help define what this computer looks like, let's first define \(\mathcal{S}\), the space of all physically admissible states of the universe (for example, density operators on a Hilbert space \(\mathcal{H}\), if the universe admits a standard quantum description). A particular state at time \(t\) is \(s_t \in \mathcal{S}\). The laws of physics, our computer, can be thought of as a transition function \(f_{\Delta t}\) that takes the current state of the universe and produces the next state:

\[ f_{\Delta t} : \mathcal{S} \to \mathcal{S}, \qquad s_{t+\Delta t} = f_{\Delta t}(s_t). \]

The computer is stateless in that the update rule has Markov form: the next state is computed from the current state alone, with no privileged access to the system's history beyond what is encoded in \(s_t\). If the many-worlds interpretation is true, the computer also operates on a branch-relative (record-relative) state: it does not have access to the full global state, including the phase relations between branches.
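As a toy illustration (not a physics simulation), here is a minimal sketch of such a stateless computer in Python. The update rule is an invented one, chosen only to make the Markov form concrete: the next state is a function of the current state and nothing else.

```python
def f(state):
    """One tick of a toy universe: the next state depends only on the
    current state (Markov form), never on any stored history.

    The rule itself is arbitrary: each cell becomes the XOR of itself
    and its right neighbor on a ring.
    """
    n = len(state)
    return tuple(state[i] ^ state[(i + 1) % n] for i in range(n))

def evolve(state, steps):
    """Iterate the update. Note that no history is consulted or kept;
    only the current state is ever passed to f."""
    for _ in range(steps):
        state = f(state)
    return state

s0 = (1, 0, 0, 1)
print(evolve(s0, 3))  # the state after three applications of f
```

The point is only the shape of the computation: `evolve` threads a single state through repeated applications of `f`, exactly as \(s_{t+\Delta t} = f_{\Delta t}(s_t)\) describes.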

Notice that, if the transition function \(f_{\Delta t}\) were bijective (i.e., one-to-one and onto), the computer could also compute the previous state by applying the inverse function:

\[ f_{\Delta t}^{-1} : \mathcal{S} \to \mathcal{S}, \qquad s_{t-\Delta t} = f_{\Delta t}^{-1}(s_t). \]

This would imply that the computer could simulate both forward and backward in time with equal ease, as each state would have a unique predecessor and successor deterministically defined by the transition function.
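To make the bijective case concrete, here is a deliberately tiny sketch: a four-state toy universe whose forward law is a permutation, so every state has exactly one successor and exactly one predecessor, and the inverse can be read off mechanically. The state names and the rule are invented for illustration.

```python
# Forward law on a toy four-state space: a bijection (a permutation),
# so each state has a unique successor.
f = {"a": "b", "b": "c", "c": "d", "d": "a"}

# Because f is bijective, its inverse is also a well-defined function:
# each state likewise has a unique predecessor.
f_inv = {nxt: prev for prev, nxt in f.items()}

s = "b"
forward = f[s]       # s_{t+dt}: one step into the future
backward = f_inv[s]  # s_{t-dt}: one step into the past
print(forward, backward)  # → c a

# Simulating forward and then backward recovers the original state.
assert f_inv[f[s]] == s
```

Forward and backward simulation are equally cheap here: both are a single lookup, which is the "equal ease" the text describes.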

An Irreversible System

At this point, I've set up a framework for computing the universe, but I haven't yet justified why that computation should move in a specific direction. Don't worry, we're getting there now.

If the universe were a perfectly closed system evolving under a reversible law (Hamiltonian evolution classically, unitary evolution quantum mechanically), then the update map would be bijective in principle. In that case, backwards evolution is well-defined at the microphysical level, and the asymmetry of time would not be explainable from the perspective of the computer.

The asymmetry I'm pointing at shows up when we include a genuinely irreversible update rule in \(f_{\Delta t}\). Given our computer's constraints, both Copenhagen-style views and the many-worlds interpretation contain an irreversible process that occurs during quantum measurement. For the Copenhagen-style views, it is the collapse of the wavefunction, a fundamentally irreversible process. For the many-worlds interpretation, the global state evolves unitarily, but the branch-relative state that our computer has access to undergoes an effective collapse: decoherence cuts it off from the full superposition, and it retains only the outcome of the measurement. Here the irreversibility stems less from the underlying dynamics than from the constraints we have placed on our computer. In either case, the transition function \(f_{\Delta t}\) becomes many-to-one:

\[ \exists s_t^{(1)} \neq s_t^{(2)} \in \mathcal{S} \;\text{s.t.}\; f_{\Delta t}(s_t^{(1)}) = f_{\Delta t}(s_t^{(2)}). \]

This renders the inverse computation ill-defined: the computer has no mechanism for selecting the correct precursor of the current state from among multiple candidates. In the many-worlds case, think of distinct microstates that lead to the same macrostate. The computer can therefore only reliably compute forward in time.
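A toy many-to-one update makes the failure of inversion concrete. Here, as an invented stand-in for an effective collapse, the update discards the sign of a number (loosely analogous to discarding phase or branch information): two distinct precursors collide on the same successor, so no inverse function exists.

```python
def f(state):
    """A many-to-one toy update: discard the sign, a stand-in for
    discarding information (phase relations, the unobserved branches)
    during a measurement-like step."""
    return abs(state)

s1, s2 = -3, 3          # two distinct precursor states ...
assert s1 != s2
assert f(s1) == f(s2)   # ... that collide under the update

# Forward simulation is perfectly well-defined, but "the inverse of 3"
# is not: the computer has no way to decide between -3 and 3.
print(f(s1), f(s2))  # → 3 3
```

This is exactly the \(\exists\, s_t^{(1)} \neq s_t^{(2)}\) condition above: once two states share an image, a function \(f_{\Delta t}^{-1}\) cannot exist on all of \(\mathcal{S}\).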

Conclusion

Although this isn’t a proof of the arrow of time, I like this computational lens because it isolates one clean issue: invertibility is fragile under information-discarding updates. If the effective laws available to an embedded observer include any many-to-one step (fundamental collapse in a Copenhagen-style view, branch-restricted description in many-worlds), then backward simulation is ill-posed.

The next thing I want to understand is how this computational asymmetry relates to more standard arrows (thermodynamic, causal, cosmological). My hunch is that the “many-to-one-ness” I’m describing is closely related to the thermodynamic arrow of time, which is often associated with the increase of entropy, but I'll have to look deeper into it.