Quantum Mechanics is not a generalization of probability theory, but it is definitely a generalization of classical mechanics, since it involves non-deterministic transformations of the state of the system. For instance, the time evolution may be non-deterministic, unlike in classical mechanics.
There are three major metaphysical views of time [1]: presentism, eternalism and possibilism. Possibilism consists in adopting presentism for the future and eternalism for the past, so it is inconsistent with time-translation symmetry. The presentist view coincides with the Hamiltonian formalism of physics, in which the state of the system is defined by a point in phase space. When the time evolution of the system is deterministic, it traces a trajectory in phase space; however, the definition of the state of the system does not involve time, i.e. only the present exists1. The eternalist view coincides with the Lagrangian formalism of physics, in which the state of the system is defined by a function of time. When the time evolution of the system is deterministic, this function of time coincides with the phase-space trajectory of the classical Hamiltonian formalism, and so which metaphysical view of time we adopt is irrelevant from an experimental point of view (in the deterministic case).
But when the time evolution of the system is non-deterministic, we may have a hard time studying it from the Lagrangian formalism and/or the eternalist metaphysical view. The key fact about Quantum Mechanics which makes it incompatible with the eternalist/Lagrangian point of view is that the time evolution is not necessarily a stochastic process, i.e. there is not necessarily a collection of random events indexed by time2. We apply only one non-deterministic transformation to the state of the system; however, there are many different transformations we can choose from, and the set of choices is indexed by a parameter we call time. This is fine from the presentist/Hamiltonian point of view, since only the present exists.
Note that a random experiment always involves a preparation followed by a measurement. For instance, we shake a die in our hand and throw it onto a table until it stops (preparation), then we check the position where it stopped (measurement).
If we just throw the die without shaking our hand, the probability distribution for the measurement outcome is different from the one we obtain if we shake our hand. There is nothing mysterious about this: two different preparations lead to two different probability distributions. Whether or not we actually do the measurement does not change anything; what changes the probability distribution is the preparation.
Then we can think about a preparation which is a function of an element of a symmetry group, for instance a translation in time. From the point of view of probability theory or of experimental physics, this is a valid option. However, it is important to note that this preparation as a function of time is not a stochastic process in time. A stochastic process in time is a set of random experiments indexed by time, while in the preparation as a function of time we have a single random experiment dependent on the parameter time. As an example, consider a) throwing a die 10 times, one throw per minute during 10 minutes, and b) shaking the die in our hand for a number of minutes between 0 and 10 and then throwing it once. The preparation in b) depends on the time parameter, while in a) the time merely selects which one of the many identically prepared experiments was done at that time.
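The difference between a) and b) can be sketched in a short simulation. The bias model for a barely shaken die below is a made-up illustrative assumption, not something fixed by the discussion above; what matters is only that the preparation in b) depends on the time parameter:

```python
import random

def throw_die(rng):
    """One throw of a fair six-sided die."""
    return rng.randint(1, 6)

def experiment_a(rng, n_minutes=10):
    """a) A stochastic process: one identically prepared throw per minute,
    giving a collection of random outcomes indexed by time."""
    return [throw_die(rng) for _ in range(n_minutes)]

def experiment_b(rng, shaking_minutes):
    """b) A single random experiment whose preparation depends on the time
    parameter: the longer we shake, the closer the outcome distribution is
    to uniform (this bias model is a made-up illustration)."""
    p_uniform = min(1.0, shaking_minutes / 10)
    if rng.random() < p_uniform:
        return throw_die(rng)  # well shaken: uniform over 1..6
    return 6                   # barely shaken: biased toward the resting face

rng = random.Random(0)
path = experiment_a(rng)                       # ten outcomes, one per minute
single = experiment_b(rng, shaking_minutes=3)  # one outcome, for t = 3 minutes
```

In a) the time parameter indexes which of the ten outcomes we look at; in b) it enters the preparation of the single outcome.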
Note that the experiments a) and b) above are different but can be combined: we could do many random experiments, each of which would depend on a parameter. This fact is important in Section 12.
In the remainder of this section, we comment on conditional probability and the random walk. It is well known that quantum mechanics can be described as the Wick rotation of a Wiener stochastic process [2]. In other words, the time evolution in Quantum Mechanics is a Wiener process in imaginary time. This is the origin of Feynman’s path-integral approach to Quantum Mechanics and Quantum Field Theory.
Since the Wiener process is one of the best-known Lévy processes—a Lévy process is the continuous-time analog of a random walk—this fact often leads to an identification of Quantum Mechanics with a random walk. In particular, it often leads to an identification of the probabilities calculated in Quantum Mechanics with conditional probabilities—the next state in a random walk is conditioned by the previous state.
Certainly, the usefulness of group theory is common to both a random walk and Quantum Mechanics, and this unavoidably leads to similarities between the two. However, imaginary time is very different from real time, and thus the probabilities calculated in Quantum Mechanics are not necessarily conditional probabilities in a random walk.
In order to relate a random walk (or any other stochastic process) to Quantum Mechanics correctly, we need the probability distribution for the complete paths of the random walk. Then, we can use a wave-function parametrization of that probability distribution. Finally, we can apply quantum methods to this wave-function. The result is a Quantum Stochastic Process [3], which is not a generalization of a stochastic process due to the wave-function collapse, but merely the parametrization of a stochastic process by a wave-function.
Now we are able to prove one of the main results of this paper, namely that there is a group action of a Wigner symmetry group on the probability distribution for the state of a system if and only if the symmetry group transforms deterministic (probability) distributions into deterministic (probability) distributions. A corollary is that time translation in Quantum Mechanics is a stochastic process if and only if it is deterministic. This mathematical fact is overlooked by the assumptions of both Bell’s theorem and the Einstein-Podolsky-Rosen (EPR) paradox.
As was discussed in Section 3, Wigner’s theorem [4][5][6] implies that the action of a symmetry group on the wave-function is necessarily linear and unitary. In Section 4, we showed that the action of a symmetry group on the wave-function is deterministic if and only if $U(g)$ and $P(E)$ commute for all events $E$ and for all elements $g$ of the group, where $U$ is the unitary representation of the group and $P$ is a projection-valued measure.
This means that $U$ is a deterministic transformation if and only if $U_{jk} U_{jl}^{*} = 0$ for all $k, l$ such that $k \neq l$, i.e. each row of the matrix $U$ has a single non-null entry and so $U$ is a permutation matrix up to phases.
Now we check the necessary and sufficient conditions for the action of a symmetry group on the wave-function to correspond to an action on the corresponding probability distribution.
That is, if we start with some probability distribution $p$, then the action of each element $g$ of the group on the wave-function will produce (after the collapse) a different probability distribution $p_{g}$. The composition of the actions of two group elements on the probability distribution is given by the succession of the two random experiments corresponding to $g_1$ and $g_2$: $p_{g_2 g_1} = (p_{g_1})_{g_2}$.
However, Wigner’s theorem [4][5][6] implies that the action of a symmetry group on the wave-function is necessarily linear and unitary, thus $U(g_2 g_1) = U(g_2)\,U(g_1)$.
Thus there is a group action of the symmetry group on the probability distribution if and only if $\mathrm{tr}\!\left(P(E)\,U(g)\,\rho\,U(g)^{\dagger}\right) = \mathrm{tr}\!\left(P(E)\,U(g)\,\rho_{\mathrm{diag}}\,U(g)^{\dagger}\right)$ for any pure density matrix $\rho$ and any event $E$ and group element $g$, where $\rho_{\mathrm{diag}}$ is the diagonal part of $\rho$ in the measurement basis, i.e. $\rho$ after the collapse.
The equality above is equivalent to $\sum_{k \neq l} U_{jk}\,\rho_{kl}\,U_{jl}^{*} = 0$ for all $j$, where $U_{jk}$ are the elements of the matrix $U(g)$. We can see that if $U(g)$ is a deterministic transformation, then the equality is satisfied, since $U_{jk} U_{jl}^{*} = 0$ for all $k, l$ such that $k \neq l$. On the other hand, if $U(g)$ is a non-deterministic transformation, then for some $j$ and some $k \neq l$ we have $U_{jk} U_{jl}^{*} \neq 0$. Then for a pure state with $\rho_{kl} = \psi_k \psi_l^{*} \neq 0$ and a suitable choice of the relative phase of $\psi_k$ and $\psi_l$, we get $\sum_{k \neq l} U_{jk}\,\rho_{kl}\,U_{jl}^{*} \neq 0$, i.e. there is no group action of the symmetry group on the probability distribution.
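This criterion can be checked numerically in a two-dimensional example; the notation and the concrete matrices below are ours, chosen only to illustrate the argument. A phased permutation matrix (deterministic) induces a well-defined map on probability distributions, while the Hadamard matrix (non-deterministic) does not:

```python
import numpy as np

def prob_after(U, psi):
    """Distribution over outcomes after applying U to psi without any
    intermediate collapse: p'_j = |sum_k U_jk psi_k|^2."""
    return np.abs(U @ psi) ** 2

def prob_after_collapse(U, psi):
    """Distribution after first collapsing psi in the measurement basis and
    then applying U: p'_j = sum_k |U_jk|^2 |psi_k|^2."""
    return (np.abs(U) ** 2) @ (np.abs(psi) ** 2)

psi = np.array([1.0, 1.0]) / np.sqrt(2)

# Deterministic transformation: a permutation matrix up to phases.
P = np.array([[0, 1j], [1, 0]])
assert np.allclose(prob_after(P, psi), prob_after_collapse(P, psi))

# Non-deterministic transformation: the Hadamard matrix.
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)
print(prob_after(H, psi))           # approximately [1, 0]: cross terms interfere
print(prob_after_collapse(H, psi))  # [0.5, 0.5]: collapsing first gives a different result
```

For the deterministic matrix the two distributions agree for every state, so the map on distributions composes as a group action; for the Hadamard matrix they differ, exhibiting the non-vanishing cross terms.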
The concept of an (ir)reversible process from thermodynamics also needs a careful discussion in quantum mechanics. A non-deterministic symmetry transformation, when acting on a deterministic ensemble, increases the entropy of the ensemble after the wave-function collapse and therefore must be an irreversible transformation. Yet a symmetry transformation always has an inverse symmetry transformation, because it belongs to a symmetry group, so it must be considered reversible in some sense.
The way out of this apparent contradiction is the role of time in the quantum formalism, which was discussed in Sections 11 and 12. In the ensemble interpretation, the individual system is entirely defined by a standard phase space, which implies that time plays no fundamental role in quantum mechanics or in classical Hamiltonian mechanics. Then, the time evolution in quantum mechanics is not a stochastic process unless it is deterministic. Therefore, there is no probability distribution for each time (or for another parameter corresponding to the symmetry group).
If we consider a stochastic process with only two probability distributions, corresponding to the initial and final times, then the complete symmetry transformation is irreversible (if it is non-deterministic and acts on a deterministic ensemble). However, this does not imply that it is a “bad” symmetry, because no stochastic process can be defined in between the initial and final times. On the other hand, if the symmetry group contains only deterministic transformations, then a stochastic process can be defined in between the initial and final times, and such a process is reversible, as expected.
The Einstein-Podolsky-Rosen (EPR) main claim [7] (namely, that Quantum Mechanics is an incomplete description of physical reality) is defended by a reductio ad absurdum of the negation of the main claim, i.e. by reducing to absurdity the proposition that position (Q) and momentum (P) are not simultaneous elements of reality. In the EPR article it is stated: “one would not arrive at our conclusion if one insisted that two or more physical quantities can be regarded as simultaneous elements of reality only when they can be simultaneously measured or predicted.[...] This makes the reality of P and Q depend upon the process of measurement carried out on the first system, which does not disturb the second system in any way. No reasonable definition of reality could be expected to permit this.”
The reductio ad absurdum of the negation of the claim could only be a satisfactory argument if the claim itself (namely, that the position and momentum of the same particle are simultaneous elements of reality, despite the fact that they cannot be simultaneously measured or predicted) were not absurd as well. But the claim itself raises eyebrows, to say the least, once we remember that (in Quantum Mechanics, by definition) measuring the position with infinite precision completely erases any knowledge about the momentum of the same particle.
In Quantum Mechanics, as in classical Hamiltonian mechanics, the state of an individual system is a point in a phase space, and the phase space is both the domain and the image of the deterministic physical transformations. As in any statistical theory, we may know only the probability distribution for the state of the individual system, instead of knowing the state itself. The relation between quantum mechanics and a statistical theory is clear: the wave-function is a parametrization for any probability distribution [8].
There are two kinds of incompleteness in a non-Markov stochastic process, corresponding to the two concepts involved: stochastic and non-Markov, respectively.
1) Stochastic: From the point of view of (classical) information theory [9], the root of probabilities (i.e. of non-determinism) is by definition the absence of information. Statistical methods are required whenever we lack complete information about a system, as so often occurs when the system is complex [10]. Thus we can convert a deterministic process into a stochastic process unambiguously (using trivial probability distributions); but we cannot convert a stochastic process into a deterministic process unambiguously, since we would need new information 3.
2) Non-Markov: any non-Markov stochastic process can be described as a Markov stochastic process where some variables defining the state of the system are hidden (i.e. unknown) [11][12]. Conversely, by definition any irreducible 4 Markov process where some variables defining the state of the system are hidden will give rise to a non-Markov process. For instance, the physical phenomenon which generates examples of Brownian motion is deterministic and thus Markov, but real-world Brownian motion is often non-Markov (because we cannot measure the state of the system completely [13][14]), despite the fact that Brownian motion is one of the most famous examples of a Markov process.
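The hidden-variable mechanism behind a non-Markov process can be illustrated with a short simulation; the persistent-velocity model below is our own toy example, not taken from the references. The pair (position, velocity) evolves as a Markov chain, but the recorded positions alone form a non-Markov process:

```python
import random

def hidden_markov_walk(steps, rng):
    """Markov chain on the full state (position, velocity): at each step the
    hidden velocity flips sign with probability 0.1, then the position moves.
    Only the position is recorded, as with real-world Brownian-motion data."""
    pos, vel = 0, 1
    observed = []
    for _ in range(steps):
        if rng.random() < 0.1:
            vel = -vel
        pos += vel
        observed.append(pos)
    return observed

rng = random.Random(1)
trace = hidden_markov_walk(20, rng)
# The recorded positions alone are non-Markov: the next position depends on
# the hidden velocity, which is correlated with earlier positions.
```

Here the full chain is Markov by construction, while the observed trace has long-range correlations exactly because one variable of the state is hidden.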
In reference [15] (authored by A. Einstein and contemporaneous with the EPR paradox), the two kinds of incompleteness are clearly distinguished:
“[...] I believe that the [quantum] theory is apt to beguile us into error in our search for a uniform basis for physics, because, in my belief, it is an incomplete representation of real things, although it is the only one which can be built out of the fundamental concepts of force and material points (quantum corrections to classical mechanics). The incompleteness of the representation is the outcome of the statistical nature (incompleteness) of the laws. I will now justify this opinion.”
The incompleteness of the representation corresponds to the non-Markov kind, while the incompleteness of the laws corresponds to the stochastic kind. By definition, in Quantum Mechanics any sequence of measurements is a Markov stochastic process (thus it has the stochastic kind of incompleteness) 5. Note that any non-Markov stochastic process can be described as a Markov stochastic process where some variables defining the state of the system are hidden (i.e. unknown) [11][12].
Since Quantum Mechanics does not have the non-Markov kind of incompleteness, position and momentum could only be simultaneous elements of reality in another theory, very different from Quantum Mechanics. The fact that both the claim and its negation are absurd is strong evidence that some of the assumptions leading to the Einstein-Podolsky-Rosen (EPR) paradox [7] do not hold.
So, why did the author try to justify (using the EPR paradox [7], among other arguments) that in Quantum Mechanics the stochastic kind of incompleteness necessarily leads to the non-Markov kind of incompleteness?
The following paragraph from the same reference [15] suggests that the author was trying to favor the view that any future theoretical basis should be deterministic, not just Markov (since statistical mechanics is often Markov).
“There is no doubt that quantum mechanics has seized hold of a beautiful element of truth, and that it will be a test stone for any future theoretical basis, in that it must be deducible as a limiting case from that basis, just as electrostatics is deducible from the Maxwell equations of the electromagnetic field or as thermodynamics is deducible from classical mechanics. However, I do not believe that quantum mechanics will be the starting point in the search for this basis, just as, vice versa, one could not go from thermodynamics (resp. statistical mechanics) to the foundations of mechanics.”
However, as discussed in Section 4, there is no mathematical argument suggesting that in general a deterministic model is more fundamental than a stochastic one; quite the opposite. Since the wave-function is merely a possible parametrization of any probability distribution [8], we also cannot claim that a deterministic model is more fundamental than Quantum Mechanics. Thus, the stochastic kind of incompleteness is harmless.
So, the EPR paradox appears as an attempt to justify a mathematical statement (that a deterministic model is more fundamental than Quantum Mechanics) with arguments from physics (trying to link it to the non-Markov kind of incompleteness), when no mathematical arguments could be found. Note that a statement referring to any future theoretical basis is essentially a mathematical statement, because the physical model is arbitrary (since the theoretical basis is arbitrary).
However, it is a failed attempt because it missed the fact discussed in Section 12, that the time evolution is a stochastic process if and only if it is deterministic.
In the EPR paradox, there is no probability distribution for the state of the system after the spatial separation of the entangled particles and before the transformation involved in the measurement takes place, because the time evolution (being in this case non-deterministic) is not a stochastic process. We can only consider the probability distribution for the state of the system after the spatial separation of the entangled particles and after the transformation involved in the measurement takes place. This is overall a non-local physical transformation, since it involves the spatial separation of the entangled particles. But it does not violate relativistic causality: neither the spatial separation of the entangled particles nor the transformation involved in the measurement violates relativistic causality by itself, so their composition does not violate causality either.
Unlike many popular no-go arguments [16], we are not arguing against the requirement that a physical theory should be complete; in fact, we claim that Quantum Mechanics is a complete statistical theory (as defined by EPR).
Note that Bohr already declared Quantum Mechanics to be a “complete” theory; however, he did it at the cost of a radical revision of the classical notions of causality and physical reality [17]. He wrote: “Indeed the finite interaction between object and measuring agencies conditioned by the very existence of the quantum of action entails —because of the impossibility of controlling the reaction of the object on the measuring instruments if these are to serve their purpose—the necessity of a final renunciation of the classical ideal of causality and a radical revision of our attitude towards the problem of physical reality.” [17] Such a notion of a “complete” theory mostly favors the EPR claim: the only way that Quantum Mechanics could be complete is if it is incompatible with the classical notions of causality and physical reality. Thus, from a logical point of view, there is no disagreement between Einstein and Bohr; their disagreement is about which basic features an acceptable theory should have, namely whether or not it should be compatible with the classical notions of causality and physical reality.
In contrast, the overlooked fact that the time evolution is a stochastic process if and only if it is deterministic is perfectly compatible with the classical notions of physical reality (because Quantum Mechanics has a standard phase space) and causality (as we will show in Section 15). We claim that Quantum Mechanics, being non-deterministic and thus a generalization of classical mechanics, does not entail a radical departure from the basic features that an acceptable theory should have according to EPR [7]. In fact, in Quantum Mechanics as in classical Hamiltonian mechanics, the state of an individual system is a point in a phase space, and the phase space is both the domain and the image of the deterministic physical transformations.
The only known theory consistent with the experimental results of high-energy physics [18] is a quantum gauge field theory which is mathematically ill-defined [19]. Due to this mathematical ill-definedness, the relation of such a theory to Quantum Mechanics is still an object of debate, and it will be addressed in another article by the present author.
In the meantime we will consider a free system, which suffices to address the EPR paradox. For a free system, relativistic Quantum Mechanics is well understood [20]. The time evolution of the wave-function is described by the Dirac equation for a free particle, which is a real (i.e. non-complex) equation.
Relativistic causality is satisfied in relativistic Quantum Mechanics, meaning that there is a propagator which vanishes for a space-like propagation [20]. In other words, the probability that the system moves faster than light is null.
A deterministic theory compatible with relativistic Quantum Mechanics is one which, when applied to an ensemble of free systems, reproduces the statistical predictions of Quantum Mechanics.
Since in relativistic Quantum Mechanics the probability that the system moves faster than light is null, no system (described by the deterministic theory) in the ensemble moves faster than light. Thus any deterministic theory compatible with relativistic Quantum Mechanics necessarily respects relativistic causality. The question left open here, and addressed in the next section, is whether such a deterministic theory exists.
Does a deterministic theory consistent with the non-deterministic time evolution of Quantum Mechanics exist?
The answer is yes, and we will build one example of such a deterministic theory in this section.
In an experimental setting, we always have a discrete set of possible outcomes and thus Quantum Mechanics always predicts a cumulative distribution function. This allows us to apply the inverse-transform sampling method [21] for generating pseudo-random numbers consistently with the probability distribution predicted by Quantum Mechanics.
An experiment in Quantum Mechanics always involves the repetition of an experimental procedure many times. In the deterministic theory, however, each time we execute the experimental procedure we are not executing exactly the same procedure. We consider a number (any number will do) which will be the seed of a pseudo-random number generator, and then we generate pseudo-random numbers consistent with the probability distribution predicted by Quantum Mechanics. The experimental procedure is: 1) generate one pseudo-random number, and 2) modify the state of the system according to that pseudo-random number.
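The seeded procedure can be sketched as follows; the two-outcome distribution is a made-up stand-in for whatever distribution Quantum Mechanics predicts for the chosen measurement:

```python
import random
from bisect import bisect_left

def inverse_transform_sample(outcomes, probabilities, rng):
    """Inverse-transform sampling: invert the cumulative distribution
    function at a pseudo-random point u in [0, 1)."""
    cdf, total = [], 0.0
    for p in probabilities:
        total += p
        cdf.append(total)
    u = rng.random()
    return outcomes[bisect_left(cdf, u)]

# Hypothetical discrete distribution predicted by Quantum Mechanics for some
# measurement (the outcome labels and the numbers are for illustration only).
outcomes = ["up", "down"]
probabilities = [0.8, 0.2]

seed = 42                 # the arbitrary number that defines the deterministic theory
rng = random.Random(seed)
samples = [inverse_transform_sample(outcomes, probabilities, rng)
           for _ in range(1000)]
# Every sample is a deterministic function of the seed, yet the relative
# frequencies reproduce the assumed quantum prediction.
```

Given the seed, every repetition of the experimental procedure is fully determined, while the ensemble statistics match the predicted cumulative distribution function.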
In the case of relativistic Quantum Mechanics, the probability of violating relativistic causality is null. Thus, the experimental procedure never violates relativistic causality. The modifications of the state of the system are, however, necessarily not infinitesimal, since the phase space of the experimental setting is discrete. This does not violate relativistic causality, since the finite modifications to the state of the system occur over finite intervals of time.
We can, however, consider intervals of time as small as we like, and thus modifications to the state of the system as small as we like. The only requirement is that the computational resources involved in the pseudo-random number generation are as large as needed (which is valid from a logical point of view). Note that since the time evolution in quantum mechanics is not necessarily a stochastic process, a sequence of experimental procedures executed at regular and small intervals of time will often produce different statistical data than a single experimental procedure executed after the same total time has passed (e.g. in the double-slit experiment). But this cannot be considered a radical departure from the classical notion of physical reality, since in the (very old) presentist view of classical Hamiltonian mechanics, the phase space (i.e. the physical reality) does not involve the notion of time [1]. Moreover, when the time evolution is deterministic, it is a stochastic process; therefore, if we study only deterministic transformations, we can recover the eternalist view of classical Lagrangian mechanics without any conflict with relativistic causality. For instance, this implies that in the double-slit experiment we can in principle reconstruct the trajectory of each particle and conclude which slit the particle went through.
From a logical point of view, this deterministic theory is valid and by definition it always agrees with the experimental predictions of Quantum Mechanics, thus it is experimentally indistinguishable from Quantum Mechanics.
From the metaphysical point of view, this deterministic theory is unacceptable, since it involves pseudo-random number generation. For instance, in the double-slit experiment we (or some supernatural entity) would need to somehow “program” each particle to follow a different path determined by a different number, which is absurd. However, the present author has no interest in building a nice deterministic theory compatible with Quantum Mechanics, for the reasons given in Section 4.
Note that this deterministic theory is not super-deterministic, i.e. the experimental physicists are free to choose which measurements and which transformations of the state of the system to perform [22][23]. However, an experimental procedure involves a symmetry transformation of the state of the system. Since the symmetry transformation in this deterministic theory is reproduced by the pseudo-random number generation, when we apply the inverse-transform sampling method we need to know already what the symmetry transformation is. Thus there is a kind of conspiracy between the symmetry transformation and the pseudo-random generator, but such a conspiracy is part of the definition of the deterministic symmetry transformation itself. There are assumptions about freedom of choice in the literature which exclude our deterministic (but not super-deterministic) theory, because their authors erroneously treat an experimental procedure which involves a transformation of the state of the system as if it were an observation without consequences for the system [22][23].
The ensemble interpretation does not give any explanation as to why it looks like the electron’s wave-function interferes with itself in Young’s double-slit experiment [24][25][26], which would suggest that the wave-function describes (in some sense) an individual system. We will fill that gap in this section.
The key to understanding the results of the double-slit experiment is the role of time in the quantum formalism, which was discussed in detail in Section 12. In the ensemble interpretation, the individual system is entirely defined by a standard phase space, which implies that time plays no fundamental role in quantum mechanics or in classical Hamiltonian mechanics. Moreover, the time evolution in quantum mechanics is not a stochastic process unless it is deterministic. Therefore, there is no probability distribution for each time (or for another parameter corresponding to the symmetry group).
In the double-slit experiment, the time evolution of the electron after being fired (S1) is a product of two non-deterministic symmetry transformations: first, going through one or the other slit with 50/50 probability (S2); and second, a non-deterministic propagation from (S2) until (F). If at least one of these two symmetry transformations were deterministic, then we could define a stochastic process including the 3 instants in time (S1), (S2) and (F). But since both transformations are non-deterministic, the only stochastic process that can be defined includes only the 2 instants in time (S1) and (F), and the intermediate transformations from (S1) to (S2) and from (S2) to (F) never occurred as separate random events.
The only “mystery” that needs to be clarified is the fact that the non-deterministic propagation of the electron from (S2) until (F) is such that it appears that the electron interferes with itself, just like a classical wave would do. To simplify the discussion we will only consider the electrons that reach the detector along 2 different angles, corresponding to constructive and destructive interference of waves with the de Broglie wavelength $h/p$, where $p$ is the electron’s linear momentum. So, a selected electron can only come along one of these 2 angles; the electrons that come along other angles are discarded.
The wave-function at (S1) is $\psi_{S1} = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$. The time evolution from (S1) until (S2) may be the identity matrix or $\frac{1}{\sqrt{2}}\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}$, depending on whether the second slit is closed or open, respectively. If the second slit is open, then the wave-function at (S2) is $\frac{1}{\sqrt{2}}\begin{pmatrix} 1 \\ 1 \end{pmatrix}$, meaning that the electron may go through both slits with equal probability.
The time evolution from (S2) until (F) is given by the unitary transformation $\frac{1}{\sqrt{2}}\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}$, that is, it sums the wave-functions from both slits for the first angle and it subtracts the wave-functions from both slits for the second angle.
Thus, if the second slit is closed, we have at (F) the wave-function $\frac{1}{\sqrt{2}}\begin{pmatrix} 1 \\ 1 \end{pmatrix}$, meaning that the electron may come along angles 1 or 2 with equal probability6. But if the second slit is open, we have at (F) the wave-function $\begin{pmatrix} 1 \\ 0 \end{pmatrix}$, meaning that the electron will only come along angle 1; since the electron would have gone through both slits with equal probability had we checked what happened at (S2), it appears that from (S2) until (F) it interferes with itself constructively (destructively) along angle 1 (2), respectively.
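The two cases can be checked with a two-component toy calculation, representing the closed-slit evolution (S1) to (S2) by the identity and both the open-slit evolution and the propagation (S2) to (F) by the sum/difference unitary described above (the matrix form is our reading of that construction):

```python
import numpy as np

# Sum/difference unitary: amplitudes add for angle 1 and subtract for angle 2.
U_F = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)

psi_S1 = np.array([1.0, 0.0])        # electron fired from the source

# Second slit closed: (S1) -> (S2) is the identity, then propagation to (F).
psi_F_closed = U_F @ psi_S1
print(np.abs(psi_F_closed) ** 2)     # [0.5, 0.5]: both angles equally likely

# Second slit open: (S1) -> (S2) also mixes the two slits equally.
psi_F_open = U_F @ (U_F @ psi_S1)
print(np.abs(psi_F_open) ** 2)       # approximately [1, 0]: only angle 1
```

The amplitudes for angle 2 cancel exactly when both slits are open, reproducing the interference pattern without any probability distribution defined at the intermediate instant (S2).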
The “mystery” is therefore similar to the probability clock 5: how is it possible that a probability of 1/2 becomes 1? It is possible precisely because time plays no fundamental role in quantum mechanics or in classical Hamiltonian mechanics. There is no probability distribution for each time (or for another parameter corresponding to the symmetry group). The composed symmetry transformation is different from a stochastic process in which the first symmetry transformation and then the second are applied, and there is no reason why it should not be different.
The Bell inequalities [27] do not hold, since Quantum Mechanics cannot be distinguished from a complete statistical theory: the assumptions of the Bell inequalities overlook the fact that the time evolution is a stochastic process if and only if it is deterministic. As long as the time evolution of the phase space is a symmetry and respects relativistic causality, there is no reasonable argument why a complete statistical theory should be a stochastic process. The whole point of the Bell inequalities is to distinguish Quantum Mechanics from a “standard” statistical theory, but a “standard” statistical theory means only that the theory is completely defined by a probability distribution in a phase space (which is the case both for Quantum Mechanics and for classical statistical mechanics).
One could argue instead that the inequalities do hold, but that there is an implicit assumption that the theory being compared to Quantum Mechanics has a time evolution which is a stochastic process. Even in that case (see Section 12), for any set of experimental results supporting relativistic Quantum Mechanics there is a deterministic theory (in which the time evolution is a stochastic process) that is also compatible with the same experimental results. So, to save the Bell inequalities we would now need to find fundamental arguments against such a deterministic theory. But which arguments? Such a deterministic theory is compatible with any experimental test of relativistic causality, and it is not super-deterministic. These arguments would need to be somehow against the existence of pseudo-random number generators in Nature, but such generators do exist in Nature, because we humans built some of them and we are part of Nature.
To be sure, the present author does not expect that a reasonable deterministic theory will replace Quantum Mechanics in the future. But once it is established that Quantum Mechanics is a complete statistical theory, the idea that we can rule out a reasonable deterministic theory is also absurd: it would amount to affirming the Bayesian point of view and ruling out the Frequentist point of view. Two logical constructions can be mutually incompatible despite each being consistent when considered on its own (e.g. the Bayesian and Frequentist points of view). In the Bayesian point of view, the probability expresses a degree of belief, and so the probability is an entity which exists by itself. In the Frequentist point of view, the root of probabilities is the absence of deterministic information which does exist somehow and is revealed through events. But if such information exists, then we cannot rule out that there is a reasonable deterministic theory which describes it.
In summary, we can say either that the Bell inequalities do not hold, or that the Bell inequalities (despite being mathematically valid) rest on unrealistic assumptions which render them innocuous.