Update to the text on the Home page (Start)
There is no real-world engineering without tolerances and managing uncertainty. Yet the mathematical methods (such as differential geometry) used in engineering are mostly based on deterministic logic and variables, for reasons of computability[1]. Whenever we deal with a non-separable space there are issues with computability, because some elements of a non-separable space cannot be approximated by a finite set of elements. And no one has yet found a systematic way to define a separable probability space with an arbitrary probability measure over real functions of real variables (and over infinite-dimensional spaces in general). While there are separable spaces of real functions of real variables, whenever we add uncertainties/probabilities to such spaces we tend to create non-separable spaces[2]. As a consequence, there is a wide variety of ad hoc methods (including Bayesian inference with priors that make ad hoc assumptions) to manage uncertainty in real-world engineering.
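As a standard textbook illustration of the approximation problem (our own example, not taken from [1] or [2]): the space $L^\infty([0,1])$ is non-separable because it contains an uncountable family of elements at mutual distance one, so no countable (let alone finite) set of elements can approximate all of them:

$$ f_t = \mathbf{1}_{[0,t]},\quad t\in[0,1], \qquad \|f_t - f_s\|_\infty = 1 \ \text{ for } t\neq s . $$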
In the era of big data, the ad hoc methods, which were already inefficient before, cannot compete with specialized expertise developed by trial and error on a case-by-case basis (essentially capital-intensive, brute-force black boxes). This is unsustainable: not only are crucial decisions made exclusively by machines, but these decisions also cannot be reviewed and audited by humans (not even by the few rich human owners of these black boxes), because the mathematical language needed to communicate the reasoning behind the decisions made by the machines does not exist[3]. Paradoxically, in the era of big data, humans may have much less power and know much less about their world; “the devil is in the details” may be replaced by “the devil is in the machine”.
The potential applications of mathematical methods for engineering based on probabilities and random variables (instead of deterministic logic and variables) cannot be overstated. For instance, neural networks are particularly good as a computer approximation of a vector in a separable Hilbert space[4]. Thus, neural networks and deep learning are not necessarily an alternative to Bayesian inference and can instead be used as part of Bayesian inference[5], especially when the posterior and prior probabilities can be parametrized by vectors in a Hilbert space.
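A minimal sketch of the first claim (our own toy code, not from [4] or [5]): a square-integrable function is identified with its coefficient vector in a truncated orthonormal basis, i.e. a vector in a separable Hilbert space, and a small fixed-random-feature network is fitted to approximate the function that vector represents. The basis size, network width and target function below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # Arbitrary target function on [0, 1].
    return np.exp(-3 * x) * np.sin(6 * np.pi * x)

N_BASIS = 32
x = np.linspace(0.0, 1.0, 2000, endpoint=False)   # uniform grid for Riemann sums

def basis(x):
    # Truncated orthonormal Fourier basis of L^2([0, 1]).
    cols = [np.ones_like(x)]
    for k in range(1, N_BASIS // 2 + 1):
        cols.append(np.sqrt(2.0) * np.cos(2.0 * np.pi * k * x))
        cols.append(np.sqrt(2.0) * np.sin(2.0 * np.pi * k * x))
    return np.stack(cols[:N_BASIS], axis=1)        # shape (len(x), N_BASIS)

B = basis(x)
coeffs = (B * f(x)[:, None]).mean(axis=0)          # Hilbert-space vector <f, e_k>
target = B @ coeffs                                # truncated expansion of f

# "Neural network": one hidden tanh layer with fixed random features; only the
# output weights are fitted, so the fit reduces to linear least squares.
hidden = np.tanh(np.outer(x, rng.normal(size=64)) + rng.normal(size=64))
w, *_ = np.linalg.lstsq(hidden, target, rcond=None)

print("max |network - truncated expansion|:", np.max(np.abs(hidden @ w - target)))
```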
A consequence of the lack of mathematical methods for engineering based on probabilities and random variables is that Quantum Yang-Mills theory, Classical Statistical Field Theory (for dissipative systems, e.g. Navier-Stokes, and for Hamiltonians which are non-polynomial in the fields, e.g. general-relativistic statistical mechanics) and Quantum Gravity all suffer from severe mathematical inconsistencies and produce unreliable predictions at best.
While the Lebesgue measure cannot be defined on a Euclidean-like infinite-dimensional space[6][7], it has been well known for many decades that a uniform (Lebesgue-like) measure on an infinite-dimensional sphere can be defined using the Gaussian measure and the Fock space (the Fock space is a separable Hilbert space used in the second quantization of free quantum fields)[8]. Such a space can parametrize (we call it the free field parametrization) the probability distribution of another probability distribution whose sample space is the direct product of the base space and the field.
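A standard finite-dimensional illustration of the link between the Gaussian measure and a uniform measure on the sphere (a textbook fact, stated here as context for [8], not a reconstruction of it): if $X=(X_1,\dots,X_n)$ has independent standard Gaussian components, then $X/\|X\|$ is uniformly distributed on the sphere $S^{n-1}$, and, unlike the Lebesgue measure, the product Gaussian measure still makes sense as $n \to \infty$:

$$ d\mu(x)=\prod_{k=1}^{n}\frac{e^{-x_k^{2}/2}}{\sqrt{2\pi}}\,dx_k ,\qquad \frac{X}{\|X\|}\sim\mathrm{Unif}\!\left(S^{n-1}\right). $$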
Thus, infinite dimensions are not by themselves problematic. The problem is how to implement exact constraints in a separable probability space without attributing null probability measure to the constrained space. This is a much deeper problem that also affects statistical mechanics[9] and machine learning[10] (in spaces of finite dimensions).
A simple solution to this problem exists because the existence of a probability measure allows us to define a quantum Hamiltonian and quantum constraints, which are more general than the classical ones because they do not need to commute with the variables defining the sample space. As we will see, quantum constraints often allow us to implement exact constraints in a separable probability space without attributing null probability measure to the constrained space.
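As a toy illustration of the difference (our own example, not the general construction discussed above): take the circle $S^1$ as sample space. Any classical exact constraint $\theta=\theta_0$ has null probability measure under a continuous distribution, while the quantum constraint below, imposed through an operator that does not commute with $\theta$, selects a perfectly normalizable state and hence a well-defined probability distribution:

$$ \hat{C}\,\psi = -i\,\partial_\theta\,\psi = 0 \;\Rightarrow\; \psi(\theta)=\frac{1}{\sqrt{2\pi}},\qquad |\psi(\theta)|^2=\frac{1}{2\pi}. $$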
As a simple example of a motion which cannot be derived from a Lagrangian (and thus also not from a classical Hamiltonian without enlarging the system), we can cite[11] the oscillations of two classical coupled oscillators with different frequencies and different damping constants:

$$ \ddot{x}_1+\gamma_1\dot{x}_1+\omega_1^{2}x_1+k\,x_2=0,\qquad \ddot{x}_2+\gamma_2\dot{x}_2+\omega_2^{2}x_2+k\,x_1=0, $$

where $\gamma_1\neq\gamma_2$ are the damping constants, $\omega_1\neq\omega_2$ are the frequencies and $k$ is the coupling.
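A minimal numerical sketch of such a system (assuming the generic coupled damped form written above; the parameter values are arbitrary illustrative choices, and SciPy's solve_ivp is used only for convenience):

```python
# Two coupled damped oscillators with different frequencies and different
# damping constants, integrated numerically (parameters are illustrative only).
import numpy as np
from scipy.integrate import solve_ivp

g1, g2 = 0.1, 0.4          # distinct damping constants
w1, w2 = 1.0, 2.0          # distinct frequencies
k = 0.3                    # coupling

def rhs(t, y):
    x1, v1, x2, v2 = y
    return [v1, -g1 * v1 - w1**2 * x1 - k * x2,
            v2, -g2 * v2 - w2**2 * x2 - k * x1]

sol = solve_ivp(rhs, (0.0, 40.0), [1.0, 0.0, 0.0, 0.0],
                t_eval=np.linspace(0.0, 40.0, 401))
print(sol.y[0, :5])        # x1 at the first few sampled times
```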
The quantum formalism is the most general formalism whenever there is a conserved probability; thus it is expected that it can be applied even when classical Hamiltonians (due to energy dissipation[11] or non-linear dynamics[12]) or classical exact constraints (because the constrained space has null probability measure) cannot be.
Timepiece is about the mathematical inconsistencies of statistical field theory and their (unavoidable) consequences for the foundations of Quantum Mechanics: either the probability space is separable, and then there are implications for the foundations because quantum methods must be used in classical cases; or the probability space is not separable, and then there are also implications for the foundations because a non-separable probability space has compatibility issues with computability[1].
Moreover, if every quantum system is made of fundamental particles, then Quantum Mechanics must at least be consistent with Quantum Field Theory. Furthermore, in General Relativity the diffeomorphism transformations are time-dependent transformations, and many of the predictions in gravity are produced with the Hamiltonian formalism; however, the symplectic form of conservative classical mechanics is not invariant under time-dependent transformations. The most obvious way to make non-relativistic mechanics consistent with General Relativity is to formulate it as a particular field theory whose phase-space is a fibred manifold over a “time” axis[13].
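A standard textbook illustration of why time-dependent transformations are delicate in the Hamiltonian formalism (a generic example, not taken from [13]): under the time-dependent change of coordinates below, the Hamiltonian is not a scalar and must be shifted; the object that behaves naturally is the Poincaré-Cartan form on the extended phase space fibred over the time axis:

$$ Q=q-f(t),\quad P=p \;\Rightarrow\; K(Q,P,t)=H\bigl(Q+f(t),P,t\bigr)-P\,\dot f(t), \qquad \theta = p\,dq - H\,dt . $$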
The name Timepiece refers to the above-mentioned extra “time” axis (defined by the phase-space), appearing in a (quantum or classical) Hamiltonian formalism with time-dependent transformations, which is complementary to the standard time (defined by the time-evolution operator) of any Hamiltonian formalism.
Note that in any dynamical system there is (see Wikipedia):
a state, which is a point in a state space (no time is involved; the state is the present state). The state space may simply be a set, without the need of a smooth space-time structure defined on it. Usually we know what the state space is.
an evolution rule, which says how a new (future) state is generated from the present state. This rule may be non-deterministic, and the notion of “time” is determined by the evolution rule itself; it is not a pre-established concept. “Time” can be measured by integers, by real or complex numbers, or it can be a more general algebraic object, losing the memory of its physical origin.
Usually it is in the evolution rule that the discoveries happen. In the classical Lagrangian formalism, time is a pre-established concept, which may be in conflict with the evolution rule. That is why the definition of a dynamical system, which is very old, is also more general.
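A minimal sketch of this definition (our own toy code): the state space is a plain set with no smooth structure, the evolution rule is non-deterministic, and “time” is nothing more than the number of applications of the rule.

```python
# State space: just a set of labels. Evolution rule: a non-deterministic map
# from the present state to possible next states. "Time" is the iteration index.
import random

STATES = {"A", "B", "C"}
RULE = {"A": ["B", "C"], "B": ["A"], "C": ["C", "A"]}

state, history = "A", ["A"]
for _ in range(10):                 # ten applications of the evolution rule
    state = random.choice(RULE[state])
    history.append(state)
print(history)
```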
First of all, the above-mentioned problem (the lack of mathematical methods for engineering based on probabilities and random variables) is a hell of a problem. As discussed above, we believe that there is a simple solution to it, but nevertheless such a simple solution could be considered the holy grail of High Energy Physics (especially for producing predictions for experiments), with important applications to Data Science in general (because Statistical Field Theory involves probability measures in infinite-dimensional spaces). This fact alone justifies exploring all available means to solve this problem, including community science.
But it turns out that community science is particularly well suited to solving this problem, because all the scientific articles in the main physics (and mathematics) journals are required to be in one of two categories (or in both):
Articles without implications for the interpretations of Quantum Mechanics;
Articles which do not study in a systematic way the mathematical inconsistencies of Field Theory.
As discussed above, the mathematical inconsistencies of statistical field theory have unavoidable consequences for the foundations of Quantum Mechanics. This implies that a scientific article addressing the subject of the Timepiece community has, in practice, no place in the main physics journals. Surely, one author can suppress or play with the words and try to fit one article into one of the two categories. But this is not a problem that can be completely solved by one author in just one article; it is a problem that must be addressed by a scientific community, persistently, over a few years.
These two categories make it impossible for such a scientific community to use the main physics journals.
The most obvious alternative is community science, especially if the solution to this very difficult problem is relatively simple, as we believe. Moreover, a technological ingredient which only appeared in 2011[14] is crucial to community science.
But now we address why these two categories exist in the first place. The underlying reason is that the problem (the lack of mathematical methods for engineering based on probabilities and random variables) is very difficult and has persisted since the end of World War II. Another reason is that, if solving the problem requires applying quantum methods to statistical field theory in a way that challenges the mainstream interpretations of quantum mechanics (as argued above), then these two categories work as a self-fulfilling prophecy: by excluding simple solutions to the problem, only complex solutions can be found, which then justifies the existence of the categories. Moreover, a crucial technological ingredient only appeared in 2011[14]; without it, most of the results presented here would perhaps not be possible. So there were five to six decades during which a strong scientific consensus was built that, in order for scientific articles to be useful, they cannot address this very difficult problem head-on. In some sense, other problems must be solved before such a difficult problem can be successfully addressed. This was in some sense true until 2011[14], but in the meantime the technical foundation of far too many of the main scientific programs in theoretical physics (from string theory to quantum information) has come to rely on the assumption that there will be no simple solution to this difficult problem any time soon.
Think about it: why would anyone study String Theory (which is full of conjectures) when there is a relatively simple mathematical definition of Field Theory for Hamiltonians which are non-polynomial in the fields? Why would anyone follow ad hoc, ambiguous unification principles when designing hugely expensive High Energy Physics experiments, when there are relatively simple mathematical definitions for Quantum Yang-Mills theory and Quantum Gravity which are mutually consistent? Why would anyone differentiate between Quantum and Classical Information when this is inconsistent with the mathematical definitions of Quantum and Classical Statistical Field Theory? We cannot ask any human being, when his paycheck and those of all of his colleagues are on the line, to happily consider the hypothesis that he allowed far too many years and far too many resources to be invested in studying a subject that is worth next to nothing while a much simpler solution existed.
In scientific research, as in Bayesian inference, there is always a prior probability distribution, and there is no prior which is better for all cases[15]: we always have to make assumptions. For instance, if we insist, we can assume that all objects in the night sky revolve around the Earth using epicycles, and even consider that this is a very successful theory because it was a precursor of Fourier series, which have widespread applications today[16].
“The Emperor realized that the people were right but could not admit to that. He thought it better to continue the procession under the illusion that anyone who couldn’t see his clothes was either stupid or incompetent. And he stood stiffly on his carriage, while behind him a page held his imaginary mantle.” (Hans Christian Andersen, 1837)
Thus, data, evidence and even mathematical proofs are not enough for us to abandon prior beliefs; there must be a moment when, in order to achieve something we want, we have to abandon our prior beliefs. In the case of assuming that all objects in the night sky revolve around the Earth, keeping that assumption would make many relevant calculations in modern astronomy or in the global positioning system impossible.
In the case of Quantum Mechanics, it was always possible to see its mathematical formalism as a mere (but very useful) parametrization of probability and information theories. But it was also possible to describe it (although in a forceful way) as a new and exotic alternative to (or generalization of) classical information theory, one that shook our sense of reality. It is not hard to believe that the mystery surrounding the exotic point of view can often convince society to spend large amounts of money on research, at least in the short term.
The exotic point of view was sustained by a relatively constant flow of discoveries of new phenomena at higher and higher energies, which kept the mystery alive. But now even optimists say that if the Large Hadron Collider finds nothing new, it will be harder to convince the governments of the world to build the next bigger, more expensive collider to keep research in High Energy Physics as we know it[17]. Thus, if the exotic point of view is not that mysterious nowadays and there exists a parametrization point of view, is there another way to convince society to spend large amounts of money on research?
The answer is yes, there is another way: the parametrization point of view allows the mathematical formalism of Quantum Mechanics to be applied to probability and classical information theory (and then to artificial intelligence), not just in edge cases but at the core of these theories. This would not be possible with the exotic point of view, which considered that Quantum Mechanics shook our sense of reality and thus must be an alternative to (or a generalization of) classical information theory, so that there would be no reason to expect applications to the core of classical information theory.
There are many potential applications beyond physics. For instance, since supervised learning problems in machine learning can be thought of as learning a function from examples, these problems can be formulated as Bayesian modeling[18]. The free field parametrization may allow using any prior, as an alternative to using the Gaussian process[18] or the conditional random field[19] as priors. It may also allow Random Fields in their full generality as an alternative to Markov Random Fields in physics, biology and Data Science[20], as an alternative to the semivariogram in spatial data analysis[21], as an alternative to wavelets and renormalization in multi-resolution analysis[22], and as an alternative to Functional Data Analysis and Neural Networks in time series analysis[23]. Complex systems[24], sequential systems in machine learning[25] and Quantum Reinforcement Learning, both in computer simulations and as a model for human decision-making[26], all share several features with statistical or quantum physics; (more) applications of quantum methods in these subjects thus seem straightforward.
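A minimal sketch of the first point (our own toy code): supervised learning of a function from noisy examples, formulated as Bayesian inference with a Gaussian-process prior, which [18] uses as the baseline that the free field parametrization would generalize. The kernel, noise level and data below are arbitrary illustrative choices.

```python
# Gaussian-process regression with an RBF kernel (numpy only): the prior over
# functions is Gaussian, and the posterior mean/covariance are in closed form.
import numpy as np

def rbf(a, b, length=0.3):
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, 8)                                  # training inputs
y = np.sin(2 * np.pi * X) + 0.05 * rng.normal(size=8)     # noisy examples
Xs = np.linspace(0, 1, 100)                               # test inputs

K = rbf(X, X) + 0.05**2 * np.eye(len(X))                  # prior covariance + noise
Ks = rbf(Xs, X)
mean = Ks @ np.linalg.solve(K, y)                         # posterior mean
cov = rbf(Xs, Xs) - Ks @ np.linalg.solve(K, Ks.T)         # posterior covariance
std = np.sqrt(np.clip(np.diag(cov), 0.0, None))           # pointwise uncertainty
print(mean[:5], std[:5])
```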
Learn more in the Collections.