QUANTUM MECHANICS AS GENERALIZED THEORY OF PROBABILITIES
A detailed interpretation of probability, and its link with quantum mechanics
arXiv:1011.6331, 2010
In the following we revisit the frequency interpretation of probability of Richard von Mises, in order to bring the essential implicit notions into focus. Following von Mises, we argue that probability can only be defined for events that can be repeated in similar conditions, and that exhibit 'frequency stabilization'. The central idea of the present article is that the mentioned 'conditions' should be well-defined and 'partitioned'. More precisely, we will divide probabilistic systems into object, environment, and probing subsystem, and show that such partitioning allows one to solve a wide variety of classic paradoxes of probability theory. As a corollary, we arrive at the surprising conclusion that at least one central idea of the orthodox interpretation of quantum mechanics is a direct consequence of the meaning of probability. More precisely, the idea that the "observer influences the quantum system" is obvious once one realizes that quantum systems are probabilistic systems; it holds for all probabilistic systems, whether quantum or classical.
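For orientation, the notion of 'frequency stabilization' and the role of the probing subsystem can be pictured with a small simulation. The sketch below is an illustration only, not the authors' formalism; the names run_trials and probe_threshold are invented for this example. It shows relative frequencies settling as trials accumulate, and that changing the probing subsystem changes the stabilized value even in a purely classical setting.

```python
# Toy illustration (not the authors' formalism): relative frequencies of a
# repeatable experiment stabilize, and changing the probing subsystem changes
# which frequencies stabilize, even for a purely classical system.
import random

def run_trials(n_trials, probe_threshold, seed=0):
    """Repeat the 'same' experiment n_trials times under fixed conditions.

    The object is a classical quantity drawn uniformly from [0, 1); the
    probing subsystem reports 1 when the quantity exceeds probe_threshold.
    The threshold is part of the well-defined, partitioned conditions.
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        value = rng.random()             # object + environment
        hits += value > probe_threshold  # probing subsystem
    return hits / n_trials               # relative frequency

for n in (100, 10_000, 1_000_000):
    print(n, run_trials(n, probe_threshold=0.3))   # stabilizes near 0.7
print(run_trials(1_000_000, probe_threshold=0.5))  # different probe, ~0.5
```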
Probabilistic theories: what is special about Quantum Mechanics
2008
Quantum Mechanics (QM) is a very special probabilistic theory, yet we don't know which operational principles make it so. All axiomatization attempts suffer from at least one postulate of a mathematical nature. Here I will analyze the possibility of deriving QM as the mathematical representation of a "fair operational framework", i.e. a set of rules which allows the experimenter to make predictions on future "events" on the basis of suitable "tests", e.g. without interference from uncontrollable sources. Two postulates need to be satisfied by any fair operational framework: NSF: "no-signaling from the future"--for the possibility of making predictions on the basis of past tests; PFAITH: "existence of a preparationally faithful state"--for the possibility of preparing any state and calibrating any test. I will show that all theories satisfying NSF admit a C*-algebra representation of events as linear transformations of effects. Based on a very general notion of dynamical independence, it is easy to see that all such probabilistic theories are "non-signaling without interaction" ("non-signaling" for short)--another requirement for a fair operational framework. Postulate PFAITH then implies the "local observability principle", the tensor-product structure for the linear spaces of states and effects, the impossibility of bit commitment, and additional features, such as an operational definition of the transpose, a scalar product for effects, weak self-duality of the theory, and more. Dual to Postulate PFAITH, an analogous postulate for effects would give additional quantum features, such as teleportation. However, all possible consequences of these postulates still need to be investigated, and it is not clear yet if we can derive QM from the present postulates only. [CONTINUES on manuscript]
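As background to the vocabulary above (generic operational-probabilistic notation, not the paper's own postulates, which are stated in the manuscript), a theory of this kind assigns probabilities to effects via states, and "non-signaling" in its usual bipartite form says that one party's marginal statistics cannot depend on the other party's choice of test:

```latex
% A state \omega assigns a probability to each effect E:
\[
  p(E \mid \omega) \;=\; \omega(E), \qquad 0 \le \omega(E) \le 1 .
\]
% Non-signaling, in its usual bipartite form: Alice's marginal statistics
% do not depend on Bob's choice of test y.
\[
  \sum_{b} p(a, b \mid x, y) \;=\; \sum_{b} p(a, b \mid x, y')
  \qquad \text{for all tests } y,\, y' .
\]
```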
2019
It is proved that in non-collapse quantum mechanics the state of a closed system can always be expressed as a superposition of states all of which describe histories that conform to Born’s probability rule. This theorem allows one to see the probabilities in non-collapse quantum mechanics as a prediction made by the theory, and gives non-collapse quantum mechanics the same predictive power as standard quantum mechanics with collapse according to Born’s rule. By adding the remark that collapse quantum mechanics is logically compatible with probabilities different from those given by Born’s rule, it is argued that the fact that the experimental observations support Born’s probability rule can be seen as evidence in support of the non-collapse interpretation of quantum mechanics, rather than as a problem for that interpretation. This remark should also be used to scrutinize derivations of Born’s rule in the context of collapse and of non-collapse quantum mechanics.
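For reference, Born's rule in its textbook form, which the theorem above recovers as internal relative frequencies within branches of the unitarily evolving state (the notation below is standard, not taken from the paper):

```latex
% Born's rule: for a state |\psi\rangle measured in an orthonormal basis
% \{|\phi_i\rangle\}, the probability of outcome i is
\[
  p_i \;=\; \bigl|\langle \phi_i \mid \psi \rangle\bigr|^{2},
  \qquad \sum_i p_i \;=\; 1 .
\]
% The theorem summarized above expresses the state of a closed system as a
% superposition of "history" branches whose internal relative frequencies
% agree with these p_i.
```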
A purely probabilistic approach to quantum measurement and collapse
Advanced Studies in Theoretical Physics, 2021
Probability theory admits various interpretations (frequentist, subjective, axiomatic, Bayesian, logical, etc.) which exhibit profound differences, while a definitive, comprehensive theoretical frame does not appear on the horizon. The quantum universe is intrinsically probabilistic, and Karl Popper repeatedly underscored the importance of the foundations of probability as a prerequisite to quantum mechanics (QM). This research has followed Popper’s lesson. The first step went through the probability foundations; here the second step illustrates how those theoretical results apply to QM. In particular, this paper is arranged as follows. Section 2 recalls some definitions and theorems about probability that we have published. Section 3 derives the definitions of particles and waves from the concept of probabilistic outcome. Section 4 puts forward a new scheme for the wave collapse and the measurement problem. Section 5 discusses physical experiments supporting the achievements infe...
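To make the phrase "probabilistic outcome" concrete, the toy sketch below (an illustration under our own assumptions, not the paper's scheme; the array p_before is invented for this example) models "collapse" as conditioning a probability distribution on the recorded outcome.

```python
# Toy sketch (not the paper's scheme): collapse viewed as updating a
# probability distribution once a single outcome has been recorded.
import numpy as np

rng = np.random.default_rng(1)

# Pre-measurement distribution over 4 possible detector outcomes,
# e.g. |amplitude|^2 weights (values here are purely illustrative).
p_before = np.array([0.1, 0.4, 0.3, 0.2])

outcome = rng.choice(len(p_before), p=p_before)  # one probabilistic outcome

# Conditioning on the recorded outcome: all probability mass moves to it.
p_after = np.zeros_like(p_before)
p_after[outcome] = 1.0

print("outcome:", outcome)
print("before:", p_before, "after:", p_after)
```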
Mathematics
The link between classical and quantum theories is discussed in terms of extensional and intensional viewpoints. The paper aims to bring evidence that classical and quantum probabilities are related by intensionalization, which means that by abandoning sets from classical probability one should obtain quantum theory. Unlike the extensional concept of a set, the intensional probability is attributed to the quantum ensemble, which is contextually dependent. This contextuality offers a consistent treatment of the measurement problem, which should require the existence of a time operator. Brouwer's time continuum satisfies such a requirement, which makes it fundamental to mathematical physics. The statistical model it provides has been proven tremendously useful in a variety of applications.
From probabilistic mechanics to quantum theory
Quantum Studies: Mathematics and Foundations, 2019
We show that quantum theory (QT) is a substructure of classical probabilistic physics. The central quantity of the classical theory is Hamilton's function, which determines canonical equations, a corresponding flow, and a Liouville equation for a probability density. We extend this theory in two respects: (1) The same structure is defined for arbitrary observables. Thus, we have all of the above entities generated not only by Hamilton's function but also by every observable. (2) We introduce for each observable a phase space function representing the classical action. This is a redundant quantity in a classical context but indispensable for the transition to QT. The basic equations of the resulting theory take a "quantum-like" form, which allows for a simple derivation of QT by means of a projection to configuration space reported previously [Quantum Stud Math Found 5:219-227, 2018]. We obtain the most important relations of QT, namely the form of operators, Schrödinger's equation, eigenvalue equations, commutation relations, expectation values, and Born's rule. Implications for the interpretation of QT are discussed, as well as an alternative projection method allowing for a derivation of spin.
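The classical structures named here take their standard form (textbook equations, quoted for orientation; the paper's novelty lies in generating analogous flows for arbitrary observables and adding an action-like phase-space function):

```latex
% Canonical equations generated by Hamilton's function H(q, p):
\[
  \dot{q}_k \;=\; \frac{\partial H}{\partial p_k}, \qquad
  \dot{p}_k \;=\; -\,\frac{\partial H}{\partial q_k},
\]
% and the Liouville equation for the probability density \rho(q, p, t)
% transported by the corresponding flow:
\[
  \frac{\partial \rho}{\partial t}
  + \sum_k \left(
      \frac{\partial \rho}{\partial q_k}\frac{\partial H}{\partial p_k}
      - \frac{\partial \rho}{\partial p_k}\frac{\partial H}{\partial q_k}
    \right) \;=\; 0 .
\]
```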
Structure of Probabilistic Information and Quantum Laws
2001
The acquisition and representation of basic experimental information under the probabilistic paradigm is analysed. The multinomial probability distribution is identified as governing all scientific data collection, at least in principle. For this distribution there exist unique random variables, whose standard deviation becomes asymptotically invariant of physical conditions. Representing all information by means of such random variables gives the quantum mechanical probability amplitude and a real alternative. For predictions, the linear evolution law (Schrödinger or Dirac equation) turns out to be the only way to extend the invariance property of the standard deviation to the predicted quantities. This indicates that quantum theory originates in the structure of gaining pure, probabilistic information, without any mechanical underpinning.
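As background (standard statistics, not necessarily the author's exact construction), the multinomial law governing N counts over K outcomes, together with a familiar example of a statistic whose spread becomes asymptotically independent of the outcome probabilities, reads:

```latex
% Multinomial law for N counts n_1, ..., n_K over K outcomes with
% probabilities p_1, ..., p_K:
\[
  P(n_1, \dots, n_K) \;=\; \frac{N!}{n_1! \cdots n_K!}\,
  p_1^{\,n_1} \cdots p_K^{\,n_K}, \qquad \sum_k n_k = N .
\]
% A familiar example of a statistic with asymptotically invariant spread is
% the variance-stabilized square-root frequency: for large N,
\[
  \chi_k \;=\; \arcsin\!\sqrt{n_k/N}
  \qquad \text{has} \qquad
  \sigma(\chi_k) \;\approx\; \frac{1}{2\sqrt{N}},
\]
% independently of p_k, which is the kind of invariance of the standard
% deviation that the abstract refers to.
```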