**Quantum Mechanics as Generalised Theory of Probabilities**

*Michel Bitbol*

First, quantum mechanics is not a physical theory that happens to make use of probability calculus; it is *itself* a generalised form of probability calculus, coupled with a procedure for evaluating probabilities by way of its controlled usage of symmetries. Secondly, quantum mechanics does not have merely a predictive function like other physical theories; it *consists* in a formalisation of the conditions of possibility of any prediction bearing upon phenomena whose circumstances of detection are also conditions of *production*.

[...]

What conditions permitted the collective elaboration, from the seventeenth century onward, of probability calculus? Ian Hacking has furnished an extensive list of such conditions [*The Emergence of Probability*], but he insists upon one in particular. This crucial condition is the development, in the sixteenth century, of sciences of *signs* or of *secondary qualities*.

[...]

As Heisenberg wrote, quantum physics confronts a situation where *even* the spatio-kinematic variables of position and momentum, which were considered at the time of Descartes and Locke as direct and 'primary', must be taken as indirect manifestations, relative to an instrumental context—in short, as secondary.

[...]

If man must content himself, according to Pascal, with 'perceiving some appearances from the middle of things, in an eternal despair of knowing either their beginning or their end', he cannot denigrate the appearances in favour of an ungraspable backworld governed by principles. Man must learn to inhabit his milieu; he must know how to focus his attention upon the play of his experimental manipulations and the phenomena that result from them; he must admit the inconsistency of cutting up the world into separate and intrinsically-existing objects, since phenomena are so tied one to another that it is impossible to know how to grasp one without grasping all; he must understand, also, that no cognition can free itself from the nexus of interrelations, but can only situate itself within it, remaining cognizant of the perspective from which it derives.

[...]

The image of the perturbation of the object by the measuring agent [...] begins by bringing into play a universe of objects endowed with primary spatial and kinematic qualities, and then invokes their mutual alteration in order subsequently to justify setting aside the concept of primary quality and generalising that of secondary quality. In this image, then, one puts forward the representation of a universe of figures and movements, with the sole aim of demonstrating its inanity, or (what comes down to the same thing, for a verificationist epistemology) its in-principle inaccessibility.

[...]

In hidden variable theories, the determinist stance does indeed seem to have been lost, even at the level of its epistemological fecundity. The determinist stance was only fruitful because it compelled researchers to conceive of networks of univocal bonds underlying phenomena, to design the type of experiment that would allow these bonds to be brought to light, and thus to define often unprecedented classes of phenomena. [...] Once the reciprocal current of information between the determinist project and the definition of new domains of experimentation dries up, the attempt to pursue this project formally becomes nothing more than a *jeu d'esprit* whose principal (if not sole) interest is its serving as an intellectual stimulant for specialists in the foundations of modern physics.

This situation does not justify, for all that, the inverse excess—namely, indeterminist dogmatism. All one is within one's rights to observe is that henceforth, in the physical sciences, the advantage of epistemological fruitfulness will belong to the stance that consists in maximally developing predictive capacity to the detriment of descriptive ambition, the calculus of probabilities rather than determinist models of evolution.

It is true that many thinkers do not stop there; they tend to extrapolate the epistemological observation of the fecundity of the indeterminist option into an ontological affirmation of the intrinsically stochastic character of the laws governing the world. But their position is easily acceptable on the methodological plane, without it being necessary to follow them in the metaphysical aspects of their conclusions.

[...]

To *each* experimental context is associated a scale of possible determinations and a scale of attributive propositions which belong to a classical, Boolean sublogic; and to each determination chosen from the set of possible determinations corresponding to *a* given context can be attached a real number that obeys Kolmogorov's axioms of probability. But these sublogics and these probabilistic substructures cannot be *fused together*, for they depend on distinct contexts that cannot, in general, be conjoined. Under such conditions, we seek to *articulate* them with each other, respectively in the framework of a metalogic and a metacontextual probabilistic formalism. What is remarkable is that when one constructs such a metalogic, taking account *only* of the impossibility of conjoining the diverse scales of possibilities, one arrives at structures isomorphic with the celebrated nondistributive 'quantum logic' of Birkhoff and von Neumann.
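The nondistributivity in question can be made concrete with a toy model. The following sketch is my illustration, not Bitbol's: it takes the propositions to be the subspaces of a two-dimensional real state space, in the manner of Birkhoff and von Neumann, with 'and' as intersection and 'or' as span, and exhibits the failure of the distributive law.

```python
# Toy model of a Birkhoff-von Neumann lattice for a two-dimensional
# state space: propositions are the subspaces of the real plane R^2,
# represented as '0' (zero subspace), '1' (whole plane), or a line
# through the origin, identified by its angle in [0, pi).

import math

ZERO, ONE = "0", "1"

def meet(a, b):
    """Greatest lower bound: the intersection of two subspaces."""
    if a == ONE: return b
    if b == ONE: return a
    if a == ZERO or b == ZERO: return ZERO
    return a if math.isclose(a, b) else ZERO   # distinct lines meet only at 0

def join(a, b):
    """Least upper bound: the span of the union of two subspaces."""
    if a == ZERO: return b
    if b == ZERO: return a
    if a == ONE or b == ONE: return ONE
    return a if math.isclose(a, b) else ONE    # two distinct lines span the plane

# Three lines: the x-axis, the diagonal, the y-axis.
p, q, r = 0.0, math.pi / 4, math.pi / 2

lhs = meet(p, join(q, r))             # p AND (q OR r) = p AND plane = p
rhs = join(meet(p, q), meet(p, r))    # (p AND q) OR (p AND r) = 0 OR 0 = 0
print(lhs, rhs)                       # the two sides differ: distributivity fails
```

Each individual line, together with 0 and 1, still forms a Boolean sublattice; it is only when propositions from distinct lines are combined that distributivity is lost, mirroring the impossibility of conjoining distinct contexts.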

[...]

In its function as a theory-framework, quantum mechanics is consequently nothing less than *a metacontextual form of probability theory*. It brings together the conditions of possibility of a *unified* system of probabilistic prediction bearing upon phenomena inseparable from sometimes incompatible contexts.
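A minimal sketch of what this unification amounts to, using the standard Born rule rather than anything specific to the text: a single state vector yields, in each of two incompatible contexts (here spin measured along Z or along X), a distribution that is perfectly Kolmogorovian on its own, even though no joint distribution over both contexts is defined.

```python
# One qubit state vector, evaluated in two incompatible measurement
# contexts via the Born rule (real amplitudes suffice for this sketch).

import math

# State |psi> = cos(t)|0> + sin(t)|1>, a superposition in the Z basis.
t = math.pi / 8
psi = (math.cos(t), math.sin(t))

def born(state, basis):
    """Born rule: probability of each outcome = |<basis_vec|state>|^2."""
    probs = []
    for vec in basis:
        amp = vec[0] * state[0] + vec[1] * state[1]
        probs.append(amp ** 2)
    return probs

z_basis = [(1.0, 0.0), (0.0, 1.0)]      # context Z
s = 1 / math.sqrt(2)
x_basis = [(s, s), (s, -s)]             # context X

pz = born(psi, z_basis)
px = born(psi, x_basis)

# Each context separately satisfies Kolmogorov's axioms (non-negative
# values summing to 1), but the two distributions cannot be fused into
# a single distribution over joint (Z, X) outcomes.
print(pz, sum(pz))
print(px, sum(px))
```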

[...]

In truth, *none* of the epistemological constraints exerted by the standard quantum mechanics of 1926 have been relaxed by contemporary varieties of quantum theory, and new constraints of a similar order have even been added to them. Whatever representations they may give rise to, current quantum theories *always* operate as generalised, metacontextual instruments of probabilistic prediction. And this stems from the fact that they are *always* confronted with phenomena inseparable from their context of manifestation.

[...]

As for state vectors in Fock space, they allow the calculation not only of the probability that this or that 'property' of a particle will manifest itself in a given experimental context, but also of the probability that a certain *number* of particles will be detected under the appropriate instrumental conditions. This number is itself treated as an *observable*, the set of whose possible values *under appropriate conditions of detection* is identified with the set of natural numbers. To the contextualisation of the *predicates* of objects typical of standard quantum mechanics, then, quantum field theory adds the contextualisation of the notion of the denumerable *bearers* of predicates.
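A standard textbook instance of the number observable's probabilistic treatment (my illustration, not drawn from the text): a coherent state in Fock space assigns to the number observable a Poisson distribution over the natural numbers, P(n) = e^(-|α|²) |α|^(2n) / n!.

```python
# Photon-number probabilities of a coherent state |alpha> in Fock space:
# the number observable takes values n = 0, 1, 2, ... with
#   P(n) = exp(-|alpha|^2) * |alpha|^(2n) / n!
# a Poisson law whose mean is |alpha|^2.

import math

def number_probs(alpha_sq, n_max):
    """P(detecting n quanta) for n = 0..n_max, given mean number alpha_sq."""
    return [math.exp(-alpha_sq) * alpha_sq ** n / math.factorial(n)
            for n in range(n_max + 1)]

probs = number_probs(alpha_sq=2.0, n_max=30)
print(sum(probs))   # close to 1: a Kolmogorov distribution over particle number
```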

That one must from now on hold the very concept of 'particles', and not only that of 'properties of a particle', to be relative to a context of manifestation is rendered particularly evident by the relativistic phenomenon of so-called 'Rindler particles'.

[...]

Every quantum theory contains an invariable element—a metacontextual form of probability theory—and a variable element—a set of symmetries. [...] As soon as one accepts that there is nothing more to be understood in quantum mechanics, a whole world of non-physical applications of the theory opens up, in game theory, perception theory, or linguistics. Conversely, the very success of these exotic applications testifies that quantum mechanics is indeed in its essence a metacontextual form of probability theory.

http://www.urbanomic.com/pub_collapse8.php