Bob Doyle | Harvard University
Drafts by Bob Doyle
The path information required for microscopic reversibility of particle paths is destroyed or erased by local interactions with radiation and other particles. Ludwig Boltzmann's dynamical H-theorem (his 1872 Stosszahlansatz) correctly predicts the approach to equilibrium. But this apparent increase in entropy can be reversed, according to Josef Loschmidt's time-reversibility objection and Ernst Zermelo's recurrence objection. We show that the addition of electromagnetic radiation adds an irreducible element of randomness to atomic and molecular motions, erasing classical path information, just as the addition of a small speck of material can thermalize a non-equilibrium radiation field. Path erasure prevents reversibility and maintains a high-entropy state indefinitely. Statistical fluctuations from equilibrium are damped by path erasure. Photon emission and absorption during molecular collisions are shown to destroy nonlocal molecular correlations, justifying Boltzmann's assumption of "molecular chaos" (molekular ungeordnete) as well as Maxwell's earlier assumption that molecular velocities are not correlated. These molecular correlations were retained in Willard Gibbs's formulation of entropy. But the microscopic information implicit in classical particle paths (which would be needed to implement Loschmidt's deterministic motion reversal) is actually erased, justifying what N. G. van Kampen calls a "repeated randomness" assumption. Boltzmann's physical insight was correct that his increased entropy is irreversible. It has been argued that photon interactions can be ignored because radiation is isotropic and thus there is no net momentum transfer to the particles.
The radiation distribution, like the distribution of particles, is indeed statistically isotropic, but, as we show, each discrete quantum of angular momentum exchanged during individual photon collisions alters the classical paths sufficiently to destroy molecular velocity correlations. Path erasure is a strong function of temperature, pressure, and the atomic and molecular species of the gas. We calculate path erasure times over a range of conditions, from standard temperature and pressure to the extreme low densities and temperatures of the intergalactic medium. Reversibility is closely related to the maintenance of path information forward in time that is required to assert that physics is deterministic. Indeterministic interactions between matter and radiation erase that path information. The elementary process of the emission of radiation is not time reversible, as first noted by Einstein in 1909. Macroscopic physics is only statistically determined. Macroscopic processes are adequately determined when the mass m of an object is large compared to the Planck quantum of action h (when there are large numbers of quantum particles). But the information-destroying elementary processes of emission and absorption of radiation ensure that macroscopic processes are not reversible.
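The tension between reversible microdynamics and Boltzmann-style relaxation can be made concrete with Mark Kac's ring model. The sketch below is only a minimal illustration, not a calculation from the paper (the site count, marking fraction, and step count are arbitrary choices): the microscopic dynamics is exactly deterministic, reversible, and recurrent with period 2N, yet the macroscopic color excess decays just as a Stosszahlansatz-style argument predicts.

```python
import random

def kac_ring_step(colors, marked):
    """Advance the Kac ring one step: each ball moves one site
    clockwise and flips color when it crosses a marked edge.
    The map is deterministic, reversible, and periodic (period 2N)."""
    n = len(colors)
    new = [0] * n
    for i in range(n):
        new[(i + 1) % n] = -colors[i] if marked[i] else colors[i]
    return new

def delta(colors):
    """Macroscopic observable: net color excess per site."""
    return sum(colors) / len(colors)

random.seed(0)
N = 1000
MU = 0.1                                   # fraction of marked edges
marked = [random.random() < MU for _ in range(N)]
colors = [1] * N                           # far from equilibrium: all +1

history = [delta(colors)]
for _ in range(20):
    colors = kac_ring_step(colors, marked)
    history.append(delta(colors))
# A molecular-chaos-style argument predicts delta(t) ~ (1 - 2*MU)**t,
# a monotone decay toward equilibrium -- even though the exact dynamics
# recurs after 2N steps (Zermelo) and is undone by running the ring
# backward (Loschmidt).
```

The interest of the model is exactly the objection the abstract addresses: the coarse-grained observable relaxes monotonically while the underlying motion remains fully reversible and recurrent.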
Papers by Bob Doyle
A LONG-RANGE PROGRAM IN SPACE ASTRONOMY, 1969
The Astronomy Missions Board was established by the National
Aeronautics and Space Administration by charter in September
1967 to assist in an advisory capacity in the planning and conduct
of all NASA missions to create and operate astronomical experiments in space. The scope of the Board’s activities includes:
development and review of the scientific objectives and general
strategy for space astronomy and associated ground-based astronomy; the formulation of guidelines and specific recommendations
for the design of space astronomy missions, and for the various
experiments and auxiliary equipment to be developed and used on
these missions; the continuing examination of policies relating
to the operation of these space observatories once they have been
made operational and are available for observations by the scientific community. The work of the Board encompasses the many
aspects of space astronomy, including direct observations of electromagnetic radiation from astronomical sources and of cosmic-ray particles, and the supporting research that is necessary, but its scope
does not include the study of the Moon and planets from a close
vantage point or the study of the Earth.
J. Quantitative Spectroscopy and Radiative Transfer, 1969
A quantal calculation of the continuous absorption coefficient of the hydrogen quasi-molecule for the transition 1sσ2sσ ³Σg⁺ → 1sσ2pσ ³Σu⁺ is described. The calculation includes the explicit dependence of the matrix element of the electronic dipole transition moment on the rotational state of the molecule. The detailed summation of the transition probability over all rotational states, for temperatures at which several states are populated, differs significantly from the probability given by the contribution of the rotationless (J′ = 0) state multiplied by the rotational partition function. The difference is larger than the errors resulting from the delta-function approximation to the continuum wave functions used in previously published calculations of this absorption coefficient.
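The point about the detailed rotational summation can be illustrated with a toy calculation. In the sketch below, the J-dependence of the squared transition moment and the value of B/kT are invented illustrative numbers, not the paper's computed H₂ values; the sketch only shows why a thermally weighted sum over J differs from the rotationless value multiplied by the rotational partition function whenever the moment depends on J.

```python
import math

def moment_sq(J):
    """Hypothetical J-dependence of the squared transition moment --
    purely illustrative, not the computed molecular values."""
    return 1.0 + 0.05 * J * (J + 1)

def weight(J, b):
    """Boltzmann population weight of rotational level J, with b = B/kT."""
    return (2 * J + 1) * math.exp(-b * J * (J + 1))

b = 0.2                        # rotational constant over kT (assumed)
Js = range(30)
Z = sum(weight(J, b) for J in Js)                      # partition function
detailed = sum(weight(J, b) * moment_sq(J) for J in Js)
approx = moment_sq(0) * Z      # rotationless (J = 0) value times Z
# The two coincide only if moment_sq is independent of J; any
# J-dependence makes the detailed sum differ, which is the effect
# the abstract describes.
```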
Is Science Compatible with Free Will?, 2013
Random noise in the neurobiology of animals allows for the generation of alternative possibilities for action. In lower animals, this shows up as behavioral freedom. Animals are not causally predetermined by prior events going back in a causal chain to the origin of the universe. In higher animals, randomness can be consciously invoked to generate surprising new behaviors. In humans, creative new ideas can be critically evaluated and deliberated. On reflection, options can be rejected and sent back for "second thoughts" before a final responsible decision and action. When the indeterminism is limited to the early stage of a mental decision, the later decision itself can be described as adequately determined. This is called the two-stage model, first the "free" generation of ideas, then an adequately determined evaluation and selection process we call "will."
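The two-stage mechanism described here has a simple algorithmic shape: random generation of alternatives followed by a determined evaluation and selection. The sketch below is only a toy illustration of that shape; the option space and value function are invented for the example.

```python
import random

def two_stage_decision(generate, evaluate, n_options=5, rng=None):
    """Toy sketch of the two-stage model: a 'free' (random) generation
    stage produces alternative possibilities, then an adequately
    determined evaluation stage ('will') selects among them."""
    rng = rng or random.Random()
    options = [generate(rng) for _ in range(n_options)]   # stage 1: chance
    return max(options, key=evaluate)                     # stage 2: choice

# Illustrative option space and value function (both made up):
# alternatives are numbers, and the agent's values prefer numbers near 7.
choice = two_stage_decision(
    generate=lambda r: r.uniform(0, 10),
    evaluate=lambda x: -abs(x - 7),
    rng=random.Random(42),
)
```

Given the same generated alternatives, the selection stage is deterministic; the indeterminism is confined to the generation stage, matching the "first chance, then choice" description.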
BioScience, 2012
A review of Terrence W. Deacon, Incomplete Nature: How Mind Emerged from Matter.
William James Stud, 2010
Research into two-stage models of “free will” – first “free” random generation of alternative possibilities, followed by “willed” adequately determined decisions consistent with character, values, and desires – suggests that William James was in 1884 the first of a dozen philosophers and scientists to propose such a two-stage model for free will. We review the later work to establish James’s priority.
By limiting chance to the generation of alternative possibilities, James was the first to overcome the standard two-part argument against free will, i.e., that the will is either determined or random. James gave it elements of both, to establish freedom but preserve responsibility. We show that James was influenced by Darwin’s model of natural selection, as were most recent thinkers with a two-stage model.
In view of James’s famous decision to make his first act of freedom a choice to believe that his will is free, it is most fitting to celebrate James’s priority in the free will debates by naming the two-stage model (first chance, then choice) “Jamesian” free will.
Books by Bob Doyle
My God, He Plays Dice! How Albert Einstein Invented Most Of Quantum Mechanics, 2019
Is it possible that the most famous critic of quantum mechanics actually
invented most of its fundamentally important concepts?
In his 1905 Brownian motion paper, Einstein quantized matter,
proving the existence of atoms. His light-quantum hypothesis showed
that energy itself comes in particles (photons). He showed energy and
matter are interchangeable, E = mc². In 1905 Einstein was first to see
nonlocality and instantaneous action-at-a-distance. In 1907 he saw
quantum “jumps” between energy levels in matter, six years before
Bohr postulated them in his atomic model. Einstein saw wave-particle
duality and the “collapse” of the wave in 1909. And in 1916 his transition
probabilities for emission and absorption processes introduced ontological chance when matter and radiation interact, making quantum
mechanics statistical. He discovered the indistinguishability and
odd quantum statistics of elementary particles in 1925 and in 1935
speculated about the nonseparability of interacting identical particles.
It took physicists over twenty years to accept Einstein’s light-quantum.
He explained the relation of particles to waves fifteen years before
Heisenberg matrices and Schrödinger wave functions. He saw
indeterminism ten years before the uncertainty principle. And he saw
nonlocality as early as 1905, presenting it formally in 1927, but was
ignored. In the 1935 Einstein-Podolsky-Rosen paper, he explored nonseparability, which was dubbed “entanglement” by Schrödinger. EPR
has gone from being ignored to becoming Einstein’s most cited work:
the basis for today’s “second revolution in quantum mechanics.”
In a radical revision of the history of quantum physics, Bob Doyle
develops Einstein’s idea of objective reality to resolve several of
today’s most puzzling quantum mysteries, including the two-slit
experiment, quantum entanglement, and microscopic irreversibility.
Metaphysics: Problems, Paradoxes, and Puzzles, Solved?, 2016
Great Problems in Philosophy (and Physics) Solved? , 2016
A survey of several popular textbooks on philosophy produces a remarkable consensus on the problems facing philosophers from ancient to modern times. They typically include metaphysics (what is there?), the problem of knowledge (how do we know what exists?), the mind/body problem (can an immaterial mind move the material body?), the “hard problem” of consciousness, freedom of the will, theories of ethics (is there an objective universal Good?), and problems from theology (does God exist? is God responsible for evil?).
This book introduces the Information Philosopher website, a work in progress on these classic questions in philosophy that logical positivists and analytic language philosophers thought they could dis-solve as logical puzzles, pseudo-problems, or conceptual errors.
Information philosophy is a new philosophical methodology that
goes “beyond logic and language” to the underlying information structures being created in the cosmos, in the world, in biological information-processing systems, and in the human mind - structures without which logic, language, and science would be impossible.
According to Bob Doyle, it is a scandal that academic philosophers are convincing young students, against their common sense, that mind, consciousness, free will, values, even the external world, do not exist.
To end the scandal, philosophers need to examine a new method of philosophizing, based not on language but on information. The cosmic creation process that formed the galaxies, stars, and planets, that led to life and to the evolution of the information-processing minds that created language and logic, is the process that creates objective value.
Free Will: The Scandal in Philosophy, 2011
John Searle calls it something of a scandal that after all the centuries
of writing about free will, we have not made very much progress.
Bob Doyle surveys the centuries, recounting the many different forms
of determinism that have been used to deny human freedom and
responsibility. Even many defenders of free will think that it remains
a metaphysical mystery, one that cannot be simply explained by basing
it on other unintelligible mysteries such as quantum mechanics, or
making it an equally mysterious gift of God.
This book is an introduction to the Freedom section of the
Information Philosopher website, a work in progress on some
classical questions in philosophy that 20th-century logical positivists
and analytic language philosophers dis-solved as pseudo-problems.
Information philosophy is a new philosophical methodology that
goes “beyond logic and language” to the underlying information
structures in the cosmos, in the world, in biological systems, and
in the human mind - structures without which logic, language, and
science would be impossible.
According to Doyle, the more serious scandal today is that academic
philosophers are convincing many young students that they are
biological machines whose actions are completely determined.
To end the scandal, philosophers need to teach a two-stage model of
free will and creativity, one that Doyle finds in the work of a dozen
philosophers and scientists going back to William James in 1884.
Doyle’s Cogito model of the mind treats human beings as an essential
part of a cosmic creation process that creates objective value.
Websites by Bob Doyle
Videos by Bob Doyle
Conference Presentations by Bob Doyle