
Fuzziness is all

"As far as the propositions of mathematics refer to reality, they are not certain; and as far as they are certain, they do not refer to reality."
-Albert Einstein, Geometry and Experience, 1921

Einstein's quantum photoelectric effect. Attrib: Feitscherg, CC-BY-3.0.

Determinism vs. quantum mechanics

Alongside Newton’s powerful physical model of the universe came a growing belief that the universe was in principle deterministic: that the rules by which it behaved could be discovered and modeled, were repeatable, and could in principle be exactly or absolutely determined. Absolute determinism came under serious question with the advent of subatomic physics at the start of the 20th century, more or less collapsing in the face of problems insoluble with the physics of Newton and Maxwell and explicable only by the new quantum mechanics, which posits that natural phenomena can be modeled at the highest attainable precision only with explicitly probabilistic models, that is, by building a modicum of fuzziness into the models themselves.

Determinism and Newton

Newton's Law of Universal Gravitation. Attrib: User:Dna-Dennis, CC-BY-3.0.

My first forays into physics in my teenage years brought me face to face with the certainties of Newtonian mechanics, which described, among other things, dynamical relations between forces and objects in no uncertain terms, as exemplified by the Law of Universal Gravitation. When we as students tried to carefully verify these august relations with simple experiments, measuring phenomena with crude instruments like rulers and stopwatches, we invariably found that our results differed from those predicted by Newton, often by a jarringly wide margin. Our teacher helped us to understand that there were limits to the accuracy of our measuring instruments, limits to our eyes’ ability to ascertain the precise alignment of those instruments, flaws in the construction or calibration of the instruments, small but significant delays between when our minds told us to stop the stopwatch and when we finally managed to do so, and so on. Our teacher sometimes talked about “systematic” and “random” errors as further obstacles to obtaining experimental agreement with our Newtonian equations of motion and force.
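
In its compact form, the law states that the attractive force between two masses m1 and m2 separated by a distance r is

\[
F = G\,\frac{m_1 m_2}{r^2},
\]

where G is the gravitational constant.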

The emphasis was typically on “idealized” models that asymptotically approached the measured behavior under limited and simplified conditions. The Ideal Gas Law, PV = nRT, for example, becomes more accurate the lower the density of the gas and the higher the temperature T. That is, as the particles spread farther apart and move too quickly for their mutual attractions to matter, there are almost no complicating interactions between the particles, and the simple law describes the state of the gas more accurately, with perfect ideality approached only as the gas thins away toward nothing at all. This understanding was underpinned by the aforementioned concept of determinism, which suggested that if the initial conditions of a dynamic system were completely and precisely known, then the state of the system could be completely and accurately known for any time in the future, that is, fully determined. It would seem that complete accuracy came only when there was nothing left to measure.
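
One conventional way to see this limiting behavior is the van der Waals equation of state, which adds correction terms to the ideal gas law that fade as the gas becomes more dilute:

\[
\left(P + \frac{a\,n^2}{V^2}\right)\left(V - nb\right) = nRT
\]

Here a accounts for the mutual attraction of the particles and b for their finite size; as the density n/V approaches zero, both corrections vanish and the simple ideal law is recovered.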

Enter probability

Schrödinger wave equation. Attrib: Oregon Scribbler.

"The uncertainty principle describes an inherent fuzziness that must exist in any attempt to describe nature. Our most precise description of nature must be in terms of probabilities."
-Richard Feynman, Lectures on Physics, I, 6-10

Later in our science education, as we moved into the subatomic world, we were introduced to the quantum mechanical models of physical phenomena. These models were at root non-deterministic: built entirely on probabilities and subject to a fundamental limit on how precisely certain pairs of quantities can be known simultaneously, as described by the Heisenberg Uncertainty Principle. The Schrödinger wave equation was used to calculate the probabilities of particular observable phenomena, like the position or momentum of an electron.
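
The standard formulation makes this explicit: the wave function Ψ evolves according to the Schrödinger equation, its squared magnitude gives the probability of finding the particle at a given position (the Born rule), and the Heisenberg relation bounds how sharply position and momentum can be jointly specified:

\[
i\hbar\,\frac{\partial \Psi(x,t)}{\partial t} = \hat{H}\,\Psi(x,t),
\qquad
P(x,t) = \lvert \Psi(x,t) \rvert^{2},
\qquad
\Delta x\,\Delta p \ge \frac{\hbar}{2}
\]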

Experiments are inexact

Regardless of the chosen point of view, it would seem that natural phenomena can be modeled, at best, only to the limits of the means of measurement. Those frustratingly imprecise student experiments turned out to be more real epistemologically than the idea that the models we were testing were in principle absolute and deterministic. All of this raises some basic questions: At what time and in what place has a perfectly accurate measurement ever been made? Has a perfect instrument ever been created that is even capable of such a thing, whether the model being tested is considered deterministic or probabilistic? Any less-than-perfect measurement unavoidably introduces experimental error, which must be evaluated in probabilistic terms. So whether a model is described as deterministic or as probabilistic, all measurements of observed phenomena carry some uncertainty, and the fit of those observations to what the model predicted must be assessed on a probabilistic basis.
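
As a small illustration of how such error is handled in practice, here is a minimal sketch in Python, using made-up stopwatch timings of the sort a student might record when estimating g from a dropped object; the numbers are hypothetical, and the point is only that the result is reported as a mean with an uncertainty rather than as a single exact value:

```python
import math

# Hypothetical stopwatch timings, in seconds, for a mass dropped from 2.0 m.
# These numbers are illustrative only, not real data.
drop_height_m = 2.0
timings_s = [0.68, 0.61, 0.66, 0.71, 0.63, 0.65, 0.69, 0.62]

# Each timing yields an estimate of g via h = (1/2) g t^2  =>  g = 2h / t^2.
g_estimates = [2.0 * drop_height_m / t ** 2 for t in timings_s]

# Summarize the scatter probabilistically: a mean and a standard error.
n = len(g_estimates)
mean_g = sum(g_estimates) / n
sample_variance = sum((g - mean_g) ** 2 for g in g_estimates) / (n - 1)
standard_error = math.sqrt(sample_variance / n)

print(f"g = {mean_g:.2f} +/- {standard_error:.2f} m/s^2 (accepted value ~ 9.81)")
```

The "+/-" is the whole point: even a tabletop measurement is stated as a probability-weighted range, not a single deterministic number.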

Dynamic models are not absolute

Infinitesimal calculus: Successively smaller rectangles bring greater accuracy; infinite rectangles bring perfect accuracy. Attrib: Kalid Azad, CC-BY-SA-NC.

This problem of inexactness extends to the very mathematical models employed by science. These models of physical phenomena are constructed largely from the calculus, whether the underlying model is described as deterministic, like Newton’s, Maxwell’s and Einstein’s, or as probabilistic in modern quantum mechanics. The very foundations of the calculus require an infinite number of incremental steps to achieve a perfect fit to a curve or an area; anything less than that infinite number produces less than a perfectly accurate fit. Since infinities are physically impossible to achieve or measure, any model constructed with the calculus cannot be absolutely determined.
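
A few lines of Python make the point tangible: approximating the area under y = x² between 0 and 1 with rectangles (the exact area is 1/3), the error shrinks as the rectangles multiply, but it never quite reaches zero for any finite count:

```python
# Left-endpoint Riemann sum: approximate the area under f on [a, b] with n rectangles.
def riemann_sum(f, a, b, n):
    width = (b - a) / n
    return sum(f(a + i * width) for i in range(n)) * width

exact = 1.0 / 3.0  # the true area under y = x^2 from 0 to 1
for n in (10, 100, 1_000, 10_000):
    approx = riemann_sum(lambda x: x * x, 0.0, 1.0, n)
    print(f"n = {n:>6}  approximation = {approx:.6f}  error = {exact - approx:.2e}")
```

Only in the limit of infinitely many rectangles does the error vanish, and that is precisely the limit no physical computation or measurement can reach.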

Charles van Doren captures the essence of this point: "The solution of an integration or of a differential equation is never absolutely accurate, but it can always be made as accurate as you please. ... In dealing with the physical world, mathematics gives up the absolute precision that it enjoys in pure mathematical spaces, in elementary geometrical proofs, for example, where circles are absolutely circular, lines absolutely straight, and so forth. Reality is always slightly fuzzy. ... The ability to adjust its accuracy is the reason why Newton's calculus works so well in the macrocosm. The tiny inherent inaccuracy of calculus, never perfectly accurate, causes difficulties when you are dealing with the tiny world of atoms, nuclei, and nuclear particles, where solutions may be wide of the mark." (A History of Knowledge, pp 347-48) 

Causation is elusive

"The rejection of determinism is in no sense an abdication of scientific method. It is rather the fruition of a scientific method which had grown up under the shelter of the old causal method and has now been found to have a wider range. It has greatly increased the power and precision of the mathematical theory of observed phenomena. ... The denial of determinism, or as it is often called 'the law of causality', does not mean that it is denied that effects may proceed from causes."
-Arthur Eddington, New Pathways in Science, "The Decline of Determinism", p. 65

Some historians and philosophers of science have bemoaned the loss of determinism almost as a loss of possibility, but practicing scientists have always operated probabilistically, even when they chose a conceptual framework of deterministic certainty, so this shift to probabilistic physical models is not ultimately new to anyone in science. Scientific theories are generally abstractions which describe a reality that is mostly beyond direct experience and which approach some ultimately incomplete level of understanding. The developers of quantum mechanics made great strides in part because they, reluctantly, abandoned the pretense of deep explanations of causation and focused on mathematically modeling observable and measurable phenomena. This approach has been extraordinarily fruitful: QED, the theory of quantum electrodynamics, which describes the interactions of light and charged particles such as electrons, produces by far the most precise and accurate predictions of any major physical theory.

Conceptual models that provide some level of explanation of the cause of physical phenomena, at any rate, all share the limitation that they are bottomless. Newton understood, for example, that his Law of Universal Gravitation provided a very accurate model of the behavior of two bodies acting on each other with a mutually attractive force of gravity, yet he acknowledged that he had no idea what produced that force. No matter what level of explanation is provided, it can always be legitimately asked: if B causes A, then what causes B? Physicists plumbing the depths of the material universe understand this, and have to be content with a philosophy of knowledge that is limited not only in what is measurable but in what can be conceptually modeled.

Principle of Tolerance

It seems fair to say that our knowledge of the physical world will always be less than absolute due to inherent limitations in our ability to observe, instrument, measure, conceive, model and communicate. Jacob Bronowski suggested that "We cannot ask the world to be exact. ... The Principle of Uncertainty is a bad name. In science or outside it, we are not uncertain; our knowledge is merely confined within a certain tolerance. We should call it the Principle of Tolerance." (The Ascent of Man, pp. 277-78) It would seem that our predictive models of the material world have always been probabilistically based, whether the probabilities were built into the models, or determined after the fact by analysis of the inevitable errors or limits of measurements, or both. Bronowski’s Principle of Tolerance conceptually rephrases the Uncertainty Principle to suggest that any conclusions about nature can only occur within the tolerances, or limits, of deviations from an absolute fit to abstract mathematical models, reflecting the reality that even theoretical physics is bounded by engineering practicalities.

Determinism as a Faustian bargain

Faust mit Homunculus, Mephistopheles. PD-US.

Perhaps determinism is a philosophical sleight of hand, an illusion that complete understanding is attainable, or a deep and irrational hope for more understanding than is evidently, or evidentiarily, possible. Goethe’s Faust began his dance with the Devil after abandoning his quest to understand everything and turning to magic as the means to attain infinite knowledge. Substituting absolute determinism for magic in Faust’s story does not change its dynamics, as neither provides an ultimate pathway to understanding.

Accepting Goethe’s depiction of Faust as a cautionary tale of the consequences of overreaching can be a liberating impulse: Acknowledging the inherent fuzziness of knowledge clears the mind of the unattainable and opens it to useful possibilities within attainable limits.  Yet these limits are themselves fuzzy. Knowledge remains a humbling and tantalizing puzzle for which curious souls will continue to attempt solutions.
