Chance in a theology of nature

An article sent to me by Eddie Robinson (forwarded from another scholar) gives me an occasion for commenting on chance in a more or less appropriate place in the loose series I’m developing on a theology of nature. This article is The Secularization of Chance: Toward Understanding the Impact of the Probability Revolution on Christian Belief in Divine Providence by Josh Reeves (available here, but only if you’re registered).

The basic thesis is that modern knowledge about probability has revealed ontological randomness in the universe, and therefore necessitates a revision of the traditional doctrines of providence: another case of science being supposed to have overturned naive historical Christian teaching:

Abstract. The modern ability to quantify chance has transformed ideas about the universe and human nature, separating Christians today from their predecessors, but has received little attention by Christian historians and theologians.

The article has received enough attention to be cited by Karl Giberson, the Open Theist Evolutionary Creationist, either because it reinforces his view of God or because it is the “evidence” that leads him to it. But apparently it has not been visible enough to be refuted for perpetuating the myth of probability as a cause, rather than an effect – a myth to which I’ve drawn repeated attention before, and no doubt will have to again.

He starts with an historical survey of the biblical and theological treatment of chance and providence, citing Augustine, Aquinas and John Calvin – though he could have got the same high view of providence from the Didache in the First Century or Arminius after Calvin. Reeves goes on to say why it all has to change after the nineteenth century and the “emergence of ontological chance” as a supposed reality:

Most scientists had assumed the laws of nature applied precisely and universally, and so once all the laws and forces of nature were discovered then physicists could predict the motions, at least in principle, of the material particles of which the universe consisted. The world is deterministic in classical physics because it acts in definite, predictable ways with no alternative outcomes.

The classical picture gave way over the nineteenth and twentieth centuries, with statistical mechanics as the “bridgehead” (Gigerenzer et al. 1989, 222). James Clerk Maxwell and Ludwig Boltzmann applied to molecules the same statistical techniques sociologists applied to human societies, allowing them to characterize the behavior of the system without a complete description of the physical state.

The death blow to the classical picture, however, was quantum mechanics, which suggested probability was a basic feature of the universe to most physicists. Particles of atomic or subatomic size do not act like the idealized billiard ball of classical mechanics.

In the orthodox interpretation of quantum mechanics, particle location is indeterminate; the act of measurement forces the particle into a determinate location, the probability of which can be predicted by physicists. Instead of the deterministic, mechanistic world of classical physics, modern physics appears to give a probabilistic one (Polkinghorne 2002, 25).

This neglects a number of important points. As regards statistical mechanics, Reeves passes over what Maxwell knew, that the inability to provide “a complete description of the physical state” was a statement about human uncertainty, not ontology. Not only did Maxwell believe the individual movements of molecules are determined by physical laws, but he at least once referred to such movements as the unknown individual decisions of God:

Would it not be more profound and feasible to determine the general constraints within which the deity must act than to track each event the divine will enacts?

To the Christian Maxwell, the statistical laws were an abstraction from lawlike and contingent reality both, and not a cause of it.

Regarding quantum laws, apart from the inaccurate description of uncertainty Reeves gives in the quote, my last piece on Heisenberg should demonstrate that, to him, quantum indeterminism was an artifact of that very erroneous belief in the “idealized billiard ball of classical mechanics.” Under the despised potentia of Aristotelian natures – ousted to make way for classical “billiard ball” physics – the dilemma, if not the uniqueness of the quantum level, largely disappears. And contrary to Reeves’s suggestion, Aristotelian metaphysics is just as comfortable with statistics as the mechanical philosophy is, only without the need to invoke the highly suspect idea of ontological chance. There are some clues in the article itself that should give this away. For example, in his section on the impact of probability on Christian belief:

Stable regularities in the social realm raise moral questions. As Daston explains, “How could the suicide of say, Goethe’s young Werther really be his own decision, if the suicide rates remained constant for decades on end?” (2008, 8).

Something ought to strike every reader as decidedly surreal here – the implication is that, multo mysterioso, it was the suicide rate that determined the suicide of Goethe’s Werther, rather than his own choice. How, exactly, is that supposed to work? Is suicide imposed on people by a probability law, and are attempts at prevention doomed, since the statistical rate is a given? Any fool should see that the problem has been stated back to front: in fact, the complex true causes of suicide, culminating in a disordered choice, may in large populations be seen to result in some pattern. The same blindness is shown even more obviously in an earlier passage, where Aristotle’s “failures” are discussed:

The mechanical worldview helped philosophers to see that the world runs through laws that could be characterized mathematically and to look for underlying regularities in nature. Whereas Aristotelian science encouraged philosophers to perceive each object in nature as having its own essential nature, the mechanical philosophy encouraged many to see the natural world as composed of homogenous matter. This conceptual change is needed for probability; to collect statistics of human societies requires the “belief in the existence of homogeneous categories of people to which the regularities apply” (Daston 2008, 7).

Once again, the implication is that “hidden laws” turn real people, making choices for real and demonstrable reasons – such as following fashions, bucking trends, obeying reason – into “homogeneous categories of people.” Whereas the truth is the reverse – because real choices are made by real people for intelligible reasons, en masse they may be reduced to intelligible mathematical abstractions that tell us what choices are more common in a particular context.

The “statistical laws” have no more determinative role in human behaviour than randomness has in your daily decisions at work – as is proved by the fact that as soon as you alter the actual causes (let’s say a politician turns out to be a rogue), the statistical pattern of voting will change. Only a complete dummkopf would believe that an alteration in ontological chance made the politician do dishonest things; yet people willingly believe it somehow turns them into statistical automata. A kind of phlogiston science is being perpetuated.
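The point can be illustrated with a toy simulation (the voters, the “loyalty” parameter and the size of the scandal penalty are all invented for illustration, not a model of real electorates): each simulated voter decides for individual reasons, yet the aggregate forms a stable statistical pattern – and when a real cause changes (the scandal), the pattern shifts with it.

```python
import random

def vote(loyalty, scandal):
    """One voter's reasoned choice: support the incumbent unless a
    scandal has eroded that voter's individual loyalty."""
    effective = loyalty - (0.4 if scandal else 0.0)
    return "incumbent" if random.random() < effective else "challenger"

def election(n_voters, scandal, seed=0):
    """Aggregate many individual choices into an incumbent vote share."""
    random.seed(seed)
    incumbent_votes = 0
    for _ in range(n_voters):
        loyalty = random.uniform(0.4, 0.9)  # each voter has their own reasons
        if vote(loyalty, scandal) == "incumbent":
            incumbent_votes += 1
    return incumbent_votes / n_voters

before = election(100_000, scandal=False)
after = election(100_000, scandal=True)
print(f"incumbent share before scandal: {before:.2f}")
print(f"incumbent share after scandal:  {after:.2f}")
```

The stable ~65% share in the first run is purely an effect of the individual decisions; no “statistical law” enforces it, which is why altering the underlying cause moves it at once.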

Now, if the topsy-turvy nature of this interpretation is obvious when dealing with people, is it not pretty likely that the same mistake has been made in interpreting randomness in irrational nature too? And was not the initial mistake that of the early modern scientists in viewing laws of nature as mysterious immaterial entities governing the actions of inert billiard balls, they know not how? Dispensing with Aristotle’s natural powers, they started by equating regularities in nature with obedience to such laws imposed by a rational God – and end up now by positing irrationally causal statistical laws which are, nevertheless, no less mystical and equally prescriptive.

“Statistical laws” are, indeed, a highly mystical entity: if it’s hard to see how abstract laws determine inert matter, it’s pretty magical to conceive of other laws that direct matter to do A 70% of the time, B 20% of the time and C the other 10%. This smacks more of a suggestion of nature than a law of nature: “Thou shalt not commit murder, usually.”

The theological outcome is summarised in the article:

The last section will discuss the impact of the probability revolution on Christian belief in providence. Modernity undoubtedly changed perceptions of God and divine providence; Sung-Sup Kim says in his recent book on Providence, “We are living in a world . . . where it has become increasingly difficult to suppose divine providence” (2014, 1).

Reeves says that Kim cites as causes Darwin, Marx and Freud (none except the first much in vogue now, and even he is fading), the closed causal order of the universe (scarcely tenable now, if it was ever more than a taboo) and the awareness of evil in a post-Holocaust world (neglecting that Christian belief in providence arose in a post-destruction-of-Jerusalem context, when over a million Jews died). But to these Reeves prefers the argument from probability and ontological chance, which, he seems to suppose, must force one away from the orthodox belief in universal providence:

If God’s actions mirror the outcomes of chance in a vast majority of cases, why believe that each event in the world results from the special intervention of God?

Oh dear, oh dear, oh dear oh, as my three year old granddaughter says. We have already seen that human populations, acting rationally, at all times form statistical distributions by their decisions. I have pointed in previous articles to the fact that any large body of writing in a particular language will show a clear statistical pattern of letter frequency, and yet that probability distribution does nothing whatever to limit what you may write about.
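That letter-frequency point is easy to check (the two sample passages below are invented for illustration): English texts on entirely different subjects yield much the same frequency profile, with “e” on top, while saying entirely different things – the distribution constrains nothing about the content.

```python
from collections import Counter

def letter_freq(text):
    """Relative frequency of each letter, ignoring case and non-letters."""
    letters = [c for c in text.lower() if c.isalpha()]
    counts = Counter(letters)
    total = len(letters)
    return {c: n / total for c, n in counts.items()}

theology = ("In the beginning God created the heavens and the earth, "
            "and all his works are done in faithfulness and providence.")
physics = ("The measured state of the particle settles into one definite "
           "outcome, the relative frequencies of which are predictable.")

for name, text in [("theology", theology), ("physics", physics)]:
    freq = letter_freq(text)
    top = sorted(freq, key=freq.get, reverse=True)[:3]
    print(name, "most common letters:", top)
```

The frequency table is an abstraction *from* the sentences, not a law imposed *on* them: you could write about anything else tomorrow and the profile would look much the same.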

Once freed from the incoherent spectre of ontological chance, one is able to see that the reason God’s providence, on the large scale, “mirrors the outcomes of chance” is that the probability distributions are entirely the statistical result of God’s contingent providential actions, and not of chance at all (where they are not the patterns of creaturely choices or of unknown lawlike causes). Like humans making choices, all God has to do is act naturally, and a probability distribution will result.

To say God’s providence resembles chance is, in reality, to say that chance is a proxy, in abstracted form, for God’s activity. If, when you measure that probability distribution, you call it “ontological chance”, you are simply choosing an Epicurean conception of the Universe instead of the Christian one. And woe betide you if you seek to impose the speculations of Epicurus on the work of the eternal Logos, whose every action in the world is rational and purposeful, as befits his title.

In this particular case, surely Tertullian was right when he said, “What has Athens to do with Jerusalem?”

Jon Garvey

About Jon Garvey

Training in medicine (which was my career), social psychology and theology. Interests in most things, but especially the science-faith interface. The rest of my time, though, is spent writing, playing and recording music.
This entry was posted in Creation, Philosophy, Science, Theology, Theology of nature.