In the last post I tried to unpack Thomas Aquinas’s Fifth Way of reasoning to the existence of God from the existence of consistent cause and effect (seeing teleology, or final causation, as just as real in the world as efficient causation). Of course, it’s not a proof, or if it is, it’s one that doesn’t compel skeptics, which amounts to the same thing. But it is powerful and time-honoured, and has never been refuted. It may surprise some, as it surprised me, that the Fifth Way makes allowance for chance as evidence for this aspect of causation.
To quote Aquinas scholar Ed Feser on this:
Chance presupposes a background of causal factors which themselves neither have anything to do with chance nor can plausibly be accounted for without reference to final causality, so that it would be incoherent to suppose that an appeal to chance might somehow eliminate the need to appeal to final causality.
That is really to say, in other words, what I said about the clear distinction that needs to be made between chance, which exists in our world, and chaos (in its original meaning) which does not. Aquinas’s understanding of chance, of course, precedes the study of probability.
The kind of example of chance he gave would be the equivalent of two planets proceeding in their orbits in strict conformity to the laws of gravity and motion (ie displaying the final causality or teleology inherent in those laws) and colliding at some remote date as their orbits happen to cross. This happenstance would not in any way undermine the Fifth Way: the laws still govern what “always or nearly always” happens to “natural bodies lacking intelligence”, and in that sense the collision of the planets is fortuitous and rare, not built into the teleology of those laws and so not “intended by nature.” Aquinas would, nevertheless, say that such events are governed by God’s providence, but that’s another discussion.
But with the study of probability, it becomes possible to say not only that chance does not oppose the Fifth Way, but that it reinforces it, rendering chance a source of the metaphysical evidence for the intelligent design of the universe.
One can quantify stochastic occurrences like the planetary collision above through the probabilistic examination of a simple “chance” event like a coin toss, as an instance of the clear “intelligent design” of a random event in the world. In a recent post and in the ensuing discussion I suggested that tossing a coin is actually a highly designed method of turning physically determined (non-random) events into a “random number” generator.
Briefly to recapitulate, research has shown that a carefully made and set-up coin-tossing machine can produce the same outcome every time by applying exactly the same forces in the same way. In other words, there are no forces that could be regarded as truly random within the process, such as quantum variations or chaotic (in the scientific usage, in this case) events.
However, in an actual, human, coin toss it is impossible to control either the initial conditions or the force of the toss exactly. Additionally, it appears there are usually two axes to the toss, and also if the coin is allowed to bounce on the ground, a significant “chaotic” (though physically deterministic) element is indeed introduced. The net result is that the coin can make its final landing at any angle. But the coin itself, as a flat disc, is designed and made to resolve all angles into just two stable alternatives.
And so we have designed a system to harness the various teleological forces we either create, or find in nature, to provide a probability distribution between heads and tails of 50:50, barring flukes like the one illustrated. The key point here is that the overall statistical ratio of “random” coin tosses of 1:1 is evidence of the teleology of the system (in this case owing to human design), in exactly the same way as that of the coin tossing machine that consistently gives the same result. The clue is in the probability distribution – a truly fortuitous process could have no such mathematical description.
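As an aside, the convergence of that ratio is easy to sketch in a few lines of code. This is purely illustrative, not part of the original argument: the pseudo-random generator here simply stands in for the uncontrollable initial conditions of a real throw.

```python
import random

def toss_coins(n, seed=42):
    """Simulate n fair coin tosses and return the fraction of heads.

    The underlying physics of a real toss is deterministic; the
    pseudo-random generator stands in for the humanly uncontrollable
    initial conditions and forces.
    """
    rng = random.Random(seed)
    heads = sum(rng.random() < 0.5 for _ in range(n))
    return heads / n

# The fraction of heads settles ever closer to 0.5 as the number of
# tosses grows: a lawful, quantifiable distribution emerging from
# individually unpredictable events.
for n in (100, 10_000, 1_000_000):
    print(n, toss_coins(n))
```

The point of the sketch is only that the lawful 1:1 ratio emerges reliably, which is precisely the kind of mathematical description a truly fortuitous process could not have.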
Now the same considerations can be applied in a natural system, such as the laws of statistical mechanics that follow on from the work of Maxwell. The individual motions of molecules in a gas cannot be known in detail (though they seem to be primarily classical and therefore actually governed by teleological laws). From the point of view of a human observer, though, they appear random. But this is of no consequence whatsoever to the higher level statistical laws that apply macroscopically. If a statistical law like Boyle’s law applies consistently to gases, then whatever unknowable things may be happening to the molecules, Aquinas’s original criteria still apply:
Things which lack intelligence, such as natural bodies, act for an end, and this is evident from their acting always, or nearly always, in the same way, so as to obtain the best result.
So although the molecular movement may be seen as random, the very existence of Boyle’s law, mathematically relating pressure to volume, is as much a metaphysical argument for the existence of God as the laws of motion and, specifically, for his final causation of the behaviour of gases. This is no great surprise, any more than it would surprise one for an artist, painting a portrait, to fill a block of colour by “random” brush strokes.
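A toy sketch can make the same point about gases. This assumes only the standard kinetic-theory relation P = N·m·⟨v²⟩/(3V), and is a caricature of statistical mechanics rather than a real model: give every molecule a random motion, and Boyle’s constant P·V still falls out of the aggregate.

```python
import random

def pressure(n_molecules, volume, mass=1.0, seed=0):
    """Toy kinetic-theory estimate of gas pressure.

    Each molecule gets a random squared speed (standing in for the
    unknowable individual motions); pressure then follows the standard
    relation P = N * m * <v^2> / (3 * V).
    """
    rng = random.Random(seed)
    mean_sq_speed = sum(rng.gauss(0, 1) ** 2 for _ in range(n_molecules)) / n_molecules
    return n_molecules * mass * mean_sq_speed / (3 * volume)

# Halve the volume at the same "temperature" (same speed distribution):
# the pressure doubles, so the product P*V stays constant -- Boyle's law
# emerging from individually "random" molecular motions.
p1 = pressure(100_000, volume=2.0, seed=1)
p2 = pressure(100_000, volume=1.0, seed=2)
print(p1 * 2.0, p2 * 1.0)  # the two P*V products agree closely
```

However “random” each molecule’s motion looks, the macroscopic law is rigid, which is just the higher-level regularity the Fifth Way points to.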
Considering a far more complex natural example, like the “randomization” involved in the mammalian immune system’s responsiveness to pretty well any pathogen, the same things are true. The system as a whole “always or nearly always” acts to the end of attacking pathogens, and is therefore comparable to any other instance of final causation, even if one were ignorant of the specific efficient causes involved in the “randomization” process or its development. One could, however, be pretty sure that there were law-like processes involved even in that.
As a final example one can consider quantum events, for which no efficient physical causation is (on most interpretations) permitted. The Fifth Way has no need to speculate on what makes an individual quantum event happen, because the statistical picture demonstrates that it is not fortuitous, but designed. How so?
Take the example of radioactive decay, governed by quantum events in unstable nuclei. There is no way of knowing when any one atomic nucleus will decay, but nevertheless the overall process is quantifiable – the half-life of any one isotope can be determined accurately enough to act as the basis for absolute geological chronology. For all that the efficient cause of each quantum event is unknowable, half of the remaining nuclei will decay in each half-life period. In a real sense it is clear that the quantum events are determined by the final causation of the process of radioactive decay, because en masse the particles will obey the mathematical law that describes that teleological process – quantum events are not fortuitous, or they would not be amenable to statistical study.
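The emergence of the half-life law from individually unpredictable decays can be sketched just as simply. This is an illustration with made-up numbers, assuming only that each atom decays independently with probability ½ per half-life step:

```python
import random

def decay_survivors(n_atoms, half_lives, seed=1):
    """Simulate radioactive decay atom by atom.

    Each atom decays independently with probability 1/2 per half-life
    step -- no individual outcome is predictable, yet the population
    as a whole obeys the exponential law N(t) = N0 * (1/2)**t.
    """
    rng = random.Random(seed)
    remaining = n_atoms
    for _ in range(half_lives):
        remaining = sum(1 for _ in range(remaining) if rng.random() >= 0.5)
    return remaining

# Starting with a million atoms, roughly half survive each half-life:
n0 = 1_000_000
for t in range(4):
    print(t, decay_survivors(n0, t))
```

No single atom’s fate is predictable, yet the population halves on schedule: exactly the statistical regularity that makes radiometric dating possible.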
That should, in itself, demonstrate the necessity of including final causation in science (efficient causes being opaque in quantum events). Indeed, like final causes generally, they are included (unconsciously), or “quantum statistical mechanics” would just be crazy talk about predicting the results of random events, of chaos reliably forming patterns.
But it is also more evidence that chance, rigorously examined, has nothing to do with fortuitous causation, but everything to do with the quantifiable results of physical laws that are intrinsically teleological and therefore point to intelligent causation (with all that Aquinas concludes from that), unless one could finally refute the Fifth Way.