The discussion on this thread, with Lou Jost about the human particularity of reason (or the lack thereof) and with GD on the varying degrees of epistemological certainty within science, set me thinking about how in practice it’s impossible to wall off kinds of knowledge that, in theory, are quite distinct.
The original post was about the weakness of all demarcation criteria for what “science” is, or should be, and about the consequent legitimacy, in principle, of natural theology. As Stephen Meyer has pointed out (based on pretty standard philosophy of science, his original field), pretty well any criterion of “science v non-science” breaks down at some point. Usually that happens when you stop comparing theoretical physics with fairies and enquire instead about psychology (the example I used with GD), archaeology or political science. Or, as GD pointed out, even about some of the less substantiated aspects of biology.
But in this post I want to explore not blurred boundaries, but the leakage that occurs, inevitably and probably in the final analysis beneficially, between epistemological categories that nobody disputes to be distinct. In looking for examples, I wanted to include the less “settled” (or more “speculative”) aspects of science to which GD drew attention. But I wanted to avoid the historical sciences like palaeobiology in which the disputes tend to be between philosophical or religious attitudes to a hidden past. I’ll come to them later.
I should point out in passing that even GD’s “settled science” is settled only to the extent that the deeply-seated worldview commitments underpinning it hold, and these can alter, have altered in the past, and might do so again. For example, the mathematical certainty of physics is only as secure as belief in the objective reality of mathematics, which, as I’ve discussed in the past, is open to dispute. Likewise modern science depends on a conviction that, suitably aided and controlled, our senses give us enough grasp of reality for our theories to “save the appearances” lying atop the realities. Other philosophical foundations (eg the Vedantic maya) doubt this presumption. Nevertheless, here I’m considering only our own common worldview, and what ordinary western people regard as sources of knowledge.
So as a hypothetical example, I’m using an idea based on the Iraq war, where total Iraqi casualties were disputed because different scientific models gave very different estimates. The details don’t matter (and I’ve forgotten them anyway). But let’s postulate a more localised claim about an alleged (and here entirely imaginary) atrocity somewhere in this troubled world.
We’ll suppose there’s a repeated claim that a dictator or a terrorist army slaughtered an entire ethnic or religious community living within a certain city. Later on, the city was heavily fought over, many potential witnesses were killed and the population was scattered across the world. The initial atrocity is now the subject of a war-crimes trial, if it can be proven. But there are virtually no eye-witnesses, and in any case their reports tend to be predictably loaded by their ethnic and religious affiliations. Science must come to the rescue.
OK, so one theory, based on the evidence of official death records and what graves have been found, suggests that if the atrocity occurred at all, there were only a moderate number of deaths, with no clear indication that a minority was targeted, and therefore no case to take before a war-crimes tribunal. It has the advantage of using fairly hard evidence.
The second theory is based on a computer model of the post-conflict deficit in the population of the relevant minority, compared to its proportion in comparable cities in the country, using an estimate of the numbers in the city before the war (for which, unfortunately, there are no accurate figures). This suggests an horrendous genocide – but is much “softer”, more model-based science.
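For concreteness, the deficit reasoning behind this second theory can be sketched as simple arithmetic. Every figure, name and function below is invented purely for illustration; a real demographic model would be far more elaborate and would carry wide error bars on each input:

```python
# Hypothetical sketch of the "population deficit" model described above.
# All numbers and names are invented for illustration only.

def expected_minority_population(prewar_city_total, comparable_minority_share):
    """Minority count expected if the city matched comparable cities."""
    return prewar_city_total * comparable_minority_share

def estimated_deaths(prewar_city_total, comparable_minority_share,
                     minority_accounted_for):
    """Deficit = expected minority population minus those accounted for."""
    expected = expected_minority_population(prewar_city_total,
                                            comparable_minority_share)
    return max(0, expected - minority_accounted_for)

# Invented figures: a city of roughly 200,000, a minority averaging 8%
# in comparable cities, and only 3,000 of that minority traceable
# (as survivors or recorded deaths) after the conflict.
deficit = estimated_deaths(200_000, 0.08, 3_000)
print(deficit)  # 13000.0
```

The “softness” of the theory is visible even here: the pre-war total and the comparable-cities share are both estimates, so the output inherits and multiplies their uncertainty.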
Now (owing to an administrative bungle!) you find yourself on the UN panel set up to adjudicate between these two very disparate scientific theories: do you decide for or against prosecution? On the basis of the methodological quality of the research, you’re inclined towards the first, no-genocide theory. But your brother-in-law, a reliable guy, happened to be out doing relief work during the events in question. And he tells you, “I know what the scientists say, but I remember the butchery I witnessed, and I saw the football stadium full of bodies before they spirited them away, and there must have been ten thousand or more.”
Let’s take for granted the obvious caveats from everyday life and say that under ordinary circumstances you’d never have reason to doubt either the accuracy of this man’s report or his motives. My question, then, is this: would your belief in his personal testimony not incline you towards the theory that was more compatible with it, even though personal testimony is entirely extraneous to the science?
I suggest that, whatever protestations might be made about leaving personal considerations behind at the lab door, your acceptance of your relative’s testimony would inevitably, and rightly, colour how you weighted the contradictory scientific evidence. If it wouldn’t, people would consider you an odd fish indeed. Remember here your relative is not making a claim against “settled science”, such as that he visited an island where time goes backwards. This is ordinary, not-entirely-settled science being influenced by knowledge sources outside science, such as personal testimony.
Does such a mingling of epistemological sources happen? One would be a fool to think it doesn’t. Stephen Jay Gould’s Marxism is written all over his science, and what’s more, being a candid observer, he engaged in self-mockery about it. His 1972 theory of punctuated equilibria, he said in an essay, perhaps appealed to him because, unlike Darwin’s time, this was an age when revolution rather than uniformity was in the air. I suspect he would have said it was no coincidence that the near-universal rejection of Wegener’s 1912 theory of continental drift began to reverse after a paper published in the same year as the Paris student riots, 1968.
In fact, mixed epistemology is vital for scientific progress. I’ve recorded before how Einstein “knew” relativity must be true as a teenager, long before there was any evidence available to him and before he discovered the maths to describe it. Michael Polanyi considered such insights the norm, rather than the exception, for new theories.
Now, turning to religion, back in 2013 Mike Gene recycled a survey of élite non-religious academics. Very few, in fact, gave science-based reasons for their unbelief, the strongest being the rather circular and question-begging, “There is no scientific evidence for God.” More often they gave answers such as that they had never been religious and were uninterested (eg zoologist Gabriel Horn, physicist [of quantum robin fame] Jim Al-Khalili); lack of personal experience of God (geneticist Ken Edwards); aesthetic judgement that the universe is more wonderful without God (Craig Venter); loss of faith during a philosophy course (anthropologist Melvin Konner); family background (physicist Peter Higgs); being put off by priests (anthropologist Jonathan Parry); the problem of evil (geriatrician Raymond Tallis).
Now, some or all of these may be rational reasons for unbelief, but they are irrelevant to science (though often used by scientists arguing publicly against religion). Would one expect the resulting worldviews, though, to have no influence at all on their scientific lives? Most of the above are not militantly anti-religious, though Al-Khalili is President of the British Humanist Association, whose website is perfectly happy to major on his scientific credentials (it’s hard to think why else he was elected President – few working physicists get to be Archbishop of Canterbury).
Geneticist Steve Jones, whose own “testimony” rests on the old nineteenth-century claim of incompatibility between science and faith, is also on Mike Gene’s list. His spokesmanship for atheism, using his status as a scientist in support, is well known and can be seen at the website of the campaigning National Secular Society, of which he is an “honorary associate” together with other outspoken secularists like Peter Atkins, Richard Dawkins and Lawrence Krauss (and my old physiology lecturer Colin Blakemore). It is very clear from their associations and their writing that these guys do not leave their non-scientific beliefs at the lab door. And I’m very glad of it, because it confirms that scientific epistemology can never, in real life, be insulated from the other sources of knowledge and belief that make us human. There are no such neat firewalls around ideas.
Science cannot be seen as a source of truth, but only as a more-or-less successful human search for truth of a particular, limited, kind. If that were not so, “leakage” from other epistemological sources would be unnecessary, and furthermore alternative scientific theories would not compete so violently as they do – reason would decide all peacefully.
For scientists to have beliefs is a good thing, whether those are religious, anti-religious, political, philosophical or whatever. But it needs to be recognised that those beliefs will, inevitably, influence preferences for competing scientific theories. They form an irreducibly subjective aspect of science. Declare the interest and that’s not a problem.
What’s more problematic is that such beliefs also affect what one regards as being science at all. That too is inevitable, since the boundaries of science cannot be determined by scientific methods, but by external considerations such as sociology, philosophy and metaphysics. But it’s less desirable because these areas are all matters of legitimate disagreement amongst interested parties. Many scientists are blissfully unaware that they have sociological, philosophical and metaphysical biases – a good many deny they are even valid categories. To have biases is the human condition, but to be oblivious to them is culpable.
I fear that, in some cases, that ends up meaning that the legitimate content of science is decided by no more rational a process than the most strident spokesmen saying, “It’s my game, so I make up the rules.”