Gobbledegook from hobbledehoys

I found this article at the Daily Sceptic intriguing. The author, unimpressed with a university (UCL) “vision statement” that looked as if it had been cobbled together from buzzwords by AI, decided to use a commercially available AI program to construct his own.

Here’s UCL’s “vision”:

Our distinctive approach to research, education and innovation will further inspire our community of staff, students and partners to transform how the world is understood, how knowledge is created and shared and the way that global problems are solved.

The results of asking the AI program to construct a short paragraph using the buzzwords “vision, distinctive, research, education, innovation, inspire, community, students, partners, transform, world, knowledge, global” were two offerings hard to distinguish from the original. The grammar was perfect and the content meaningless – a combination that is perhaps diagnostic of AI compositions of many kinds.

But what provoked my thinking more (suggesting I am not an AI bot after all) was the author’s suggestion that “chatGPT can do better with better input.” When he set the program the same task using the buzzwords “truth, reason, verifiable, university,” he obtained something any lover of truth would applaud:

A university education is built on the pursuit of truth through reason and verifiable evidence. Students are encouraged to think critically, ask questions, and seek out reliable sources of information as they work to expand their knowledge and understanding of the world. By fostering a culture of impartial investigation and the use of reason, universities play a crucial role in helping students discover the truth and prepare for lives as informed and engaged members of society.

The problem, for me, is that in terms of intent this is just as vacuous. What the AI has done is to absorb masses of human texts which, parrot-like, it can mimic and even permutate, according to the task it is asked to perform. That’s why political bias has been found in AI programs used to monitor social media platforms: the combination of selective “reading” and biased instructions guarantees the flavour of the outcomes.

So it is no surprise that a “creative writing” app, given some worthy values like “truth,” “reason,” and so on to work with, will mine the stuff it has read that verbalises those values, and produce something that seems an expression of free speech and intellectual pursuit. But training a parrot to say “God save the King” rather than “Heil Hitler” does not make it a British patriot. Neither will it have the least idea of, or interest in, why one slogan is superior to the other – unless, that is, its training was restricted to British or Nazi propaganda.

The sobering thought I had was that although in all probability UCL’s “vision statement” was produced not by a computer but by real, ensouled human beings made in God’s image, they were functioning indistinguishably from AI. They have read many of the same texts as ChatGPT, seen which way the sociological wind is blowing, and taken their “programming” from the spirit of the age. Hence their words are no more meaningful than those of Polly the Norwegian Blue.

This kind of unthinking thinking reminds me of when GP Fundholding became the Thatcherite template for family medicine in 1991. To participate, we were required to prove our entrepreneurial credentials by supplying a practice “mission statement.” Now, since Beauchamp House Surgery was a committed Christian practice, founded by a missionary returning from China in 1947, we actually had a mission, though we hadn’t necessarily formulated it conceptually into a slogan. I therefore suggested something along the lines of “Embodying Christ’s wholeness through medical excellence.”

It was rejected by the Authorities, of course, and was replaced with something bland and meaningless that I have long forgotten, if I ever memorised it. I do remember thinking, after seeing similar slogans on plumbers’ vans, hospital letterheads and (whilst they were corporately fashionable) most other flat surfaces, that one universal mission statement would save everyone a lot of work: “Working Together for Change.”

This is inspiring, motivating, descriptive and, naturally, completely devoid of actual meaning. If John F. Kennedy was indeed assassinated by a combination of political opponents, deep-state security services, organised crime and the military-industrial complex, then each conspiratorial component, and the conspiracy as a whole, could happily endorse “Working Together for Change.” And indeed, so could the Kennedy administration itself.

The sad truth is that much human “opinion” is as mindless as AI, soaking up the environment of other people’s chatter, mixing it around, and regurgitating it at parties, in parliament – or even in churches. And that is what makes propaganda so effective. You have only to block dissent over COVID on Twitter and everyone will enthuse about lockdowns. Banning Russian media early last year guaranteed that most people were quite certain that they had come to a rational conclusion that Putin was a dying dictator losing massively after an unprovoked invasion of democratic Ukraine.

But this is in stark contrast to actual human thought. The first, and most obvious, ability distinguishing us from ChatGPT is the power of disagreement. If we’re so motivated, we can not only talk about “the pursuit of truth through reason and verifiable evidence,” but actually do it, despite the limitations of having read the same guff as AI reads. If you tell AI to construct a paragraph using buzzwords, it can’t reply, “That seems a pointless exercise.”

Yet what distinguishes human creativity from AI (or UCL) parrot speech is not simply the rejection of everything we have not originated ourselves. The difference is that we do not merely recycle the influences around us: we upgrade them in the process, and that upgrading is goal-orientated – that is to say, teleological.

So a computer program can absorb the last fifty years of hit singles and generate “a pop song.” Maybe it will even sell a few copies, especially if it is advertised as being by a bot, thus generating the kind of novelty interest Dr Johnson spoke of in relation to dogs walking on their hind legs. But John Lennon or Paul McCartney, soaking up the same musical heritage (but with some classical stuff, music hall and much else the AI missed), will be inspired by those influences to do something within the same genre, but with an injection of magic that only humanness possesses.

That might be Lennon trying to write a Roy Orbison ballad and ending up with She Loves You, or McCartney strumming away in the studio in A on the same old 12-bar chords and ending up with Get Back. Part of the process is trying to discover not only how the song should go, but what it is about. For example, there is a beautiful moment in the studio footage of the creation of Get Back when Paul, trying to improve on “Jo-Jo left his home in Arizona…”, experiments with a couple of town names. As he tries “Tucson, Arizona,” John gives him a glance that says, “That’s the line.” And indeed, where else could Jo-Jo possibly come from?

More broadly, even the purpose of the song is something discovered, more than it is planned. Paul has the line “Get Back” from the beginning, but in working out the rest he at first sees it as a protest song against telling black immigrants to go home. But eventually, the situation is flipped on its head, to the need to return to one’s true roots. And I suspect it is that universal human longing, plus some musical magic dust, that makes the song great half a century later.

There is a strong sense amongst many creative people – not just artists, but scientists, theologians and bathroom-fitters – that the goal they are trying to achieve is somehow already “out there,” if only they can tap into it. Even the ordinary Christian, engaging with, perhaps, a set liturgy and an even more set Bible, is looking to see how these can form their own destiny, and make them useful people in the world. They know that the Scriptures and the liturgy were set down by their human or divine authors with particular goals in mind, into which as rational and spiritual beings they can tap. They can become part of an eternal story, as genuine playing characters furthering the plot.

But if AI, or humans reducing themselves to bots, simply reflect the tides of opinion surrounding them, there is no real story – that is, a narrative with a teleological end in view – and the whole of life, to quote Shakespeare’s Macbeth, becomes “a tale told by an idiot, full of sound and fury, signifying nothing.”

Only with today’s AI (machine or human), you don’t even get the sound and fury – merely empty platitudes.
