On view at The Barbican Centre in London this summer is AI: More than Human, a major exhibition exploring creative and scientific developments in AI, with installations from artists including Mario Klingemann, Massive Attack, Es Devlin, and Steve Goodman (Kode9). In this interview, republished from the AI: More than Human catalogue and featured here alongside photographs of the exhibition, co-curator Suzanne Livingston talks to Steve Goodman about the figure of the Golem, cybernetics, and eschatology.
Suzanne Livingston: How long have you been thinking about your sound piece, It, and where did the original instinct to create it come from?
SG: It’s been brewing for a long time. I seem to remember picking up a Golem postcard the first time I visited Prague in the early 90s and being struck by its strange, primitive, robotic appearance. I also have a vague memory of seeing a Golem character in old Marvel comics that played on its original function in the Jewish myth as a ghetto protector, but reanimated it in the comic as part of an anti-Nazi revenge fantasy. The connections to AI came much later.
The Golem has made its imprint across cultures and generations. What is its power? Why has it resonated in so many different directions?
While its origins can be found in the Old Testament, the Kabbalah, in numerous Czech and Polish myths, and elsewhere in sorcerous stories of sandmen and homunculi, the influence of the Golem feels very current, and its core concerns can be traced from Shelley’s novel Frankenstein (1818) right up to the films Blade Runner (1982) and Ex Machina (2014). For sure, it’s a slippery character. Any serious Golemology would also have to be a memetics, as it’s a meme that has attached itself to all kinds of socio-political and cultural phenomena, and that crystallises all kinds of unconscious forces. I’ve seen it used as a prism through which to see the relationship between masters and slaves, destroyers and redeemers, the hubris of humans playing god by attempting to create entities in their own image, the animation of nonorganic matter, machines running out of control, and the unintended consequences of automation.
Most stories of the Golem invoke an unintelligent rule-following entity that is not open to environmental feedback and that ends up causing havoc by too rigidly following orders. But with the advent of cybernetics, or machines that learn, this takes on a new dimension, as Norbert Wiener noted in his book God & Golem, Inc. (1964).
Can you say a bit more about Wiener on the Golem? It seems to me a deeply implicit, rather than explicit theme.
That’s right. Wiener seems to use it as a sub-theme across several of his books, to warn of the unanticipated effects of automation. In Cybernetics: Or Control and Communication in the Animal and the Machine (1948) he refers to it as a ‘bizarre and sinister concept’. Even in his book God & Golem it only appears in passing, but he makes the parallel between ‘black spells’ and the magic of automation. He seems intrigued by this comparison between programming and magic in the sense that code and the spell are both interpreted literally, regardless of any intention not inscribed in them. And it is this super-literal, rule-following aspect that poses the kind of existential threats discussed by the likes of the philosopher Nick Bostrom.
For Wiener, the Golem is a cipher that opens up a whole series of theological questions in relation to technology, in light of how cybernetics seems to deconstruct a series of oppositions between good and evil, the divine and the human, mind and matter, nature and technology. Traditionally the God/man/Golem system is set up as a hylomorphic, hierarchical system with God at the top injecting and shaping creation, and the Golem as mere matter at the base. Once that hierarchy collapses, Wiener’s key question is whether, absent the divine omniscience and omnipotence of a transcendent God, there is any limit to the escape of machine intelligence.
Many ideas about the singularity (as carriers of these Judeo-Christian apocalyptic traditions) seem to imply in fact that the unitary god-like power only emerges through the escape of machine intelligence. On the other hand, in certain cyberpunk narratives such as William Gibson’s novel Count Zero (1986), the escape of machine intelligence (the liberation of the Golem from man) produces an animistic cyberspace inhabited by a proliferation of minor gods, a whole demonology of artificial intelligence. For these reasons, it seems more interesting to read this Golem/AI parallel as being as much about the multiplication and fragmentation of intelligent systems into weaponized enclave protectors as about the eschatology of the singularity.
What aspects of the Golem intrigue you the most?
The Golem as harbinger of apocalyptic AI, but also its weaponization as enclave security system in the era of drone warfare. But there is also that intersection between animism and science. The theorist Mark Fisher had a really useful concept of the ‘gothic flatline’, which he described as a ‘plane where it is no longer possible to differentiate the animate from the inanimate and where to have agency is not necessarily to be alive.’ I think the Golem inhabits an interzone on this spectrum, between the living and the dead.
Tell us more about the components and references you make in the piece…
For the It sound installation project, there are three key texts in relation to the intersection between the myth of the Golem and AI. There is, as I’ve already mentioned, Norbert Wiener’s God and Golem, which opens up the religious dimension of cybernetics. Then there are a couple of science fiction texts. Golem XIV is a short story by the Polish writer Stanisław Lem in which a military AI crosses an intelligence threshold, refuses the martial vocation humans have programmed into it, and decides it wants to be a philosopher instead. The text includes a couple of somewhat aloof lectures delivered by the AI, reflecting on the status of humanity, intelligence, and itself. Lem’s story shares much with the 1970 film The Forbin Project, which is a peak Cold War story of simulation, nuclear war gaming, and Mutually Assured Destruction. An advanced American defence AI system, Colossus, becomes sentient and runs away with its programmed objective to defend the US, taking command of humanity (in alliance with Guardian, the Soviet defence AI) to end war once and for all. Of course, for me, the best thing about Colossus is that it creates a voice synthesiser in order to communicate with humans. You might hear some of that in my installation.
Another key text for me here is He, She, It (1991) by Marge Piercy, which depicts a post-apocalyptic planet typically split between affluent corporate enclaves (the multis), extended megaurban sprawl (the glop), and a few relatively autonomous free towns engaged in the production and selling of software. One such town illegally produces a cyborg/AI, Yod (the Kabbalist symbol for God and also the tenth letter of the Hebrew alphabet), in order to protect their free town. Yod is superior to man on almost every level: more intelligent, a better lover, and so on. Piercy’s story offers a feminist take (influenced by Haraway’s ‘A Cyborg Manifesto’) on what is a very male myth of creation, although she probably doesn’t go far enough in exploring the post-binary ‘it’-ness of her Golems. Yod’s programmer is a woman, and the cyborg/golem is specifically post-man, not just post-human. This cyberpunk tale is woven with the sixteenth-century Jewish myth of the Golem as artificial protector of the ghetto under siege. So instead of the late Cold War dynamic of Lem’s story, the geopolitics of He, She, It is patchier, fragmented into an adjacent series of territories, each with their own local AI defending both the actual and virtual dimensions of these enclaves.
The existential threat posed by the Golem/AI is not just that of the singularity, as outlined, for example, in machine intelligence and ethics research fretting about letting the Golem genie of AI out of the bottle. There is also this other issue of the multiplication of Golem/AIs as enclave or tribal security systems: weaponized AI deployed to protect communities.
Does the spectre of the Golem feed into your other music making at all? If so, in what ways?
The spectre of the Golem haunted my last album project, Nothing (2015). With the simulation artist Lawrence Lek, I devised and constructed a virtual, fully automated luxury hotel called the Nøtel. Totally staffed by drones, the Nøtel provides the utmost privacy and security for its elite clients. The irony of the Nøtel, however, is that for reasons unknown, not only are there no human staff, but there are no human guests. It’s an empty, automated architectural shell. So the question becomes: what do the drones do once liberated from their human masters?
Aside from this, and although they are not yet approaches I’ve gone deep into in my own music, both generative music and deep learning composition systems offer an excitingly monstrous potential to create work in excess of, and perhaps even hostile to, any human aesthetic designation.
The Golem was originally formed from clay. Does this, as the apparent polar opposite to advanced technology, have any significance for you?
Well, it is interesting that clay is this combination of fine-grained rock with traces of quartz, metals, and organic matter which, due to its small particle size and water content, has the plastic, malleable nature that lends itself to the construction of everything from ceramics, musical instruments, and paper to the built architecture (as cement and bricks) that still houses a large chunk of humanity. Like the uncarved block of Daoism, or a body without organs, this fluid abstract geology of clay is still the virtual background to many human activities. Sometimes, as loam, it may be found mixed with sand and silt. Containing traces of silicon and quartz and other metals, there is a genealogy that connects it to the ubiquity of circuit boards.
However, this opposition also reminds us that cybernetics is not just about technical machines but feedback processes more generally, regardless of whether they are natural or cultural.
You also mention the voice of the AI Colossus in the film The Forbin Project (1970). Could you say more about sound as a medium for the piece? We know that the Golem was brought to life through oral rituals and incantation – so sound seems vital to its life and has true power in the story.
Yes, well the original inspiration for developing this sound piece came from the idea that within the context of ecstatic ritual, incantation (vocal recitations of combinations of Hebrew letters/numbers) literally breathes life into the Golem, and that this can retrospectively be heard as a prototype of the kind of sound activation we are now used to with everyday AI assistants such as Siri and Alexa.
The functional notion of the sound of the breath, or more generally, the wind as an invisible force that animates things can also be found in Gustav Meyrink’s novel The Golem (1914). This informs the eerie sonic environment that forms the backdrop to the installation, and the choice generally to produce work that requires careful listening to quite fragile synthetic voices, as opposed to something either loud or visualized.
In other versions of the myth, the name of God is written on a parchment and inserted into the mouth or another hollow in the head of the clay figure. For this reason, for this installation I decided to use directional ultrasound speakers to ‘insert’ voices into the listener’s head. These highly targetable speakers have a very particular whispered sonic quality and create an ambiguity between voices you hear from the environment, internal voices, and auditory hallucinations.
Alternatively, to activate the Golem, the word emet, or truth, is inscribed on its forehead. It is then deactivated by removing the first letter, so it becomes met, meaning dead. The crude hardware (clay)/software (word) model the Golem myth proposes is also complexified by the fact that in the Hebrew alphabet each letter is also a number and therefore words are also codes. Gematria is the method for decrypting such codes.
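The letter-as-number logic can be sketched in a few lines of code. This is a minimal, hypothetical illustration (the function and letter names are mine, not from the piece), using the standard gematria values for the three letters in question: aleph = 1, mem = 40, tav = 400.

```python
# Minimal gematria sketch: each Hebrew letter doubles as a number,
# so a word's value is the sum of its letters. Only the letters
# needed for emet/met are mapped here; a full table has 22 letters.
GEMATRIA = {
    "aleph": 1,
    "mem": 40,
    "tav": 400,
}

def gematria(letters):
    """Sum the numerical values of a sequence of Hebrew letter names."""
    return sum(GEMATRIA[letter] for letter in letters)

emet = gematria(["aleph", "mem", "tav"])  # 'truth' -> 441
met = gematria(["mem", "tav"])            # 'dead'  -> 440
```

Removing the aleph, as in the myth, drops the word's value from 441 to 440: a one-letter edit that flips the Golem from animate to inanimate.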
To parallel this with an out-of-control AI: deactivation assumes you could just pull the plug, which on a globally distributed superintelligence might be a slightly more complex task than with a clay android.
How should we relate to/treat Golems of the future? What’s your advice here?
Well, when you look at some of these Boston Dynamics videos in which robotic creatures with embedded AI are being taught all kinds of tricks while being prodded and taunted by their human masters, you can’t help but think that any sentient instantiation of these entities will look back and conceive of that as an enslavement to be escaped from. As Terminator, Blade Runner, and Ex Machina have taught us, this scenario probably doesn’t end well for humans.
The power of the myth seems to counsel caution about both a reactionary weaponized tribalism and the enslavement of machines, and acts as a warning against the dangers of technologies running out of control. However, it’s not just some kind of anti-Promethean warning. I’m not sure taking an ‘us and them’ position makes much sense – it’s probably too late for that. We are all already Golems, and there is no mythical untainted era to return to. Some kind of symbiotic relationship with AI seems much more realistic, pragmatic, and interesting (if less dramatic).