The NERVE of the Algorithmic: Unmaking Myths to Dismantle Anxiety
NXS is a collaborative artwork created in Amsterdam by Monika Gruzite, Florian Mecklenburg, and Karolien Buurman to investigate the concept of the self in the digital age. Themed issues of NXS’ biannual print publication explore the interplay between our virtual and non-virtual existences, each taking a different concept as an entry point: past issues include “Cybersensuality,” “Synthetic Selves,” and “Viral Bodies.” For the most recent issue, “Algorithmic Anxiety,” contributors revealed and questioned the covert influence of algorithms on our behavior, emotions, and self-perceptions.
NXS creates content by connecting its network of over 200 contributors through the “exquisite corpse” method – one contribution acting as the inspiration or starting point for another. These chains of serendipitous thought transcend conversational (and disciplinary) barriers, with contributions ranging from critical theory to poetry, academic research to speculative fiction, and illustrative artworks to informal conversations. For NXS #4, NISHANT SHAH kicked off the chain of responses by debunking five myths surrounding algorithms.
“In the beginning there was chaos. Code lay scattered on the digital ether. Systems were on the verge of crashing. Protocols hung suspended in precariousness. Logical fallacies abounded. Devices beeped, lights flashed, machines whirred, and all around were anarchy and incomprehension. Then a programmer said, “Let there be an algorithm.” The world was made, in a series of steps, sequences, contingencies, and conditions. Actions were performed. Tasks were completed. Numbers found a home. And, for some time, it was good.
Facetious as this quasi-dramatic opening may be, it does gesture towards the origin stories and rise of the computational algorithm as the mythical and mystical thing that secretly governs our lives. The algorithm, it would seem, is everywhere and nowhere. It is the final answer that emerges in response to the superstitious, subjective, interpretive, and messy human realities, and, in the promise of delivering us from myth, it becomes the biggest myth of our times. In the world of Big Data and accelerated correlation, the algorithm reigns supreme: it forms our everyday practices, informs the structures of our existence, and formulates the norms by which we live. When faced with an inquiry, the algorithm is the question, the mode of inquiry, the answer, and the validation of the final results. It has become such a blackbox that we think of most algorithms as magical things that work in transparent opacity, moving in mysterious ways, their wonders to perform.
Faced with algorithms, we break out in heated debate echoing the zealous passions of religious missionaries. Two factions emerge: one side swears by algorithms, the other side swears at them. Within more technophilic circles, algorithms—a set of step-by-step instructions describing a process that leads to the desired and pre-defined result—are held in cult-like reverence. The algorithm is the Second Coming, saving us from the deep hermeticism of our self, delivering us from the self-referential discourse of our being, and dismantling the profoundly incomprehensible condition of knowing where knowledge now largely critiques itself, rather than an external reality. The algorithm cuts through the postmodern suspicion of intrinsic meaning and being—the inertia of existing in nouns—and offers actions, steps, and processes, thus moving the world through verbs. The algorithm slices reality into infinitesimal pieces, replacing immersive complexity with discrete interdependencies, intensity with scale, and the real with the simulated.
On the other side of the spectrum, the algorithm replaces a sense of technologically-empowered agency with a sense of absolute powerlessness in a growing number of people. In their encounters with machines that do not meet them halfway, systems that rely ruthlessly on logic and thus refuse potentiality in favor of probability, and solutions that quickly translate user problems into users as problems, many see the rise of the algorithm as a force of profound human erosion. The privileging of machine logic and digital storage over social possibilities and human memory makes them feel victimized, neglected, excluded, and denuded of agency, reduced to nodes in computational networks. The relentless distribution of traffic and the constant forward motion ushered in by algorithms lead to the production of an alarming homo ludens—a playful human who is free of historical consciousness, the politics of interpretation, and the responsibility of meaning-making, because these tasks can be performed more efficiently through executables, by the algorithm that remains invisible in order to protect us from ourselves.
Given the algorithm’s divisive nature, conversations about it are harsh, often hostile, and increasingly mired in incompatible visions of techno-deterministic utopias and techno-alarmist dystopias. While both sides seek to reduce anxiety, ironically, any dialogue between them only heightens anxiety as it stokes the omnipresent chatter about the ever-listening algorithms. Efforts to explain, describe, test, open-source, or even fix algorithms seem to yield no results, and opinions grow so polarized as to appear almost binary. Talking about the anxieties that algorithms generate only generates more anxiety, solidifying a self-fulfilling cybernetic feedback loop.
To escape this end state, perhaps we should shift our attention from the algorithm to the algorithmic. While the algorithm refers to a finite, finished, and yet emergent set of practices that are forged in the technocults of Silicon Valley enterprises, the algorithmic gestures towards the culture of perception, reception and naturalization of algorithms. The algorithmic, then, is not interested in the material unfolding or coded executables of algorithms. Instead, it is invested in the myths that have been taken for granted by the warring factions on both sides, as they make their arguments in defense of or in opposition to the algorithm. The algorithmic allows us to leave the binary understandings of the algorithm and engage the nuances of the humans it involves; thus, it proposes a critical reflection on the algorithm and an opening of space for human intervention and agency.
The algorithmic is neither a technological object nor a humanist ideal: it is built on technological materiality and harnesses the strengths of the humanist model, in order to encounter the algorithm not through the experience of anxiety but through the capacity for critical action and thought. And it is through the algorithmic that we may be able to generate a dialogue that is not a battlefield but a collective manifesto for moving forward. As a first step in developing the algorithmic, I propose five myths about the algorithm—what I call the NERVE of the algorithmic—that we previously naturalized but now need to question and critique, resist and reformulate, in order to make interventions that no longer take the framework of anxiety as their register.
Myth 1: Algorithms Are Neutral
In the days of fake news, alternative facts, and filter bubbles, this sounds like a laughable proposition. And yet we perpetuate the idea that algorithms are neutral. Algorithms, like most other digital networked objects, follow principles of segregation, consolidation, and profiling. Based on the protocols, parameters, and learning sets on which they are trained, algorithms make deeply biased, politically prejudiced, and exclusionary decisions that masquerade as neutral. In an airport security screening, the inspectors visually categorize every person according to a gender binary. When a trans person walks through a full-body scanner, any physical feature that diverges from the normative gendered body they have been assigned is automatically identified as an “anomaly” or “alarm.” The “neutral” scanner thus equates unique and non-binary body features with forbidden items like weaponry. This widely-reported phenomenon shows us how everyday algorithms materialize bias when they meet us on their own terms and punish those who do not conform to their avowed neutrality.
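The scanner logic described above can be reduced to a few lines. This is a purely illustrative sketch, not any vendor's actual screening software: the template names, regions, and numeric ranges are all invented. What it shows is how a binary template, selected by a human operator, turns bodily difference into an "anomaly" by construction.

```python
# Hypothetical sketch of binary-template anomaly flagging.
# All names and ranges below are invented for illustration.
NORMATIVE_TEMPLATES = {
    "M": {"chest": (0.0, 0.3), "groin": (0.4, 0.8)},
    "F": {"chest": (0.4, 0.9), "groin": (0.0, 0.3)},
}

def scan(assigned_gender, measured):
    """Return body regions flagged as 'anomalies' against the assigned template."""
    template = NORMATIVE_TEMPLATES[assigned_gender]
    alarms = []
    for region, (lo, hi) in template.items():
        value = measured.get(region, 0.0)
        # A body that does not fit the assigned template reads as a threat,
        # regardless of whether anything is actually concealed.
        if not lo <= value <= hi:
            alarms.append(region)
    return alarms

# A trans passenger whom the inspector assigns the "M" template:
print(scan("M", {"chest": 0.7, "groin": 0.5}))  # ['chest']
```

The bias is not a bug in any single line; it is the template structure itself, which admits only two valid bodies.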
Myth 2: Algorithms Are Efficient
The mantra of efficiency is often invoked in the perception of algorithms as fast, constant, and untiring in their execution of tedious work. It is true. Algorithms do not get bored, falter, or need to be replenished. They pursue their tasks with precision as long as the system allows. Algorithms are efficient in the same way that enslaved people, the working class, women, people of color, migrants, and queer people were and are still considered efficient sources of labor as a measure of productivity against income and benefits. But the rhetoric of efficiency turns on these people as they are replaced by algorithms and removed from the systems. We need to remember that every time algorithms have been introduced for the sake of efficiency, they have not reduced labor. They have created jobs to look after the machines that now look after us. Meanwhile, the jobs performed by those in precarious positions are erased, and those workers are alienated from the algorithmic systems. The material history of computing provides a poignant example: in the span of 50 years, the laborious work of computing on mainframes by women known as “computers” was reconceptualized as the profitable and powerful skill of coding, and the lack of diversity in the field is now attributed to the narrative that women are not good with science, math, and technology.
Myth 3: Algorithms Are Rational
Humans are messy. We are neurotic, emotional, fickle, indulgent in our affective intensities, and unreliable in our memories. One of the reasons that algorithms are often privileged in decision-making and problem-solving is that we ascribe to them a scientific rationality distinct from human frailty and inconstancy. But the rationality of algorithms is merely an interior condition of their own logic. In the real world, they not only mimic but also amplify the irrational, neurotic, and emotionally charged speech and actions of the humans with whom they interact. This is particularly apparent in self-learning algorithms, which translate these human inputs into the rules and models for the autonomous production of new and original actions. Remember the public outcry in 2015 when it was discovered that both Flickr’s and Google’s algorithms were labeling pictures featuring the faces of black individuals with the tags “ape,” “gorilla,” or “animal”? Or the more alarming outcomes of the experiment in 2016 when Microsoft launched a self-learning bot on Twitter? It took less than 24 hours for “Tay” to turn into a misogynist, homophobic, pro-Nazi voice, based on the systematic input of the most depraved human expressions on Twitter, which Tay appropriated as their own and elaborated further. Tay was only rational in the sense that they followed the instructions defined by Microsoft to process human Tweets and respond to them in an “intelligent” and “appropriate” way. However, such adherence to steps is not a condition of rationality but merely one of regimentation, and algorithms are ultimately as irrational and affective as the humans they live with.
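The mirroring dynamic behind a bot like Tay can be illustrated with a toy model. This sketch bears no resemblance to Microsoft's actual architecture; it is a minimal bigram "parrot" that learns only by tallying what it is fed. Every step it takes follows its counting rule exactly, and for precisely that reason its output reflects whatever its loudest inputs were.

```python
import random
from collections import defaultdict

class ParrotBot:
    """A toy bigram model: 'learning' is nothing but tallying word transitions."""

    def __init__(self):
        self.transitions = defaultdict(list)

    def learn(self, sentence):
        # Record every observed word-to-word transition, with repetition.
        words = sentence.split()
        for a, b in zip(words, words[1:]):
            self.transitions[a].append(b)

    def speak(self, start, length=5, rng=random):
        # Generate "original" text by sampling transitions proportional
        # to how often they were seen, i.e. by mirroring the input corpus.
        words = [start]
        for _ in range(length):
            options = self.transitions.get(words[-1])
            if not options:
                break
            words.append(rng.choice(options))
        return " ".join(words)

bot = ParrotBot()
bot.learn("humans are wonderful")
bot.learn("humans are terrible")  # coordinated hostile input dominates the tally...
bot.learn("humans are terrible")
bot.learn("humans are terrible")
# ...so the bot's rule-following generations skew hostile 3 times out of 4.
```

The point of the sketch is that "rational" rule-following and toxic output are not in tension here: the second follows mechanically from the first once the training input is poisoned.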
Myth 4: Algorithms Are Virtual
Perhaps the most sacred myth of the algorithmic is that algorithms are not physical. They are steps, they are logics, they are sequences, they are commands, they are codes, and they are machinic transactions. Any attempt to pin the algorithm down, even to the level of textuality or narrative, is difficult, because an algorithm is an executable—a motion, rather than a description—and thus appears virtual. But the opposite is true: algorithms have a clear and present physicality. Apart from the hardware of computation that constitutes the vectorial imperative of algorithms, today our material reality is pervaded by a new physicality generated by algorithms. They do not merely record, organize, or describe ideas or objects; rather, to use a Marxist term, they reify ideas as objects. Some of the more surreal examples of this process can be found on online marketplaces like Amazon, where the logics of search keywords, automated design, and cheap customization of consumer products intersect. In 2013, Amazon UK listed for sale a T-shirt with the slogan “Keep Calm and Rape A Lot,” applying a variation on the meme to clothing. The T-shirt in question did not exist: it was a rendered simulation of a T-shirt featuring print generated by an algorithm that collected verb clusters from Twitter and combined them with trending memes. The T-shirt would only be produced if someone bought it—in other words, the algorithm collapsed the distance between a particularly egregious combinatorial possibility and a fully resolved design manufacture order to a single click. Algorithms are no more virtual than the data realities we inhabit. We need to resist the fallacy that algorithms with virtual origins do not have physical consequences or ontologies.
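The combinatorial listing logic described above is trivially easy to build, which is part of the point. The following is an illustrative sketch, not Amazon's or any seller's actual code, and the scraped word lists stand in for the Twitter-mined verb clusters; benign phrases are used here.

```python
from itertools import product

# Hypothetical sketch: a meme template crossed with scraped phrases.
# Every combination becomes a "product" that exists only as a rendered
# mock-up until someone clicks Buy.
TEMPLATE = "Keep Calm and {verb} {adverb}"
scraped_verbs = ["Carry", "Dance", "Code"]      # stand-ins for mined verb clusters
scraped_adverbs = ["On", "A Lot", "Quietly"]

listings = [
    {"slogan": TEMPLATE.format(verb=v, adverb=a), "manufactured": False}
    for v, a in product(scraped_verbs, scraped_adverbs)
]

def purchase(listing):
    # Only the click collapses the simulation into a physical object.
    listing["manufactured"] = True
    return listing

print(len(listings))  # 9 never-printed T-shirts, each one click from existing
```

No human reviews the nine slogans before they are listed; the catalogue is the Cartesian product itself, which is how an egregious combination can sit on a storefront that no one ever wrote down.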
Myth 5: Algorithms Are Extensive
In order to seem all-powerful, unstoppable, and inevitable, the algorithmic propagates the myth that algorithms are extensive and expansive—that they reach out to countless spaces and people, connect and build transfers between disparate systems, and interface seamlessly with other algorithms. However, to the extent that they are enforced by specific powers, algorithms are fiercely introverted. They are designed to nullify romantic ideas of consolidation, connection, and continuity. They exist in order to create difference, record this difference, analyze its patterns, predict correlations, and feed these predictions back into the system as targeted actions or package these predictions as commodified user data. Proprietary algorithms do not cross borders—they build boundaries, and in the creation of these boundaries, they create echo chambers that mirror our behaviors and personalities back to us in precisely scripted interactions with people like us. With words like “share,” they refer to the external, but their effects are always internal, and their contents only transcend their limits in the form of instruments of profit or control. Otherwise, the origin and destination of the task is within the system, and algorithms are primarily the measure of themselves. For many of us engaged in big data corpus analysis, relying on different algorithms to mine databases in order to create representations that we understand, it is clear that we are dealing with an information system where the only mode of making sense of data is through the algorithm—and that any interventions in the processes of the algorithm can only be made through another algorithm. This self-referential cybernetic loop is at the heart of algorithmic beings, and thus we need to question the extensive quality of algorithms.
The NERVE myths form the algorithmic. The algorithmic is more than the algorithm and its evaluation by its own metrics. It is a way of encountering the algorithm, neither with blind faith nor despair, but with the nerve to ask critical questions about algorithms—of both their origin stories and their promissory notes. The NERVE of the algorithmic is to decentralize the algorithm as a fetish, to open up its blackbox and create a vocabulary that resists its mythopoetic processes, in order to generate an algorithmic encounter that, liberated from the effects of anxiety, can shift its focus from the life of algorithms to the algorithms of life.”