Algorithms are everywhere, supposedly. We are living in an “algorithmic culture,” to use the author and communication scholar Ted Striphas’s name for it. Google’s search algorithms determine how we access information. Facebook’s News Feed algorithms determine how we socialize. Netflix’s and Amazon’s collaborative filtering algorithms choose products and media for us. You hear it everywhere. “Google announced a change to its algorithm,” a journalist reports. “We live in a world run by algorithms,” a TED talk exhorts. “Algorithms rule the world,” a news report threatens. Another upgrades rule to dominion: “The 10 Algorithms that Dominate Our World.”
Here’s an exercise: The next time you hear someone talking about algorithms, replace the term with “God” and ask yourself if the meaning changes. Our supposedly algorithmic culture is not a material phenomenon so much as a devotional one, a supplication made to the computers people have allowed to replace gods in their minds, even as they simultaneously claim that science has made us impervious to religion.
It’s part of a larger trend. The scientific revolution was meant to challenge tradition and faith, particularly a faith in religious superstition. But today, Enlightenment ideas like reason and science are beginning to flip into their opposites. Science and technology have become so pervasive and distorted, they have turned into a new type of theology.
The worship of the algorithm is hardly the only example of the theological reversal of the Enlightenment—for another sign, just look at the surfeit of nonfiction books promising insights into “The Science of…” anything, from laughter to marijuana. But algorithms hold a special station in the new technological temple because computers have become our favorite idols.
In fact, our purported efforts to enlighten ourselves about algorithms’ role in our culture sometimes offer an unexpected view into our zealous devotion to them. The media scholar Lev Manovich had this to say about “The Algorithms of Our Lives”:
Software has become a universal language, the interface to our imagination and the world. What electricity and the combustion engine were to the early 20th century, software is to the early 21st century. I think of it as a layer that permeates contemporary societies.
This is a common account of algorithmic culture, that software is a fundamental, primary structure of contemporary society. And like any well-delivered sermon, it seems convincing at first. Until we think a little harder about the historical references Manovich invokes, such as electricity and the engine, and how selectively those specimens characterize a prior era. Yes, they were important, but is it fair to call them paramount and exceptional?
It turns out that we have a long history of explaining the present via the output of industry. These rationalizations are always grounded in familiarity, and thus they feel convincing. But mostly they are metaphors. Here’s Nicholas Carr’s take on metaphorizing progress in terms of contemporary technology, from the 2008 Atlantic cover story that he expanded into his bestselling book The Shallows:
The process of adapting to new intellectual technologies is reflected in the changing metaphors we use to explain ourselves to ourselves. When the mechanical clock arrived, people began thinking of their brains as operating “like clockwork.” Today, in the age of software, we have come to think of them as operating “like computers.”
Carr’s point is that there’s a gap between the world and the metaphors people use to describe that world. We can see how erroneous or incomplete or just plain metaphorical these metaphors are when we look at them in retrospect.
Take the machine. In his book Images of Organization, Gareth Morgan describes the way businesses are seen in terms of different metaphors, among them the organization as machine, an idea that forms the basis for Taylorism.
We can find similar examples in computing. For Larry Lessig, the accidental homophony between “code” as the text of a computer program and “code” as the text of statutory law becomes the fulcrum on which his argument that code is an instrument of social control balances.
Each generation, we renew the belief that we’ve reached the end of this chain of metaphors, even though history always proves us wrong, precisely because there’s always another technology or trend offering a fresh metaphor. Indeed, an exceptionalism that favors the present is one of the ways that science has become theology.
In fact, Carr fails to heed his own lesson about the temporariness of these metaphors. Just after having warned us that we tend to render current trends into contingent metaphorical explanations, he offers a similar sort of definitive conclusion:
Today, in the age of software, we have come to think of them as operating “like computers.” But the changes, neuroscience tells us, go much deeper than metaphor. Thanks to our brain’s plasticity, the adaptation occurs also at a biological level.
As with the machinic and computational metaphors that he critiques, Carr settles on another seemingly transparent, truth-yielding one. The real firmament is neurological, and computers are futzing with our minds, a fact provable by brain science. And actually, software and neuroscience enjoy a metaphorical collaboration thanks to artificial intelligence’s idea that computing describes or mimics the brain. Computing-as-thought reaches the rank of religious fervor when we choose to believe, as some do, that we can simulate cognition through computation and achieve the singularity.
* * *
The metaphor of mechanical automation has always been misleading anyway, with or without the computation. Take manufacturing. The goods people buy from Walmart appear safely ensconced in their blister packs, as if magically stamped out by unfeeling, silent machines (robots—those original automata—themselves run by the tinier, immaterial robots we call algorithms).
But the automation metaphor breaks down once you bother to look at how even the simplest products are really produced. The photographer Michael Wolf’s images of Chinese factory workers and the toys they fabricate show that bringing consumer goods to completion requires intricate, repetitive human effort.
Eyelashes must be glued onto dolls’ eyelids. Mickey Mouse heads must be shellacked. Rubber ducky eyes must be painted white. The same sort of manual work is required to create more complex goods too. Like your iPhone—you know, the one that’s designed in California but “assembled in China.” Even though injection-molding machines and other automated devices help produce all the crap we buy, the metaphor of the factory-as-automated-machine obscures the fact that manufacturing isn’t as machinic or as automated as we think it is.
The algorithmic metaphor is just a special version of the machine metaphor, one specifying a particular kind of machine (the computer) and a particular way of operating it (via a step-by-step procedure for calculation). And when the algorithm’s workings remain unseen, we are able to invent a transcendental ideal for it. The canonical algorithm is not just a model sequence but a concise and efficient one. In its ideological, mythic incarnation, the ideal algorithm is thought to be some flawless little trifle of lithe computer code, processing data into tapestry like a robotic silkworm. A perfect flower, elegant and pristine, simple and singular. A thing you can hold in your palm and caress. A beautiful thing. A divine one.
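The ideal is not baseless. Consider Euclid’s method for finding the greatest common divisor of two numbers, perhaps the most canonical algorithm there is. Sketched in Python, it really is a flawless little trifle:

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b)."""
    while b:
        a, b = b, a % b
    return a

print(gcd(1071, 462))  # 21
```

Held up on its own, a procedure like this really is elegant, pristine, even beautiful.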
But just as the machine metaphor gives us a distorted view of automated manufacture as prime mover, so the algorithmic metaphor gives us a distorted, theological view of computational action.
“The Google search algorithm” names something with an initial coherence that quickly scurries away once you really look for it. Googling isn’t a matter of invoking a programmatic subroutine—not on its own, anyway. Google is a monstrosity. It’s a confluence of physical, virtual, computational, and non-computational stuffs—electricity, data centers, servers, air conditioners, security guards, financial markets—just like the rubber ducky is a confluence of vinyl plastic, injection molding, the hands and labor of Chinese workers, the diesel fuel of ships and trains and trucks, the steel of shipping containers.
Once you start looking at them closely, every algorithm betrays the myth of unitary simplicity and computational purity. You may remember the Netflix Prize, a million-dollar competition to build a better collaborative filtering algorithm for film recommendations. In 2009, the company closed the book on the prize, adding a faux-machined “completed” stamp to its website.
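For those who have never seen one, a collaborative filter is, at its computational heart, a modest procedure: guess a viewer’s rating from the ratings of similar viewers. Here is a toy user-based sketch with made-up data and cosine similarity; the prize-winning entry, by contrast, famously blended hundreds of models, which rather foreshadows the point:

```python
import math

# Made-up ratings for illustration: user -> {film: stars}.
ratings = {
    "ana":  {"Alien": 5, "Amelie": 1},
    "ben":  {"Alien": 4, "Amelie": 2, "Heat": 5},
    "cleo": {"Alien": 1, "Amelie": 5, "Heat": 2},
}

def similarity(u, v):
    """Cosine similarity over the films both users have rated."""
    shared = ratings[u].keys() & ratings[v].keys()
    if not shared:
        return 0.0
    dot = sum(ratings[u][f] * ratings[v][f] for f in shared)
    norm_u = math.sqrt(sum(ratings[u][f] ** 2 for f in shared))
    norm_v = math.sqrt(sum(ratings[v][f] ** 2 for f in shared))
    return dot / (norm_u * norm_v)

def predict(user, film):
    """Similarity-weighted average of other users' ratings for the film."""
    votes = [(similarity(user, v), r[film])
             for v, r in ratings.items() if v != user and film in r]
    weight = sum(s for s, _ in votes)
    return sum(s * stars for s, stars in votes) / weight if weight else 0.0

print(round(predict("ana", "Heat"), 2))  # ana resembles ben, so roughly 4
```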
But as it turns out, that method didn’t really improve Netflix’s performance very much. The company ended up downplaying the ratings and instead using something different to manage viewer preferences: very specific genres like “Emotional Hindi-Language Movies for Hopeless Romantics.” Netflix calls them “altgenres.”
While researching an in-depth analysis of altgenres published a year ago at The Atlantic, Alexis Madrigal scraped the Netflix site, downloading all 76,000+ micro-genres using not an algorithm but a hacked-together, long-running screen-scraping apparatus. After acquiring the data, Madrigal and I organized and analyzed it (by hand), and I built a generator that allowed our readers to fashion their own altgenres based on different grammars (like “Deep Sea Forbidden Love Mockumentaries” or “Coming-of-Age Violent Westerns Set in Europe About Cats”).
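The generator, for what it’s worth, was no deep magic either. A toy version might look like the following sketch, with an invented slot vocabulary standing in for the grammar we actually recovered:

```python
import random

# Invented slot vocabularies for illustration; the real grammar was
# reconstructed by hand from Netflix's own genre strings.
ADJECTIVES = ["Emotional", "Violent", "Forbidden-Love", "Coming-of-Age"]
GENRES = ["Westerns", "Mockumentaries", "Hindi-Language Movies"]
QUALIFIERS = ["Set in Europe", "About Cats", "for Hopeless Romantics"]

def altgenre():
    """Fill the grammar's slots at random to mint a plausible micro-genre."""
    return " ".join([
        random.choice(ADJECTIVES),
        random.choice(GENRES),
        random.choice(QUALIFIERS),
    ])

print(altgenre())  # e.g. "Violent Westerns Set in Europe"
```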
Netflix VP Todd Yellin explained to Madrigal why the process of generating altgenres is no less manual than our own process of reverse engineering them. Netflix trains people to watch films, and those viewers laboriously tag the films with lots of metadata, including ratings of factors like sexually suggestive content or plot closure. These tailored altgenres are then presented to Netflix customers based on their prior viewing habits.
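The computational step in that pipeline can be sketched in a few lines, which is rather the point: everything surrounding it is human labor. A hypothetical matching step, with invented films, tags, and weights:

```python
# Hypothetical data, not Netflix's: human taggers supply the film tags,
# and the viewer's taste profile is inferred from prior viewing.
film_tags = {
    "Film A": {"romantic": 0.9, "gritty": 0.1},
    "Film B": {"romantic": 0.2, "gritty": 0.9},
}
viewer_taste = {"romantic": 0.8, "gritty": 0.2}

def score(film):
    """Dot product of a film's tag weights with the viewer's tastes."""
    return sum(weight * viewer_taste.get(tag, 0.0)
               for tag, weight in film_tags[film].items())

print(max(film_tags, key=score))  # "Film A": the matching is trivial; the tagging is the work
```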
Despite the initial promise of the Netflix Prize and the lurid appeal of a “million dollar algorithm,” Netflix operates by methods that look more like the Chinese manufacturing processes Michael Wolf’s photographs document. Yes, there’s a computer program matching viewing habits to a database of film properties. But the overall work of the Netflix recommendation system is distributed amongst so many different systems, actors, and processes that only a zealot would call the end result an algorithm.
The same could be said for data, the material algorithms operate upon. Data has become just as theologized as algorithms, especially “big data,” whose name is meant to elevate information to the level of celestial infinity. Today, conventional wisdom would suggest that mystical, ubiquitous sensors are collecting data by the terabyteful without our knowledge or intervention. Even if this is true to an extent, examples like Netflix’s altgenres show that data is created, not simply aggregated, and often by means of laborious, manual processes rather than anonymous vacuum-devices.
Once you adopt skepticism toward the algorithmic- and the data-divine, you can no longer construe any computational system as merely algorithmic. Think about Google Maps, for example. It’s not just mapping software running via computer—it also involves geographical information systems, geolocation satellites and transponders, human-driven automobiles, roof-mounted panoramic optical recording systems, international recording and privacy law, physical- and data-network routing systems, and web/mobile presentational apparatuses. That’s not algorithmic culture—it’s just, well, culture.
* * *
If algorithms aren’t gods, what are they instead? Like metaphors, algorithms are simplifications, or distortions. They are caricatures. They take a complex system from the world and abstract it into processes that capture some of that system’s logic and discard others. And they couple to other processes, machines, and materials that carry out the extra-computational part of their work.
Unfortunately, most computing systems don’t want to admit that they are burlesques. They want to be innovators, disruptors, world-changers, and such zeal requires sectarian blindness. The exception is games, which willingly admit that they are caricatures—and which suffer the consequences of this admission in the court of public opinion. Games know that they are faking it, which makes them less susceptible to theologization. SimCity isn’t an urban planning tool, it’s a cartoon of urban planning. Imagine the folly of thinking otherwise! Yet, that’s precisely the belief people hold of Google and Facebook and the like.
Just as it’s not really accurate to call the manufacture of plastic toys “automated,” it’s not quite right to call Netflix recommendations or Google Maps “algorithmic.” Yes, true, there are algorithms involved, insofar as computers are involved, and computers run software that processes information. But that’s just a part of the story, a theologized version of the diverse, varied array of people, processes, materials, and machines that really carry out the work we shorthand as “technology.” The truth is as simple as it is uninteresting: The world has a lot of stuff in it, all bumping and grinding against one another.
I don’t want to downplay the role of computation in contemporary culture. Striphas and Manovich are right—there are computers in and around everything these days. But the algorithm has taken on a particularly mythical role in our technology-obsessed era, one that has allowed it wear the garb of divinity. Concepts like “algorithm” have become sloppy shorthands, slang terms for the act of mistaking multipart complex systems for simple, singular ones. Of treating computation theologically rather than scientifically or culturally.
This attitude blinds us in two ways. First, it allows us to write off any kind of computational social change as predetermined and inevitable. It gives us an excuse not to intervene in the social shifts wrought by big corporations like Google or Facebook or their kindred, and to see their outcomes as beyond our influence. Second, it makes us forget that particular computational systems are abstractions, caricatures of the world, one perspective among many. The first error turns computers into gods; the second treats their outputs as scripture.
Computers are powerful devices that have allowed us to mimic countless other machines all at once. But when pushed to its limits, that capacity to simulate anything reverses into an inability or unwillingness to distinguish one thing from another. In its Enlightenment incarnation, the rise of reason represented not only the ascendancy of science but also the rise of skepticism, of incredulity at simplistic, totalizing answers, especially answers that made appeals to unseen movers. But today, even as many scientists and technologists scorn traditional religious practice, they unwittingly invoke a new theology of their own.
Algorithms aren’t gods. We need not believe that they rule the world in order to admit that they influence it, sometimes profoundly. Let’s bring algorithms down to earth again. Let’s keep the computer around without fetishizing it, without bowing down to it or shrugging away its inevitable power over us, without melting everything down into it as a new name for fate. I don’t want an algorithmic culture, especially if that phrase just euphemizes a corporate, computational theocracy.
But a culture with computers in it? That might be all right.