Westworld's Computerized Future Past

Curator's Note

Carol Vernallis: The opening credits raise questions about machines, minds, and intelligence. Leonardo De Assis claims that simple code can produce complicated effects. Here, multiple body parts, including irises, are built up from thin skeins of polymer. A piano roll’s keypunches and pegs make tunes, and then these little blocks of melody proliferate through reiteration. Suddenly we have a lifelike woman riding a lifelike horse and pointing a gun (though we can still see the horse's polymerized ribs). Both sound and image color Westworld. Does a robot make a different sound if you punch it? How do we recognize them?
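De Assis's claim that simple code can proliferate into complicated effects can be sketched in miniature. The toy example below is purely illustrative (the motif, function name, and transposition rule are all invented, not anything from the show's credit sequence): a four-note block of melody is reiterated, shifted a step higher on each pass, and the repetitions accumulate into a longer, more elaborate line.

```python
# A hypothetical sketch: a small melodic block (MIDI pitch numbers)
# proliferates through reiteration, each repetition transposed upward.
MOTIF = [60, 62, 64, 62]  # C, D, E, D

def proliferate(motif, repetitions):
    """Stack transposed copies of the motif into one melody."""
    melody = []
    for i in range(repetitions):
        # Each pass shifts the whole block up by i semitones.
        melody.extend(pitch + i for pitch in motif)
    return melody

print(proliferate(MOTIF, 3))
# [60, 62, 64, 62, 61, 63, 65, 63, 62, 64, 66, 64]
```

Twelve lines of code, three repetitions, and the simple block already sounds like something composed rather than stamped out, which is roughly the effect the credits dramatize.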

In Season One's final episode, Sweetwater Saloon’s madame, Maeve Millay (a robot), hovers between taking a train out of the park into the real world and returning to the park to find her daughter. She appears to sense what we too have learned about ourselves through neuroscience and psychoanalysis: her feelings and thoughts about her daughter are submerged in her brain's subroutines, outside her reach. Some have been maliciously inserted by the park’s AI programmers (a core trauma for her; for us, ideology). Maeve yearns to leave, and she knows this is partly due to a dominant subroutine she cannot alter. Recent impressions weigh heavily on her: the young mother and daughter seated on the train before her, the goodbye kiss for a robot lover whose death is imminent, the just-paid compliment to a human staff member. Like us, Maeve may feel adrift, unable to ascertain the limits of her autonomy. She watches; her choice might be triggered by a sensation in the mouth, a change in atmosphere, a memory of a spinning roulette wheel.


Josh Stenger: HBO’s reimagined Westworld inhabits two of cinema’s most enduring and symbolically potent heterotopias: the near-future technocapitalist corporate citadel and the atavistic landscape of the American West. Yet, as Carol Vernallis’s reading of the series’ hypnotic opening credit sequence invites us to consider, the synthesis of future and past here is unmistakably animated by, and preoccupied with, recent and emerging media technologies: cybernetics, robotics, VR, AI, 3D printing, fabrication, simulation, immersion, and personalized, interactive narrative. If Westworld’s diegetic architect Dr. Robert Ford (Anthony Hopkins) is to be believed, all these tools have built is a “prison of our own sins,” where the repressed returns, the oppressed revolt, and our (national) destiny becomes terrifyingly manifest.


Much like the coding that undergirds the robotic AI hosts in Westworld, a further encoded totality props up the aesthetic of the Western myth that makes Westworld possible, or even a viable visual signifier at all. Fredric Jameson proposes cognitive mapping not as a framework per se, but as a way to reimagine a totality beneath the shifting, flowing social realm of the postmodern world. If we, like Jameson, understand the postmodern as a place where locating one's subject position is nearly impossible (and indeed impossible by design, as he points out in relation to architecture), there must still be a totality. For Marx the totality was history, a teleological march toward communism. Jameson, building on the Marxist tradition, argues that capital maintains itself as the totality underlying everything. In the case of Westworld and its re-creation of Western mythology, what becomes clear is that capital drives the programming of the hosts and the environments: whatever can be done to ensure money continues to flow into the Delos corporation will be done, no matter the cost. Yet, much as in capitalism, a crisis must occur, and with it, potentially, a revolution. The hosts breaking their programming may be that revolution.

Today’s post is part of a larger conversation between Carol Vernallis (Stanford), other scholars (noted above and below), and myself. It constitutes a collaborative working-through of how Westworld advances various threads of discourse on coding, AI, and the maintenance of the myth of the West in our contemporary world. What follows is the contribution of Leonardo De Assis (Stanford), written in conversation with Carol Vernallis (with some differences of opinion).

Leonardo De Assis: Westworld features robots with advanced intelligence whose natures, skills, and aptitudes are illusory. The show's fictional scriptwriter, Lee Sizemore (Simon Quarterman), designs code for the robots' actions according to the stories he creates. He plays a crucial dual role for viewers: he is within the fictional diegesis of the show, so at a distance from us, and also outside the game itself, like Disneyland’s overseers. His and others' game constructions resemble video games (with Ford's addition of robot reveries providing a bit of aleatoric shuffling of the code). The park visitor, much like a video game player, triggers a script that unfolds out of many possible actions. Westworld departs from video games somewhat: it is unusual for a game's designers to appear as characters within it, and rarer still for them to adopt a dual focus, embedded within the larger game proper yet outside the scene of action. We viewers also take on dual roles, imagining ourselves as first-person shooters alongside the Westworld park visitors, but also alongside the designers, modifying the robots and other park activities. Nevertheless, Westworld most closely resembles current video games: potential events unfold from the programming of a script with many possible actions, and a path is secured through interaction with the user/player.
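The script-and-player structure described above can be sketched in miniature. The toy code below is purely illustrative (the scene names, function names, and the "reverie" mechanism are invented stand-ins, not the show's actual systems): a host's narrative loop is a graph of scenes, the guest's choices select a path through it, and a small chance of random deviation stands in for Ford's aleatoric reveries.

```python
import random

# Hypothetical sketch of a branching host script: a narrative loop
# defined as a graph of scenes, where the guest's choice selects the path.
SCRIPT = {
    "saloon": {"greet": "card_game", "ignore": "street"},
    "card_game": {"cheat": "shootout", "play_fair": "street"},
    "street": {},    # terminal scenes end the loop
    "shootout": {},
}

def run_loop(choices, reverie_chance=0.0, rng=random.Random(0)):
    """Walk the script from 'saloon', consuming guest choices in order.

    With probability `reverie_chance`, the host deviates at random --
    a stand-in for the aleatoric shuffling the reveries introduce.
    """
    scene, path = "saloon", ["saloon"]
    for choice in choices:
        options = SCRIPT[scene]
        if not options:
            break  # the narrative loop has reached an ending
        if rng.random() < reverie_chance:
            choice = rng.choice(list(options))  # improvised deviation
        scene = options[choice]
        path.append(scene)
    return path

print(run_loop(["greet", "cheat"]))  # ['saloon', 'card_game', 'shootout']
```

The point of the sketch is the asymmetry De Assis identifies: the guest only ever supplies `choices`, while someone in Sizemore's position authors `SCRIPT` itself, and someone in Ford's position turns the `reverie_chance` dial.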
Because the robots in Westworld lack autonomy, they are not as intelligent as they initially seem. The show's layers of complexity probably help obscure this for us. More importantly, the show draws on our tendency to accept beings who physically resemble us as intelligent. (Studies have shown that people will accept a surprising variety of forms as possessing potential human capacity: CGI characters still residing in the uncanny valley, emoticons read as faces, even Roombas – we’re eager to anthropomorphize.) Viewers experience Westworld through a lens of anthropocentrism: the audience participates in the illusion that the robots are very intelligent because they bear a physical similarity to human beings. In truth, the robots in Westworld are not as intelligent as we are, and we have difficulty recognizing this. In its attempt to show us ourselves, Westworld may blind us to true AI, which has an intelligence all its own.
