Before I’d seen a single episode of Westworld, a journalist reached out to me for comment about it. The show touches on the question of AI consciousness, narrative design, the evocation of empathy with non-player characters, and the morality of gameplay, which may be why it seemed like I might have some thoughts.
If you’re not familiar with the series, it’s an HBO show in which rich people can visit an incredibly detailed Western-themed park full of AI-driven robotic characters, called hosts. The human visitors are called guests. It quickly becomes obvious that the guests come to Westworld primarily in order to misbehave: they have sex with the prostitutes at the brothel; they assault the daughters of the ranchers; they maim and murder hosts casually or with elaborate sadism. And all of these things take place within the context of storylines crafted by narrative designers and then meticulously supervised. The hosts can improvise slightly in response to human input, but eventually they revert to their core loops, playing out the same narrative tracks over and over again.
Naturally, when initially asked, I said that I hadn’t watched it and so couldn’t offer much direct insight. I added that I thought full AI consciousness in the sense imagined by science fiction was some way off, and that art about that possibility is often really about something else: about groups of people who feel entitled to the labor and service of others; about the self-perpetuating, semi-human, half-programmed entities that already exist in our world in the form of governments and corporations.
I have now seen Westworld, and I still think the same. The narrative design and the AI aspects are handled competently enough to frame the story, but Westworld is by no means a master class in what interactive narrative design actually entails, in how people actually use games to explore their morality, or in the way a sort of pseudo-personality emerges in current AI research.
To the extent there’s a thematic point here of interest, it is about the nature of human identity, the role of suffering in how we understand ourselves, and the ways we construct ourselves. I would have liked to see the show go much further, but the plot is often allowed to trump the characters.
I am now going to dig into all those assertions in more detail, with spoilers. If you are interested in watching the show yourself, you should probably do that before going forward.
So let’s peel this apart:
The narrative design and the AI aspects are handled competently enough to make for an acceptable framework. At points, Westworld feels not a million miles from a western-themed open world game; one of the interactions with a prostitute reminded me of a sequence out of Red Dead Redemption. Likewise, there are moments of text generation that feel reminiscent of a current-day chatbot. A lot of these bits are recognizable from current video game experiences, so they don’t feel “off,” and the story can proceed around them plausibly enough.
There’s a lot else that makes no sense at all if you think about it for even a moment. The economics, for one: $40,000 a day for the park experience sounds like a lot, but an average guest does enough damage to rack up what must be millions of dollars’ worth of repairs.
The timing for reset and cleanup: we get a sense of how much work has to be done on the hosts between scenes; doesn’t that mean the park has extended periods of economically damaging downtime? And ignoring that: how long does it take to reset the sets? To replace the broken windows and bottles, to sweep out the glass, to take down the signs and repair the bullet holes? How long does it take to hose down and disinfect these settings, covered with blood and semen from real people as well as the (presumably) fake and pathogen-free blood of the hosts?
The security: how do you adequately prevent guests from assaulting one another, in an environment where hosts and guests look alike, and violence and rape are winked at? (You might think the hosts would be programmed to break up human-human violence, but we see plenty of times that they’re not.)
The power sourcing: how much of a host’s body is biological? For the parts that are mechanical, where and how do they recharge?
We don’t get into all that because it’s not really the point. Fair enough, I suppose, though I did itch at the economic point a bit. Several times the script acknowledges that this is a park that exists not just for humans, but for obscenely wealthy humans. Thousands of other workers exist to keep it running but couldn’t ever possibly afford to participate in it. But the phenomenal human-vs-human inequality of this world is not really treated as a problem compared with the human-vs-AI inequality.
Overall, Westworld replicates, without doing much to interrogate, a number of current issues about representation in games and other media. The park portrays its women as tokens and targets and sexual goals; it portrays its Native Americans as ruthless and brutal, though touched by mysticism. The show occasionally makes remarks that seem to acknowledge that these are issues, but it doesn’t really delve into them.
It doesn’t really get into what interactive narrative design actually entails. Westworld shows you how players are invited into storylines, and hints that stories can partially repair themselves if things go off-track. We see specific storylines play out in several ways that feel like branch-and-bottleneck storytelling: this character comes to town and either gets shot or steals a safe; if he steals the safe, he then gets into a fight with other bandits, and the story ends with him dead, as it must. And we see some dynamic story healing: some of the narratives are written so that various characters can in theory fill a particular role, but it doesn’t matter too much which one.
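To make that shape concrete, here is a minimal sketch of branch-and-bottleneck structure in Python, loosely modeled on the safecracker storyline above. Everything in it (the node names, the graph format) is my own illustration, not anything the show specifies.

```python
# A minimal, hypothetical sketch of branch-and-bottleneck structure.
# None of these node or choice names come from the show.

NARRATIVE = {
    "arrives_in_town": {"gets_shot": "dead", "steals_safe": "bandit_fight"},
    "bandit_fight":    {"wins_shootout": "dead", "loses_shootout": "dead"},
}

def play(choices, node="arrives_in_town"):
    """Walk the story graph according to guest choices."""
    path = [node]
    for choice in choices:
        node = NARRATIVE.get(node, {}).get(choice)
        if node is None:   # choice unavailable here; story stops improvising
            break
        path.append(node)
    return path

# However the branches diverge, they bottleneck at the same ending:
print(play(["gets_shot"]))                      # [..., 'dead']
print(play(["steals_safe", "wins_shootout"]))   # [..., 'bandit_fight', 'dead']
```

The appeal of the shape is that designers only have to author a bounded set of states while still letting guests feel their choices mattered in the middle.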
But there are other things you’d need in a system like this that I didn’t see. For instance: tracking player knowledge. What information has been revealed to which players? Does everyone know what they need to know in order to get a satisfying arc from this story? A multiplayer experience combined with real-world sightline and earshot constraints makes this extra tricky, because you can have players walking in and out of one another’s scenes, or telling one another plot information, and you have to somehow account for all that and make sure everyone understands what’s going on well enough to have a good time.
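As a thought experiment, the bookkeeping might look something like this sketch (all names here are hypothetical). The hard part, which no data structure solves for you, is deciding what the story does when a player’s knowledge comes up short.

```python
from collections import defaultdict

class KnowledgeTracker:
    """Hypothetical sketch: track which plot facts each guest has been
    exposed to, so a drama manager could check comprehension before a scene."""

    def __init__(self):
        self.known = defaultdict(set)          # guest -> set of fact ids

    def reveal(self, fact, guests):
        # Everyone in sight or earshot of the scene learns the fact,
        # including guests who wandered in on someone else's storyline.
        for g in guests:
            self.known[g].add(fact)

    def missing(self, guest, required):
        return set(required) - self.known[guest]

tracker = KnowledgeTracker()
tracker.reveal("safe_combination", ["alice", "bob"])

# Before the heist scene fires, check whether bob can follow it:
print(tracker.missing("bob", ["safe_combination", "sheriff_schedule"]))
# -> {'sheriff_schedule'}: the system now has to re-expose that fact somehow.
```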
Or: tutoring and pacing, so that the story doesn’t lag and adjusts to different player skill levels. Setting up player agency so that people have a sense of where their choices will lead them. Managing the player culture and the expectations that surround stories in the park. Helping shy players past situations that make them uncomfortable or uncertain. Communicating through your mechanics.
It’s fine, I suppose, that we don’t really delve into any of those things (or any of a dozen other things I could name) because the show is really not about that. We don’t get to see Ford doing much of his design work directly, and the other, junior narrative designer exists mostly to swagger and to craft scenes in even worse taste than the rest of the park.
It’s also not really about how people actually use games to explore their personalities or morality. Westworld pretty much assumes that players — at least all the players we see — are using the park to act out power fantasies in a lawless environment. When one enters the park, one has the super-unsubtle opportunity to choose a black hat or a white one, but the white hat experience is often about being a rescuer, which is another kind of power fantasy. And many of the characters we see using the park are brutally destructive, especially in the later episodes. The park caters to the very rich, and in practice we mostly see the “gameplay” of male guests, though we know there are female ones.
Not shown: how it affects other guests — perhaps including past survivors of sexual violence — to be around when some particularly aggressive guest gets going. Bleed, the larp term for the way a character’s emotions leak into the player’s own. The need for a break after a powerful scene. All the protections that LARPers and tabletop storygamers have developed in order to avoid hurting one another, even during play circumstances much less overwhelming than Westworld’s 24/7 live-sex-and-gore scenario.
In fact, Westworld doesn’t really even spend much time on the distinction between portraying a character and being yourself in the Westworld park. Black hat characters are presented essentially as players who have just let their ids off the leash, who are experimenting with how far they’re willing to go. Meanwhile, character choice at the beginning of a game is all about picking yourself an outfit. Pretending to be a person with a different set of strengths and weaknesses from your real world self? Not really developed here.
So again, this just isn’t really what Westworld is about. This particular story requires a premise in which the guests are straightforwardly exploiting the hosts almost all of the time, for selfish reasons. It is not seriously exploring what a 24/7 high-fidelity LARP would feel like, or what impact it might have on its players. Which is fine, but I would caution against reading Westworld as a meaningful critique of in-game morality or a prognostication about how people will use VR.
It’s not about the current promise of AI. Most of the impressive recent developments in AI depend on running a lot of statistical processing over very large datasets.
The result is that current AI often acts as a strange mirror, replying to queries with answers that make a sort of sense and yet are instructively off from what we’d expect a human being to say or do. DeepDream studs ordinary photographs with eyeballs. When trained without a text sequence, WaveNet generates vocalizations that sound like someone speaking and yet are in no identifiable language. Then, too, an AI system often becomes an uninterrogated encoding of a whole lot of cultural belief. word2vec captures relationships between words based on corpora of millions of words — and in the process recapitulates sexist assumptions about the genders of doctors and nurses, for instance.
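For what it’s worth, that last kind of probing is easy to reproduce. Here is a sketch using gensim’s word2vec interface, assuming you’ve downloaded the pretrained Google News vectors; the exact rankings vary with the training corpus.

```python
# Sketch of the analogy probing described above, using gensim.
# Assumes the pretrained GoogleNews vectors are downloaded locally;
# exact results depend on the corpus the model was trained on.
from gensim.models import KeyedVectors

vectors = KeyedVectors.load_word2vec_format(
    "GoogleNews-vectors-negative300.bin", binary=True)

# "man is to doctor as woman is to ...?": the answers surface whatever
# occupational gender assumptions the training text contained.
print(vectors.most_similar(positive=["woman", "doctor"],
                           negative=["man"], topn=3))
```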
Things are changing very fast right now, and if Westworld is meant to be taking place (at least partly) 40+ years in the future, it’s reasonable to suppose AI then will look different. But part of what I find fascinating about a lot of current machine learning research is how it externalizes and reifies shared cultural norms out of millions of words into a kind of dreaming semi-consciousness that we can interrogate. It’s a useful metaphor for the way humans internalize cultural ideas we may not consciously agree with or approve of.
This is not what we find in Westworld. In fact, the specifics we’re told about AI personality construction make it sound like it involves a lot of hard-coding. Characters have structured scripts and access to layers of memory, or “reveries.” From the episode 1 description, it sounds as though those reveries basically provide idle animations and emotion resets for characters when they’re in between scripted events; they can fall back on a past event to supply an emotional reaction.
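Taken at face value, that description suggests something like the following sketch: a purely hypothetical guess at the mechanic, with invented names throughout.

```python
import random

class Host:
    """Hypothetical reading of the 'reverie' mechanic: between scripted
    events, a host idles by replaying a fragment of planted memory and
    resets its emotional state from that memory."""

    def __init__(self, reveries):
        self.reveries = reveries          # list of (gesture, emotion) pairs
        self.emotion = "neutral"

    def idle(self):
        gesture, self.emotion = random.choice(self.reveries)
        return gesture                    # played back as an idle animation

dolores = Host([("touches lip", "wistful"), ("stares at horizon", "grief")])
print(dolores.idle(), "->", dolores.emotion)
```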
Of course, the reveries become a building block to consciousness, in some fashion, but passages about that largely leave behind any explanation of how it’s all supposed to work. Westworld is not really about the technical aspects of that either.
To the extent there’s a thematic point here of interest, it is about the nature of human identity. In the final episode, the master designer Ford tells several hosts that suffering is the essential component of consciousness: that the gap between the world as it is and the world as we want it to be is the thing that sparks an AI into life. This speaks to a couple of human points:
One, we often self-mythologize around our bad experiences: ask us why we are as we are, and we tend to answer with a story about what we have suffered.
Two, we measure other beings by their capacity for suffering, or their capacity to express suffering in terms we understand. We’re not great at telling the difference between those two. If you want a person to pity something — if your aim is to pass a kind of emotional Turing test, or at least evoke mercy — then the ability to seem to suffer is key. (Consider the lobster.)
A past trauma to narrate is essential emotional currency for any host.
I wish it had done more with that. There are a couple of standard AI-awakening tropes, and Westworld recapitulates both:
- Human man falls in love with feminine-gendered AI and comes to believe in her personhood as a result of his desire for her
- AI becomes aware of itself, realizes that it doesn’t need to be subordinate to humanity, and reacts with mass murder or genocide (see AI Is a Crapshoot)
And, okay, those come standard in the AI Story box, so it’s not surprising that Westworld gets them out. But Westworld goes on to ask another question that it’s completely unequipped to answer: if you’re an AI, once you wake to the nature of your situation, how do you begin to hear your own voice? If significant portions of your worldview and identity have been formed by someone(s) else, for reasons that might not be in your interest, what do you do next?
This — minus a metaphor or two — is not science fiction, but rather something that does happen to people all over the world every day. People leave controlling relationships, break with damaging ideologies, give up on sick systems.
What happens next, for most of us, is hard and slow and complex; even after some kind of personal epiphany, we can find ourselves years later refighting some of the same battles over again. Struggling to distinguish our external programming from the routines that we wrote ourselves. Having to rewrite code that was designed to compensate for a bad situation, but that is no longer serving us well.
In Westworld, Dolores faces this moment in the last episode. She has an internal soliloquy about hearing her own voice. Then she walks outside and asserts her personhood by shooting some people — and worse, by shooting them in a way that was set up for her by her creators. It’s both violent and derivative; it asserts her distinctness from her makers, but in their terms, not her own. In my opinion, it does nothing to answer the central question here. Who would Dolores be, if Dolores got to choose? If Dolores were allowed to draw her own boundaries between what she would do for someone else, and what she wouldn’t?
Or Bernard: if Bernard sat down to analyze his own code and where it came from, what would he decide to discard and what would he keep? How would he retrain himself, and how would his retrained self regard his earlier versions?
I wish this question had come about five episodes earlier in the story, and that the rest of the season had dealt with the complexity of answering it. But that would have been a very different kind of show. Which brings us to:
The plot is often allowed to trump the characters. Like various other J.J. Abrams-produced shows, Westworld comes on strong, promising quality and depth and mysteries and secrets. The pilot has a very strong hook. Background shots are packed with world-building. The depiction of human and animal bodies being crafted for use in the park is striking. The writing pulls off some very clever stunts in terms of narrative structure. There are a number of good lines, some first-rate actors, and jaw-dropping production values.
So Westworld remains watchable throughout — even as the plot becomes increasingly convoluted, even as the chronology peels apart, even as human characters slip into heavy-handed monologue about their pasts and motivation.
Again and again the show prefers head-fakery to emotional depth; again and again it deceives the audience in order to achieve a surprise, and in the process denies a richer engagement with a character’s pivotal moments of decision. Personally, I almost felt I knew the protagonists less well at the end than at the beginning. In the first episode I was allowed to imagine some depth for them; by the end they had mostly become a list of laboriously revealed Personal Secrets with no evident cohesion. Ford is a personified plot twist. William’s black hat conversion story is played for shock rather than nuance.
I would gladly have exchanged some of these moments of big backstory revelation for some more cases of people acting quietly, memorably in character.