Warning: Spoilers Ahead for Westworld, Doki Doki Literature Club, and Soma

While Westworld doesn’t directly reference any video games, its core conceit is built around the way games work. Westworld is the most open of open worlds: a hundred square miles of towns, open wilderness, and countless stories to discover. People pay exorbitant amounts to spend just days in the park, but in return they get unprecedented freedom. They can play the hero or live the life of an outlaw, romance partners or shoot their way through saloons. Every guest is at the center of their own experience, one totally removed from the consequences of real life.

Sound familiar?

The show’s creators have been forthright about their video game inspirations: BioShock, GTA, and Red Dead Redemption are touchstones for the series.

But Westworld, the show, isn’t solely interested in the park’s visitors. The majority of its twisting plotlines center on the permanent residents: the robotic-yet-organic hosts.

Programmed to move on a predetermined route, the hosts of the park are intended to perform the same actions day after day, year after year, unless a human interferes. They’re predictable but reactive, lifelike while also catering to the customer’s every whim. In short, they’re perfect NPCs.

At least, that’s what they’re supposed to be. But well-oiled machines don’t make for interesting storytelling. Therefore, Westworld’s central narrative focuses on these hosts becoming aware of their own programmed nature. In their exploration of a fundamentally broken system, the show runs headlong into questions and topics brought up by decades of games before it. Here, we’ll take a look at the major themes of the show, and how those themes intersect with the wider world of video games.

A.I. Consciousness

Perhaps the biggest question the series asks is how to define a being’s consciousness, and whether that consciousness entitles it to the same rights and respect we afford our fellow humans.

Westworld’s hosts are largely indistinguishable from human beings. They fall in love, experience pain, and yearn for freedom. However, they’re also constrained by the limits of the park. With few exceptions, the hosts can’t venture out into the wider world; C4 planted in their spines means they’ll literally explode.

Last year’s surprise hit Doki Doki Literature Club explores similar themes of A.I. consciousness. Although the game starts out as a bog-standard visual novel, it quickly twists into a horrifying story of obsession and abuse. Ultimately, one character, Monika, is upset with how the game is programmed. Although it’s billed as a romance where you, the player, could end up with anyone, the script never has you falling for Monika.

Monika refuses to accept this, though. She fights the script, changing the other characters’ programming in ways that ultimately result in their deaths. Monika even goes so far as to “delete their files” from the game, leaving the rest of the game gone and you stuck in a room with her.

Monika struggles against the boundaries of what the game allows. Although she’s “awake,” she can only do so much, because her consciousness is defined by the same system that imprisons her. Human characters on Westworld have repeatedly told the hosts that they’ll never survive outside the park. Whether this is a physical limitation – the spinal explosives – or a broader statement about a program’s ability to exist beyond its defined barriers is anyone’s guess.

Virtual Human Consciousness

Consciousness isn’t a concept reserved for the hosts, however. One of this season’s biggest reveals was the Forge, a server farm where a perfect digital recreation of every single human guest is kept. The park harvested their genetic material (it’s best we don’t get too deep into how), and continuously scanned their brains while each guest interacted with the systems in the park. The combination of these data-collection techniques allows Westworld to keep a perfect virtual simulation of every visitor. Even if a character “dies,” they can be conscious and aware in the Forge, separated from their physical body.

The concept of an exact brain copy is an existential can of worms terrifyingly explored by Soma. The horror title takes place in the distant future, where the only humans left alive are in a tiny lab deep under the ocean. Except … are they really alive? As Soma progresses, more and more questions bubble to the surface. The protagonist of the game has no organic body; he’s simply a copy of a brain scan taken a hundred years earlier. There are dozens of these brain scans in the lab, each one containing the thoughts and memories of a human being, most confined to computer terminals or robot bodies. Is this how humanity survives? By relinquishing everything except a computer bank that holds our minds?

Even more disturbingly, the game requires you to “wake up” these scans and then shut them off several times – is this equivalent to killing someone? You’re directly responsible for ceasing their brain activity, and since no one has a physical body anymore, it’s not as if they’re any less human than you. Westworld plays with this exact concept: when testing virtual human consciousnesses for “fidelity,” the park creates and destroys countless copies of fully awake and aware beings.

At multiple points in Soma, you have to port your consciousness into a different robot body. But the mind-transferring process isn’t a cut-paste, it’s a copy-paste. The original body, with your consciousness, is also alive. You have the choice to kill it, but what gives you more of a right to life than this other version of you? Is the act suicide or murder?

Soma doesn’t just call into question what defines a human; it also reveals potential hypocrisy in our definitions of A.I. If we considered Monika or Dolores less than human because they couldn’t leave the limits of their programming, would that make these scans similarly inhuman?

Corporate Exploitation

Westworld isn’t just a place for people to act out their twisted fantasies; it’s a money-making machine. As steep as the cost per day is for guests, the true value of the park comes from its data collection. Google, Facebook, Amazon – we know that websites keep tabs on everything we do, but Westworld’s perfectly simulated human consciousnesses are a step beyond.

A common refrain on the show has been that, in the park, people show you “who they truly are.” Without oversight, without the consequences of everyday life, people theoretically let their guards down and offer a look at their true selves. If this concept is true (and I’m skeptical), then the ways players interact with their environment in games today would also be an invaluable resource for gathering data.

We know developers and publishers are thinking about player psychology. Activision made headlines last year with a patent for a matchmaking system designed to encourage microtransactions. The theory is that players thrown into matches against opponents with expensive gear and better weapons would be more likely to spend money themselves. Players wouldn’t know this matchmaking was happening; they’d be subconsciously swayed by the game to spend more so they could perform better – at least in theory.

All things considered, Activision’s patent is relatively small potatoes compared to the information games could be leveraging. Players in economy-focused games like Eve Online buy and sell items thousands of times a day, using both real and in-game currency. CCP Games could be using the record of those transactions to build a detailed consumer profile of each player: when they spend, the risks they take, what persuades them. One can only imagine the potential for corporate exploitation if those profiles were sold to insurance companies or internet providers.

Legacy Servers

In arguably the most emotionally affecting story to date, the eighth episode of this season followed the Native American host Akecheta over decades of his life in the park. Although hosts are supposed to be regularly updated with new operating systems and programming, Akecheta intentionally avoided contact with park technicians. By doing so, he preserved his memories and original programming for years after ordinary hosts would have been wiped clean.

Losing old data is unfortunately common in gaming. MMOs like World of Warcraft constantly shift and remake their worlds through patches and updates, ostensibly improving the user experience but destroying old parts of the world in the process. In response, players who missed the “old world” have tried to preserve it through legacy servers, a practice that drew ire from the game’s developers.

Like Akecheta, the people on these unofficial servers were trying to operate outside the system, avoiding the relentless march of progress. Players have a hard time saying goodbye to content they’re emotionally connected to, but an even more harrowing prospect is the experience of living inside the content being changed. The patches designed to improve the guests’ experience tear apart families, reprogram hosts, and render huge swathes of the park unrecognizable.

Akecheta’s struggle not to lose himself to modernization is reminiscent of the final days of Halo 2’s original Xbox servers. When Microsoft shut down the original Xbox’s online service, players kept Halo 2 alive by staying connected for weeks after the announced end date. These players were holding on to a virtual world left behind by its creators, a dramatic last stand against an unstoppable tide.

The show’s concept is next to impossible. And yet, the philosophical and moral questions it raises are directly relevant to the conflicts facing games today. On the surface, Westworld, like any video game, isn’t “real.” But judging things only by what’s “real” is a pretty limited lens to view the world through, isn’t it?

Source: Game Informer, “How Games Further Explore The Ideas Of Westworld”