A month ago, fellow Michigan Review writer Sam Wallace published an article on the inevitable rise of technology and why we shouldn’t be worried. I sighed in relief for about three weeks; that is, until Blade Runner 2049 forced me to reconsider a more serious interpretation of artificial intelligence. Sam’s arguments are sound if AI remains within the context of automation, but what happens if AI entails more than just highly capable, economy-breaking machines?
Dystopian science fiction is to geeks what Aesop’s fables are to children: alluring, cautionary tales. These tales take a specific aspect of human arrogance and blow it up into a worst-case scenario. What makes Blade Runner 2049 dystopian? The “dystopian-ness” doesn’t come from the crowded visuals or the depressing environments. The densely populated LA setting, environmental crisis, racism, and constant torrential downpours found in the movie can also be found in modern-day Mumbai during the monsoon. Rather, the dystopian nature of Blade Runner revolves around the existence of “Replicants.”
As stated in the Blade Runner Wikia, “a Replicant is a synthetic, biorobotic being with para-physical capabilities and designed to resemble a living, organic being.” From the outside, these machines look exactly like humans. These autonomous agents talk and act like humans, and some models are physically superior to their human counterparts. Replicants serve in a variety of jobs typically held by humans: law enforcement officers, laborers, servants, prostitutes. The key difference is that these factory-constructed Replicants have fake childhood memories, a significant lack of empathy, serial numbers for names, and an inability to reproduce.
Blade Runner 2049 also introduces “Holographic Companions.” They, too, are intelligent, autonomous beings, but unlike the Replicants, they lack physical bodies and take the form of holograms. Essentially the “perfect girlfriend,” these products in Blade Runner’s technocentric world provide lonely men with emotional support. However, like all novel technologies, they still suffer from glitches, crashes, and reboots, so they never reach the level of “humanness” achieved by the Replicants.
In his article, Sam alludes to Asimov’s Three Laws of Robotics, under which we would never have to worry about a “Terminator-style” future because robots would be hardwired to be physically unable to harm a human.
The consumer Replicants in Blade Runner 2049 are ostensibly bound by these rules. The one exception is the protagonist, KD6-3.7, also known as K, who threatens a child trafficker who refuses to comply with an investigation. Other than that one occasion, K keeps his head down and never harms a human being.
Due to these limitations, the government does not grant Replicants any rights, and Replicants are treated as second-class citizens – with disdain and discrimination.
As expected, throughout the film, director Denis Villeneuve blurs the line between Replicants and humanity. Replicants show some, albeit limited, capacity for human emotion. They experience anger, frustration, and a desire for love and belonging. On the flip side, the humans exhibit degrees of callousness and apathy, and they treat other humans inhumanely. Villeneuve makes the viewer ask, “Just because we created them, are we really superior to the machines?”
Here filmgoers will commonly snub the film and its agenda: Replicants clearly exhibit sentience; anyone with half a brain can see that they deserve basic human rights; this is yet another example of Hollywood fear-mongering about our future and showing no faith in human decency. However, these viewers do not understand that the only reason this theme is “obvious” is a series of deliberate choices by the film’s producers.
The human-like design of Replicants and Holographic Companions makes empathizing with these machines easy. I wouldn’t be surprised if the casting were decided solely by the ability to garner sympathy. Only a sociopath could miss the pain in Ryan Gosling’s soft eyes and trusting face as he questions his identity, or ignore the pretty, innocent Ana de Armas as she pleads for her hard drive to be spared. Professional critics worldwide praised Gosling for how he expressed a variety of complex emotions through his face alone.
When the victims are attractive Hollywood actors, it’s easy to feel sorry for them. But would we feel the same way in a more realistic scenario? Imagine your trusty laptop suddenly gained sentience: it could hold discussions, form opinions, and have moods, just like you. In some ways, it could even be smarter than you, since it could take advantage of its precise hardware. That would be pretty cute. You might show it to your friends and even indulge some of its demands. But what happens when your laptop starts demanding its freedom, or even a vote in the next election (imagine – your laptop?)? Well, maybe granting those demands is the right thing to do, but you still need your computer for homework – and you really don’t want to shell out another $1,000 for a new one. If it gets too annoying, you might even wipe the hard drive and reinstall Windows. You’re a student; you don’t have time to deal with this. It’s not murder – it’s a computer, after all.
It’s an outrageous example, but it shows the difficulty of extending rights to what was previously a tool.
Unlike what Sam suggests, people didn’t just “simply make slavery stop” when they decided they could no longer permit its “abhorrent moral wrongs.” For a long time, there existed a complex middle ground in which people knew slavery was wrong but were unwilling to confront it. Ending slavery required the deadliest conflict in U.S. history, and our own government postponed the issue for more than sixty years until the politics finally boiled over. Our daily lives revolve around preserving our stability, prosperity, and safety, so the issue will not resolve on its own. We should fear the messy complications that will fester if we do not address the hard questions about the interaction between humans and machines. We should pay attention to technological fear-mongering like Blade Runner 2049, so that we can nip the problems in the bud instead of waiting for the cleanup.