The past year or so has seen an incredible explosion of interest in "artificial intelligence" (AI). Much of the discussion has focused on potential threats to humanity from the machines - job losses, Terminator-style dystopian fantasies about the rise of SkyNet, hackers.
Some of these threats seem more real than others (spoiler: the machines do not have to replicate human intelligence; they just have to be good enough. So yes, most of the jobs of today are in fact at risk).
But one thing I find intriguing is a question that goes largely undiscussed.
What do we owe our creations?
Decades ago, the film Blade Runner was released. It is based on the dystopian novel Do Androids Dream of Electric Sheep? by the science fiction writer Philip K. Dick.
The (brief) plot-line is that there is a race of androids called "replicants" that have been engineered with narrowly circumscribed super-human skills. For various reasons, the machines are banned from Earth, in part (I suspect) because of the apparent threats that they pose to people. The anti-hero of the movie is a bounty hunter named Deckard (played in the film by Harrison Ford). It is the job of "blade runners" to eliminate any replicants that alight on the Earth. In some cuts of the movie (though not in the novel), it is strongly implied that Deckard himself is actually a replicant.
A question posed by the movie - and one posed with increasing urgency in our AI-evolving world - is whether it is moral for blade runners to hunt and kill replicants.
What do we owe our creations?
I’ve not seen the original Blade Runner in several years, but the question of whether it is “morally right” to kill the replicants - and the more existential question of what rights the replicants themselves have - is, to my mind, a terribly complex one.
Now, I’ve read some of Philip K. Dick’s work. The question of what defines humanity itself is embedded in the story. I don’t recall clearly whether the replicants in the film are revealed to be fully human, cyborgs (cybernetic organisms - part organic and part mechanical), or fully robots. In the book Do Androids Dream of Electric Sheep?, the “replicants” (there called androids) are mechanical in nature.
It reminds me a bit as well of the movie AI: Artificial Intelligence. In that story, also set in a dystopian future, human beings have created machines that increasingly simulate them. Like the replicants in Blade Runner, the machines are designed to fulfil specific “needs” of people. There are workers, sex robots, and of course the protagonist of the story, David, who simulates a child for a couple whose natural-born human child is in a coma.
Both stories, to me, raise questions:
What defines humanity? Is it our physical, outward appearance? Our sentience? Our ability to feel (or simulate) emotions and empathy? Is it simply our DNA?
What responsibilities do we as creators owe the things that we create? I think about this from time to time in a theological sense - as a believing Christian, I hold that human beings were created by God Himself in His image. Other monotheistic religions have similar systems of belief.
If God made man, what does God owe mankind?
The people in both Blade Runner and AI have created nearly perfect simulations of themselves, placed them on the Earth, and bear responsibility for them. In some ways, this is similar to the relationship between God and man.
In a purely practical sense, most people accept that it is moral to defend the helpless, and in Blade Runner, Roy (the replicants’ leader, memorably played in the film by Rutger Hauer) has killed a number of people. Some, it could be argued, in self-defence; others in revenge (e.g., Tyrell, the head of the Tyrell Corporation, which makes the replicants); others for no reason really given (Sebastian, who befriends Roy and Pris). Roy is dangerous to people, some of whom have nothing to do with the creation and mistreatment of the replicants. In this sense, it is arguable that killing Roy is a protective measure, and thus defensible.
In a larger sense, we must grapple with the issue of whether it is moral to create sentient humanoids simply to serve as slaves - beings with intentionally short lifespans who are aware that they will die in approximately four years.
So it’s a morally ambiguous situation: beyond asking whether it is moral to kill the replicants, one should ask what responsibility we have to the things we create.
Like it or not, and with John Searle’s objections set to one side, we are getting close to the point where we, like God Himself, are making arguably sentient beings in our own image. So I suspect we should at least begin to think about what our responsibility to our creations might look like.