It's a dark, cold room with a table and a fire. The hostage named Caesar accuses the Colonel, who is standing at the table near the fireplace, of having no mercy. The Colonel looks over his shoulder at the ape who has scrutinized his actions. Eventually he launches into his diatribe:
“Mercy. You have any idea what mercy would do to us?” he asks Caesar.
“You're much stronger than we are. You're smart as hell. No matter what you say, you'd eventually replace us. That's the law of nature. The irony is we created you. We tried to defy nature, bend it to our will. Nature has been punishing us for our arrogance ever since.”
This moment pulled me out of the movie and into reality. Is this just War for the Planet of the Apes? Or is this a human insecurity in the face of our own technology, arrogance, and existence?
The AI Question
Ever since the rapid push for AI adoption that followed ChatGPT's massive and unforeseen success, the question of AI has been forced upon us. Before, it could be laughed off as part of silly sci-fi fantasies for nerds to discuss. But it's not necessarily a question of whether AI is sentient, and it's not about whether it can do our jobs well enough.
However, these surface-level questions are much easier to distract ourselves with, because there is the possibility of solving them. We can create regulations to help mitigate unfair or discriminatory working conditions (theoretically, anyway). We have also dealt with AI for a while, and it has not come near sentience—it's not alive, it doesn't have feelings. At best it's an illusion helped along by our own cognitive biases.
Consider what Alf Hornborg writes in the research article Objects Don't Have Desires (1):
“The aspiration of cybernetics to create artificial life has deflected our attention from the skewed political and distributive conditions of technological progress. Blurring the boundary between life and technological nonlife has obscured the widening economic gap between the victims and the beneficiaries of accumulation. As ideologies are prone to do, the image of technology as artificial life mystifies social inequalities by representing them as natural.
“The traditional distinction between nature and artifice goes back to Aristotle. In this tradition, technological artifacts have been understood as generated by the external intentions of their human makers, whereas living organisms are animated by inner purposes that derive from nature, not from humans.”
Hornborg sums it up profoundly: “...modernity not only transforms nature and humans into instruments, but simultaneously naturalizes technology.”
We can get so wrapped up in the threat of replacement (which is a valid concern) that we forget about the real, imminent threats to human safety and health, especially those affecting our most vulnerable populations. Our own flesh and blood are suffering, and we're worried about whether the chatbot actually knows what it is?
Maybe the AI illusion is too distracting for us to sense the deeper question underneath. Let's leave the smoke and mirrors for a more encompassing theme:
The Technology Question
In the broad strokes of history, technology has accompanied us for tens of thousands of years: everything from fire to language, tools to weapons. We have stories about our technology—how we received our knowledge from the gods, like Prometheus, or how special weapons could be forged with almost-sentient agency, like the One Ring from The Lord of the Rings.
Sometimes these stories teach us to look at our technology with suspicion: is there a ghost in the machine? Some ulterior motive? Or perhaps the company or other system behind the technology threatens our rights, privacy, and humanity (a fully valid concern in the age of social media)?
Sometimes these stories elevate humanity to the level of the gods (or even beyond them) as we pull farther away from the need to survive directly in nature.
“...now that we have technologized our environment and isolated the self within a scientific frame of mind, we no longer turn to nature to echo our state. Now we catch our reflections, even our spirits, in the movements and mentations of machines.”
—TechGnosis, by Erik Davis, p. 155
These stories can also place a small seed of terror in the back of our minds (if it's not already there via other means). I suspect many of us have a sense of the conflict we have with nature, even if it's not overt or if we don't acknowledge it very often.
Like the Colonel said in War for the Planet of the Apes:
“We tried to defy nature, bend it to our will. Nature has been punishing us for our arrogance ever since.”
Is our question about how we played a part, whether directly or indirectly, in devastating effects on the environment? Have we offended Nature itself? Will Nature retaliate at some point to bring things back into balance?
The Humanity Question
If the question isn't about whether we can create tools that become agents unto themselves (life where there previously was none), nor about whether we will be replaced by those creations through an act of nature or some other means, then what are we looking at here? What's beneath the surface of all of these other, important questions?
I wonder if it's so basic and fundamental that we breeze past it without much thought: What is humanity?
Not just what is it, but what do we want it to be?
Is our purpose really to exert dominance over other species and systems that nature has designed to be whole and complete?
I think the quote from War for the Planet of the Apes reveals our human insecurity that we really are fragile, despite all of our progress, all of our technology, all of our advancement, all of the things that make us human. Despite all of that, we’re still suspicious of the things we create, as though they might rise up against us—dominate us.
I think we're scared of how vulnerable we really are. There are no guarantees that protect us from tragedy: disease, death, disaster, destruction. Our technology cannot save us from ourselves, nor from calamity. Even if it could, we would still be scared of that very technology, wondering when the curtain might fall and the truth be revealed.
We may be even more susceptible to these fears because of our most recent brush with pandemics:
The [COVID-19] pandemic appears to be changing how humans see their place in the world... It has reminded us of Darwin's most profound insight—what has been referred to as “the darkest of his truths, well known and persistently forgotten”—that humans are just another species of animal and, like everything else in the animal kingdom, we are vulnerable to the threat posed by pathogens. If we Homo sapiens don't strive to live in balance with the other living things on our planet, we face a very bleak future.
—Pathogenesis, by Jonathan Kennedy, p. 228
Cyborg
Though it may not seem so from my writing here, this exploration has brought me a feeling of deep compassion as well as sadness. When I first wrote about the dangers of AI almost a year ago, especially concerning our cognitive biases, I was angry. I felt deceived by my own brain, taken advantage of by companies, and stuck in an infinite loop of traps.
In the year since, I've tried to understand the major systems at play, at least in my country and my particular culture. Anger has been replaced with sadness as I contemplate what we've built and how we perpetuate harm to our own species, let alone others in this vast world.
But if we take a look at the Colonel's quote one more time, there's a subtle possibility hidden at the very start:
“Mercy. You have any idea what mercy would do to us?” he asks.
While the script goes on with a negative view of the concept of “mercy,” we have not yet explored the other possibility. What if our humanity could be “mercy”? What if that was our distinguishing feature? What if that was the way out of systems that take and take and take with no compassion or balance?
George Thompson put it beautifully in one of his videos: What if nature were something to belong to, rather than to take from?
Among all of the systems we have created or to which we belong, the natural world takes precedence. Maybe the question of our lives is more about how we can integrate with our Mother System, rather than childishly attempting to dominate it and stand apart?
Works Cited
(1) Hornborg, Alf. “Objects Don't Have Desires: Toward an Anthropology of Technology beyond Anthropomorphism.” American Anthropologist, vol. 123, no. 4, 2021, p. 758.