When you're feeling love's unfair
You just ask the lonely
When you're lost in deep despair
You just ask the lonely
Often in my imagination as a kid, I would set myself in a barren landscape. No one for many miles, far from civilization. Being an introverted queer kid, my idea of paradise was often this retreat where no people lived. A place where everything was up to me: I would have infinite time for my hobbies, infinite space to wander and explore, and infinite peace from bullies as well as those who preferred shallow, inauthentic relationships.
While this fantasy is impractical and dangerous if ever realized, I still find that isolation is my brain's preference for imagining a peaceful existence. For many, if not most people, however, this would be a torturous hell, even as a fantasy. We are social creatures and we need other humans to thrive (much as I like to tell myself I'm different, I'm not in any way). Isolation is punishment. In the U.S., we use it as one of the highest forms of penal imprisonment.
Since the advent of solitary confinement in the late 18th century, reports have documented the deleterious effects of living in total social isolation...Long-term periods of isolation have been found to significantly affect neurological and psychological health, and this is especially harmful for young people as the human brain continues to develop past age 20, specifically in areas of the brain associated with behavioral control, risk assessment, and planning.
Despite the controversial nature of this punishment tactic, isolation is still in use today. Imagine, then, if we carried this practice into a future where tech companies break into the newest form of the privatized prison market: isolation in space.
This is what the seventh episode of the classic anthology, The Twilight Zone, examines. More specifically, it climbs one more rung on this technological ladder of isolation: could a robot fix the very problem that extreme isolation created?
The Lonely
James A. Corry is sentenced to 50 years of solitary confinement on an asteroid 9 million miles from Earth. His only human contact comes when a supply crew lands once every few months. Corry is particularly grateful whenever Captain Allenby comes with the crew. Allenby has a soft spot for Corry's situation and often brings him things to stay entertained, like car parts, allowing Corry to build a car from the ground up. This time, Allenby comes with bad news: even after 4 years on the asteroid, Corry's appeal is still not looking good.
I'm not a murderer! I killed in self-defense!
Corry protests to Allenby, who can't do anything about it...not the legal process, anyway.
Allenby tells his crew to bring a large box from their spaceship over to Corry's hut. Allenby tells Corry to do him a favor and wait until he and the crew are out of sight, because the contents of the box could get Allenby in trouble—no car parts this time. After they've delivered the box and start boarding their ship, a crewman can't help but ask the Captain what's in the box.
I don't know...Maybe it's just an illusion. Maybe it's salvation,
Allenby replies vaguely.
Back at the hut, Corry reads the instructions that came with the box. It says something to the effect of:
Congratulations, you're now the possessor of a robot...for all intents and purposes, this creature is a woman.
All he has to do is open the lid and the air exposure will get the robot up and running. We cut to a shot of a woman in what strikes me as an unflattering dress, a little too reminiscent of an old-timey, religiously enforced style. That's probably my personal bias coming out, especially after having just heard possessor of a robot
which for all intents and purposes...is a woman,
phrasing that is deeply problematic and evokes that Biblical woman-as-property vibe.
What I find so interesting about this episode is precisely these details, because they reveal the thinking of the time when it was made: 1959. Why this outfit? Why this pose? The hair? Why a human actor playing a robot with no visual indication that she is one? Why a woman?
There's a lot to unpack, and I'm starting a deep dive into the research on gender and technology—spoiler: it runs deep and is more connected than it seems. So for now, I'll leave some of these questions for you to consider and we'll focus on the isolation aspect of this episode.
Corry is unjustly isolated; Allenby feels bad and risks his career to bring him a woman-robot, hoping to help his friend combat the psychological harm of this punishment—the car project hadn't really been enough after all. Corry at first appears disgusted with the thing: he pushes her around and insists she's not real.
I'm sick of being mocked by the memory of women. And that's all you are.
After the awkward start to this relationship with Alicia (the robot), we find Corry 11 months later in a completely different state from the stressed, anxious, and eager condition we met him in.
It's difficult to write down what has been the sum total of this very strange and bizarre relationship.
Is it man and woman? Or man and machine? I don't really know myself. But there are times when I do know that Alicia is simply an extension of me. I hear my words coming from her, my emotions, the things that she had learned to love are things that I love.
I'm not lonely anymore. Each day can now be lived with. I love Alicia, and nothing else matters.
I can't help but see parallels between this reflection and the stories of "AI psychosis," where people fall in love with AI chatbots or AI avatars. I'm not about to provide any kind of medical or psychological advice, but it seems like, especially with sycophantic AI models, people are falling in love with their reflections. Just like Alicia learns to love the things Corry loves (playing checkers, for example), AI models (or at least the systems that deploy the models) are trying to learn from your behavior and get to know you—your preferences, your predictable behavior. When models can use data about you, they can provide answers or experiences that you're more likely to enjoy. This feels like a relationship, where the other "person" is meshing with you; they like what you like—never mind that they don't have any real preferences themselves and are simply feeding back to you what you like.
Despite the apparent shallowness of Corry's explanation for his love of Alicia, he claims that his loneliness is gone. Is that not a sign that the love is genuine? Has the robot successfully filled the void that isolation tore open?
Not So Fast
Allenby shows up one day with news of a pardon. Corry is a free man and can leave his asteroidal prison immediately. In fact, the ship has only 20 minutes to load up and take off, since the stop was sudden and unplanned. Corry jumps from happy to disturbed when the Captain elaborates that he can bring only 15 pounds of belongings—and the robot counts as weight, not as a person.
Faced with the fact that he can't bring Alicia, he freaks out and shows Alicia to the crew, trying to have her convince them that she's a real woman. Alicia stands there, hardly getting any words out, when Allenby pulls out a gun and shoots her—which rather quickly convinces Corry to abandon the asteroid and the robot he had supposedly fallen in love with. No tears, no moment of silence, not even a display of anguish. Corry acts like he just lost a toy that never really mattered and boards the ship to go home.
What?! After all this poetic narration about how Alicia helped him through one of the worst punishments we can mete out on a human, he just shrugs and leaves the moment she's destroyed? Anyone who has watched Cast Away with Tom Hanks is more upset seeing Wilson, the volleyball, float away than Corry is about the robot he loved getting shot in the face.
This surely deviates from real-world cases of AI love (maybe?), but it highlights one of the disorienting parts of AI relationships: they are not substantial. That doesn't mean that they can't feel substantial—Allenby himself suggested the robot was probably just an illusion—but when the AI is lost, we aren't really harmed. We can always boot up a new one, start over with a similar model, or just move on. When a person dies, they are lost to us; there is a permanence and a substantial, visceral pain. Even darker, when an AI loses its human, it is wholly unaffected.
This is what is deeply disturbing to me as I think about the problem of loneliness and what I consider to be inhumane technological "solutions" to it. The power difference is terrifying. We have the illusion that we are in control—that's how machines work, after all!—and yet our connection to the machine, no matter how earnest, no matter how much it feels like connection, can be severed in an instant, with the consequences falling only on the human involved. Hopefully those consequences are mild, but in some heartbreaking cases, they're catastrophic and irreparable.
Cyborg
Allenby's final advice to Corry is that, "All you're leaving behind is loneliness."
Although they're standing over the body with the exploded mechanics where a face used to be, this advice seems to be referring to the entire experience of Corry's solitary confinement. Including, of course, the robot that only superficially quelled the loneliness.
Considering that, at least in the story, technology was part of the harm—imprisoning someone on an asteroid is isolation at a scale impossible without it—I wonder if we can draw one more parallel to the real world. Some technology, as we've often explored in this newsletter, is isolating by design. Social media and other deeply addictive virtual experiences are like our own personal shuttle to an asteroid where only our interests occupy the space around us. We know this is a problem, but instead of fixing the spaceship route to isolation, we put AI on the asteroid.
I've asked before and I'll ask again: does a human problem created by technology really get solved by more technology?
Whenever a company comes in claiming to solve loneliness, that's all the warning I need to know that the product will be dangerous if not inhumane. Loneliness does exist in our society, it does hurt people, we do need to help, but this is a human connection problem, not a market-penetration problem. If you really want to help, ask the lonely, talk to the lonely, involve the lonely in your real life. Maybe we'll remember what we used to have before and we can leave the loneliness behind.
You've got some fascination
With your high expectations
This love is your obsession
Your heart, your prized possession
Let down your defenses
Open up to the one who cares
As you search the embers
Think what you've had, remember
Hang on, no don't let go now
You know, with every heartbeat, we learn
Nothing comes easy
Hang on, ask the lonely