May 13, 2025

The Right to Be Forgotten

In my first job as a full-time web developer, I ran into a scary-sounding set of regulations: GDPR. It was the EU's pushback against shady digital marketing, and it made it much trickier for teams like mine to track website visitors and run targeted, personalized ad campaigns (among other things).

I thought it was silly at first, my naïveté on full display, but now privacy is top of mind for me. Getting scammed or hounded by unwanted ads is the least of my worries in a world where hacking or identity theft can shut down your life, not to mention the physical danger of an angry person or group targeting someone with violence.

Among the key issues cited in the GDPR is one that has gotten me thinking about other applications, not just in a legal sense, and not just in terms of digital marketing.

The Right to Be Forgotten

The GDPR says you should be able to request that companies remove and delete the personal data they've collected about you. But as we step into a world of AI-generated content, another concern comes to the forefront.

A few years ago, one of my design/business heroes, Chris Do, mentioned the idea of immortality through AI in one of his videos. He had been developing an AI chatbot trained on himself so that he could expand his reach and help more people. If he only has a few hours a day for consulting calls, and so many of the questions he gets are similar, why not "clone" himself into an AI bot that can help everyone, everywhere, at any time?

When I first heard this, I thought it was genius—the perfect way to scale expertise and impact. Then he mentioned how his posterity would be able to talk with him and know him, even after he was gone. Something about that was deeply unsettling to me.

Of course, I'm not critiquing him personally, nor the underlying desire to be known by family and not forgotten. The most reliable way people achieve immortality is through ideas, choices, and memories—especially within the family. The more impactful those things, the more people will keep a fragment of you with them. For better or worse, the tyrants of history become immortal because they remain scrutinized and talked about. Great leaders are also remembered at a scale beyond what we average folks can ever achieve.

AI could solve that problem. If you have enough pictures, videos, or voice recordings, AI can re-create your likeness (and the better it gets at generating content, the less source material you may need). Soon, the average person could converse with a deceased loved one, an ancestor, or anyone else they wanted to.

Just last week, a first-of-its-kind use case for this AI-powered resurrection was presented to a judge in Arizona, USA. Chris Pelkey had been killed in a road rage incident, and his sister used AI to let Chris deliver his own victim impact statement "from beyond the grave." The generated video and likeness of Chris left me conflicted. I can see how it could humanize the victim and let him speak what could seem like his true thoughts and feelings. I can also see how devastatingly manipulative it could be.

It's hard to separate reality from fabrication when you can see something rendered so convincingly (and we've debated truth and art before). Obviously the AI output isn't perfect, but I found myself listening as though Pelkey were really speaking, despite the flaws in mannerisms, mouth and beard rendering, and even the video glitches and stutters.

Again, I'm not actually commenting on this particular use of the AI in a court setting—that will have to be deliberated on by lawyers. In similar circumstances, I might use the same technology to drive the weight of the incident home. And so we've run into the conflict again as we wrestle with how our technology interacts with, dominates, or removes our humanity.

Janus

Funerals are largely for the living—rituals that help us process and work through the loss of a loved one. They're also where we are, perhaps, the most forgiving. I'm sure the outliers are there, but my guess is that the majority of speeches at a funeral highlight all of the good things about the deceased. I've left funerals of people I didn't know personally (but attended in support of someone who did) feeling pretty convinced of their character, their impact, their goodness. I've never left thinking that the person was pretty awful.

Are these funeral services true—as in honest? Probably. I think humans can be more powerful and loved and wanted than we let on in the world of the living. I think we don't express our deep feelings for each other enough, until it's too late to tell. I think we're all capable of immense goodness.

Are these funeral services accurate? No way. I lived most of my life "in the closet" as a queer person. I chose what people saw and what they did not. Ask a friend who I am to them, and the answer will be completely different from what someone I'd hurt would tell you. We are as capable of harm as we are of good—that doesn't mean our negative actions equal our positive actions in volume or impact, but it does mean that we aren't perfect, we aren't consistent, and who we are changes with proximity and perspective.

I worry that when AI reconstructs people based only on the limited information we have—and the bias of the person creating the AI representation—we will lose all sense of humanity. Humanity, in itself, comprises the good and the bad. It is a wholeness which, when divided, becomes empty, like a jar with a leak.

Even if this technology were used in the most responsible, best way, it could end up being like Instagram on steroids. We would see only the best in others—the perfect, most unattainably consistent good. Comparing our lives and our thoughts and our actions to a re-creation of a person who no longer has any flaws will be either devastating or so boring that we abandon our commitment to responsible, respectful use of the technology.

Even worse, let's take this thought experiment to another possibility. Say you had an abusive father. He always put the best face on in public: weeping about his flaws (never letting on what those flaws were), proclaiming his repentance with all the hypocrisy he could afford. He did some great deed in the community, and once he is gone, the broader public wants to commemorate such a person of impact. An AI is created in his likeness. The secret of the abuse you suffered is now erased as the person you knew more intimately than anyone is reconstructed without the flaw. Now he's giving people advice, telling stories, or offering opinions that sound like his public self.

It's already hard enough when we immortalize tyrants in stone statues or place names. How much harder will it be when any of us can reconstruct anyone else with no idea who that person really was?

To me, the issue isn't how this differs from what we have always done—celebrating awful people, sanitizing history, writing biographies, telling stories. People put words in others' mouths even while they're alive. The issue is the scale at which these things can be made, and a format that shortchanges our critical-thinking skills.(1)

Cyborg

I understand that this is a tricky topic. On the one hand, we need ways to process the loss of our loved ones. We are also desperate to defend their image and character, especially when their death came as a result of violence. How could we possibly criticize a family who tactfully used their knowledge to direct a video of the deceased speaking about forgiveness?

On the other hand, what are we doing to protect ourselves from the exceptionally dangerous consequences of deepfakes? If we start to lean on AI resurrection instead of confronting our grief, will we end up with permanently wounded people? Death is the one thing we can all count on to be part of our lives. I fear a society that can't let nature take its course. I fear a society that expects to be able to use us far beyond our consent for its own purposes.

As painful as it is—and I doubt there are many other things that can be as painful to consider—I think we need to retain our right to be forgotten. Let my personhood be mine and mine alone. Let me fade from natural memory as all of the billions before have done.

When I am at the end of my reach, it's ok to let me go.

Extra Resources

(1) How AI is Rewriting History (And Getting It Wrong)—There are very visually appealing AI-generated videos being created, with what seems like a great idea: re-creating the perspective of someone during a famous historical time period. But the medium (short-form video) and the grave inaccuracies are damaging. It's a great breakdown sharing many of the concerns discussed in this newsletter. (approx. 20 min video, YouTube)