As the sole developer on my team at work, I've been called a lot of things: magician, wizard, king. It may seem ironic, since my job is heavily documented, technical, and (in a way) scientific, but these mystical descriptions track with the subconscious culture of technology.
The first time I encountered the idea that there was a mystical or spiritual aspect to how we humans interact with, build, and explore technology was listening to this podcast: Can Myth Teach Us Anything About the Race to Build Artificial General Intelligence? With Josh Schrei. Turns out, there is a whole lot more out there and I've only started scratching the surface in the literature about myths and machines.
The unnecessary praise I've gotten at work has been gratifying. There's something comforting about being called a wizard in times of so much uncertainty about the future of work, because it tells me that my team does not understand what I do at all. Because they don't understand it, they respect that I can "command" it so well. That respect comes with power imbalances, however, and puts a lot more responsibility on me to be honest and open about my work.
I've taken the stance that teaching, sharing, and presenting about technical things is how I can release some of the power imbalance that I've created (though many people still just don't care, which is fine). At least it demystifies on record what I do and why I need to do those things—I'm a magician that openly shares the tricks. That approach has been deeply criticized by actual magicians, and yet, like my favorite magician, Chris Ramsay, has said:
I think people should be exposed to more magic. I think 20-30 years down the line, people who go to magic shows will be able to discern good magic from bad magic, and I think they have the right to know the difference... By keeping secrets from them... keeping them out of the loop, well, you're essentially depriving them of enjoying the art form the way that you enjoy it. They should be able to choose.
While being associated with magic is thrilling to me, since I love the world of magic—tricks, illusions, etc.—I had a disturbing experience just last week as the attribution finally escalated.
"Since Jess is a god, we can now do this..." my teammate said in a meeting.
I've finally reached my peak, I suppose. If the scale starts at magician, what can be above deity? To be clear, I don't care that his language was (obviously) hyperbolic, nor am I offended in some spiritual sense, as though it were blasphemous. I'm not concerned that my ego will become so inflated that I actually believe I'm on par with any form of deity, nor that I will be worshipped.
So why, then, was I so unsettled by the comment?
The Problem Starts to Brew
Much of what I'm seeing right now, as generative AI is used everywhere (including in my own code), is an abstraction of knowledge and skill. It is easy to criticize AI, and those who use it without any attempt to judge its output, but now I'm just as concerned about those who use AI with expertise. There is danger in using AI for research, for the creation of code, and for advice, when it's so easy to use and to trust.
In a way, many knowledge workers are not in danger of being replaced by AI in the sense that AI will take our jobs from us; rather, the threat right now seems to be that our intelligence will be replaced by artificial intelligence. We trade our actual intelligence for the artificial when we offload our critical thinking, our processes, and our hands-on work to the "robot." As Marshall McLuhan said way back in the 1960s:
"We are all robots when uncritically involved with our technologies."
Can the same be said about spiritual or mystical ideas like worshipping of a deity? Something like: "We are all robots when uncritically involved with our gods."
I think so. In fact, it's my personal experience that the culture of religion can, if not carefully guarded against, encourage a lack of critical thinking. This is not to say that it is the default state of "religion" to turn off our intelligence, nor does it mean that you cannot be both committed and thoughtful in religious settings. I'm speaking merely from my experience that I was trained to not ask certain questions; to take what I heard from leaders and males in general to be messages from God.
If something I saw or heard didn't line up with my values, but it came from the "right" sources (leaders and males), it meant that I needed to figure out the disconnect like a puzzle that had its pieces all mixed up. The whole issue was that I had to prove them right—I was trying to make the puzzle display the picture I needed it to be, rather than the picture that it was.
Perhaps we've been doing this with technology as well.
In my experience with technology in general, I started out completely in love. Surely all technological progress is true progress. Every improved system, every automation run, every pixel properly placed was improving everyone's lives, end of story. The world of code was so vast even fifteen years ago that I had to lean on the advice of other developers, since I had no footing of understanding to call my own.
Once again, cracks started to appear in my idealized perception of the tech world. Cyber bullying through social media; attention spans depleting while being harvested for ad revenue; difficulty connecting with people in real life; exploitation of labor in less affluent areas of the world; the perpetuation of conspiracy theories.
The paradox of technology being both monumentally good and bad cannot be buried. Religion carries the exact same paradox: it has done exceptionally good things for humanity and perpetrated devastating things against it. The more flexible we become, the less these spheres of our lives shake us when we confront the uncomfortable parts of their realities. That flexibility, in my experience, comes from adopting the practice and skill of critical thinking.
In fact, there is already evidence for this with the AI problem. Microsoft published a study in 2025, The Impact of Generative AI on Critical Thinking, which states in the abstract:
"...higher confidence in GenAI is associated with less critical thinking, while higher self-confidence is associated with more critical thinking. Qualitatively, GenAI shifts the nature of critical thinking toward information verification, response integration, and task stewardship."
Why does it matter if we practice critical thinking, when AI will eventually take care of everything for us—it only gets smarter, after all? The paper continues:
"Used improperly, technologies can and do result in the deterioration of cognitive faculties that ought to be preserved. As Bainbridge [7] noted, a key irony of automation is that by mechanising routine tasks and leaving exception-handling to the human user, you deprive the user of the routine opportunities to practice their judgement and strengthen their cognitive musculature, leaving them atrophied and unprepared when the exceptions do arise."
Critical Cyborg
I am not a god, in case you wanted clarification. What disturbs me most about that label is the unapproachability it implies. In many ways, a deity is something, or someone, that is far away, untouchable, and bigger than us.
I'm afraid we don't examine how these religious terms and concepts are built into technology from the very foundation. It makes it so much easier for us to fall into traps of feeling or thinking that our technology is too great—too unreachable—that we have no control of it. How can we control something like a god? We can't. If our technology or those who wield it become gods, then how can we safely interact with it or them? We can't.
Like the Reverend Mother says at the beginning of Frank Herbert's Dune:
"Once men turned their thinking over to machines, in the hope that this would set them free, but that only permitted other men with machines to enslave them. 'Thou shalt not make a machine in the likeness of a man's mind'...but what the OC Bible should have said is: Thou shalt not make a machine to counterfeit a human mind."
Herbert's fictional universe is set in a post-AI era. He skips over how humanity solved the issue of "men [turning] their thinking over to machines," but the artifacts that remain indicate that humans took back their thinking. It's chilling to live in a time when AI is no longer science fiction, but something that genuinely threatens our intelligence, agency, and autonomy.
I think that religious language in technological settings is a warning sign. It indicates that the hidden undertones are now bubbling to the surface: technology is finally becoming so out-of-reach, so big, so unwieldy, that we can't help but start calling it like we see it.
Although I will continue to act like a magician who shares the tricks, I'm realizing that it's not enough. Access to good information is not enough, because there are players attempting, as Herbert warned, to use machines to enslave others. I don't have the answer, but I do know that it is more important than ever to work on critical thinking skills. Approaching the world with curiosity, rather than with arrogance. Focusing on humanity, rather than on profits.
We can't afford to check out of our lives. Let's not become robots by accepting technology the way it is, as though it couldn't be some other way. Let's abandon the dogma that any technology is good technology.
"The greatest of America's homegrown religions—greater than Jehovah's Witnesses, greater than the Church of Jesus Christ of Latter-Day Saints, greater even than Scientology—is the religion of technology."
—Utopia Is Creepy by Nicholas Carr