June 24, 2025

The Shadow of Humanity

"Dad!" I yell with exasperation. "Can you come take a look at this?"

Something spins on my computer screen, error codes displayed in a dialog box. I click around to no avail, but suddenly the screen refreshes, the error is gone, and everything works perfectly. I spin around to find my dad has entered the room.

"Oh sure, now it's working! Just 'cause you're here," I say. He chuckles and retreats the way he came. Computers really like him, I think, as I continue tinkering with whatever's on the screen.

Moments later, I might get called down by my mom to help her with some computer issue, only to have the exact same thing happen—everything is broken until I come to look, then the computer decides to shape up and act as though nothing had happened.

Maybe computers do enjoy the gaslighting, but long-time readers of CYBORG_ know my personal stance: anthropomorphizing machines is a dangerous distraction, one that probably has more to do with our brain's evolution than with these systems having an actual "being" or consciousness. Still, something happens when the fingerprints of humanity are left on our machines, and it can bring out surprising behavior from this cold, stoic technology.

Shadows

I don't know about you, but when I think about technology, I think of it in terms of stability and facts. It's unfeeling, unbiased, unemotional. It's something reliable. 1+1=2, no matter how many times you punch it into the calculator. If you get the wrong answer, it's because you, the human, messed up—garbage in, garbage out, after all!

How is it, then, that we're seeing strong evidence that artificial intelligence has racial biases? Gender biases? If AI really is just another tool, why is it falling for the same old fallacies, stereotypes, and prejudices that we humans have? Does this demonstrate a kind of consciousness? Or does it mean that the human creators were pulling the wrong levers—passing their personal biases onto their creation?

I think there is more to it than simply pointing at the creators and saying they hard-coded some problematic stuff into it (not that they don't bear serious responsibility for what they have done, bad intentions or not). Nor do I think this is unique to AI. Potentially, these biases have accompanied our tech for far longer than we've realized.
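To see how a system can pick up prejudice without anyone hard-coding it, here's a minimal sketch. The data below is entirely hypothetical, and the "model" is the simplest learner imaginable: it just predicts the most frequent label it has seen for each group. No opinion is written into the code, yet the skew in the data comes back out.

```python
from collections import Counter, defaultdict

# Hypothetical, skewed "training data" -- e.g. historical hiring records.
records = [
    ("man", "engineer"), ("man", "engineer"), ("man", "engineer"),
    ("man", "nurse"),
    ("woman", "nurse"), ("woman", "nurse"), ("woman", "nurse"),
    ("woman", "engineer"),
]

# The "model": count labels per group, nothing more.
counts = defaultdict(Counter)
for group, job in records:
    counts[group][job] += 1

def predict(group):
    # Return the most frequently seen label for this group.
    return counts[group].most_common(1)[0][0]

print(predict("man"))    # -> engineer
print(predict("woman"))  # -> nurse: the data's skew resurfaces as "prediction"
```

The point of the sketch is that nothing in the code mentions bias; the learner is a mirror for whatever patterns the data holds—garbage in, garbage out, at scale.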

Consider this observation by Jun'ichirō Tanizaki:

Japanese music is, above all, a music of reticence, of atmosphere. When recorded or amplified...the greater part of its charm is lost...Most important of all are the pauses. Yet [the recordings] render these moments of silence utterly lifeless. And so we distort the arts themselves to curry favor for them with the machines.

Take a guess when this was written. 2000? 1990? Maybe 1970? Nope. He wrote this in his essay In Praise of Shadows, back in 1933. Even more profound are his broader musings on technology and culture. Tanizaki mourns the losses felt in Eastern culture as Western technology invaded: photography, radio, trolleys, airplanes.

He notes that, left alone, the East may have continued to develop slowly—at its own pace—but it would have gone only in a direction that suited them. The technology would have been developed in support of their values, their complexions, their landscape.

They would have been no borrowed gadgets; they would have been the tools of our own culture, suited to us.

To me, this was a paradigm shift. Suddenly, I could see how technology is not some set-apart thing. It absolutely does have biases, opinions, and stereotypes built in. Not necessarily due to ill intent or some grand attempt at large-scale oppression (though that's entirely possible). But technology is inescapably linked to culture, and the values of that culture will therefore permeate the tech, for better or worse.

It's not because the technology is alive or conscious or self-aware. It is because technology is the shadow of humanity. We build it after our likeness, and it reflects that shaping back to us—recognizable if we decide to look at the shadow, but just as easily slipping into the environment as a subtle, unimposing thing.

Shadows can create beauty as Tanizaki argues in his essay, but they can also hide dangers. Perhaps especially for Westerners, shadows are uncomfortable. Maybe we're afraid of seeing what we've built into the darkness. If you can bear to look at the shadow, the question becomes: what do you see?