Spend enough time with programmers and someone will eventually tell you this:
"There are only two hard things in Computer Science: cache invalidation and naming things." —Phil Karlton
In fact, the joke has become well-worn enough that we can throw this riff in, too:
"There's two hard problems in computer science: we only have one joke and it's not funny." —Phillip Scott Bowden
But, really, naming things is hard, and it can cause some dangerous problems downstream.
Databases and You
Part of the reason naming things is difficult is that, in programming, everything needs a name. If you want to store some information in a database, you need a name for that information.
Some "names" might be randomly generated IDs, like kuf3h4kuhbwe78gyejdhg
but if you want to be able to perform some kind of logic, you’ll still have to put that in a human-readable form like a variable (which will have a name).
You can think of it like form field labels. When you fill out a form, each label tells you what kind of information is expected.
That label is likely the name the database ends up using to store your information.
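To make that concrete, here's a minimal Python sketch (the labels, column names, and helper function are mine, purely for illustration) of how the label a person sees often becomes the name the database uses:

```python
# Hypothetical mapping from the label a person sees on a form
# to the column name the database stores the answer under.
CHECKIN_FIELDS = {
    "Name": "full_name",
    "Phone number": "phone",
    "Birthdate": "date_of_birth",
}

def to_database_row(form_answers: dict) -> dict:
    """Rename each human-facing label to its database column name."""
    return {CHECKIN_FIELDS[label]: value for label, value in form_answers.items()}

row = to_database_row({"Name": "Alex", "Phone number": "555-0101", "Birthdate": "1990-04-12"})
print(row)  # {'full_name': 'Alex', 'phone': '555-0101', 'date_of_birth': '1990-04-12'}
```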
This gets really tricky in medical settings, demographic data, and other personally identifiable information.
Perhaps you’re a programmer tasked with setting up a check-in form for a doctor’s office. There are the seemingly easy fields: Name, phone number, birthdate. Then you need the sex of the person.
Should you provide two fields for “sex” and “gender” so that the doctor can be aware and respectful of a person’s identity? Do you only include “sex” because you’re not even thinking about queer identities? Should you provide an option for “intersex” in addition to “male” and “female” to account for the natural occurrences of indeterminate or “difficult to categorize” anatomy?
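There's no single right answer, but here's a hypothetical sketch (the field names and options are mine, not a recommendation) of what it looks like when those questions turn into names in a schema:

```python
# A hypothetical check-in schema -- just an illustration of how each
# of those design questions becomes a named field with named options.
CHECKIN_FORM = {
    "full_name": {"type": "text"},
    "phone": {"type": "text"},
    "date_of_birth": {"type": "date"},
    # Two separate fields, so the form can ask about anatomy and identity
    # without collapsing them into one box:
    "sex_assigned_at_birth": {
        "type": "choice",
        "options": ["female", "male", "intersex", "prefer not to say"],
    },
    # Free text, because any fixed list of options draws a boundary
    # that will leave someone out.
    "gender": {"type": "text"},
}
```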
Naming things is exceptionally difficult, because ambiguity pervades all aspects of our lives, much as we try to make things binary or, at least, clearly defined.
The Fuzzy
The other reason naming things is so hard has to do with our cognition (how we think). Prototype Theory, developed by Eleanor Rosch, challenges the classical take on categorizing things, and helps explain why we run into trouble with our names.
According to the theory, our brains group things together based on their relationship or closeness to the prototypical thing—a "family resemblance" if you will. The prototype of something is built by our individual experiences with objects or concepts.
As a percussionist, I know what a marimba is, but many people assume that what I'm playing is a xylophone, because the two look very similar and "xylophone" is the more commonly known word. Thanks to that experience, I can easily distinguish a xylophone from a marimba—they hold two different "prototypes" for me, while non-percussionists tend to have a single prototype for the "wooden instrument that looks like it has a piano-key layout."
The Classical Theory of categorization is the idea that we can clearly define things by “necessary and sufficient features.” We create boxes, put things in boxes, and they’re supposed to stay in those boxes, nice and organized.
This is where the “appeal to dictionary” fallacy I mentioned last week tends to get me.
“This is common for most words that we see used frequently. If we have a backlog of experiences hearing words being used in reference to certain concepts or items or entities, we just intuitively…generate a concept of what that thing is,” says Dr. Dan McClellan on his podcast, Data Over Dogma.
“A lot of people…what they’re going to do is just retreat into their mind and try to imagine the imagery; the conceptual field that is evoked from that word; and then try to describe or define that.”
“The problem is that’s entirely relative. It’s going to be different for every person because we all have different experiences.”
Definitions are always incomplete—they can get close, but they’ll always leave out something or include something that doesn’t always fit. To “define” something is an attempt to find the edge of where something stops being itself—but in the act of drawing boundaries around concepts or items or even people, there are always going to be things that don’t quite fit, or that challenge those boundaries.
Consider a chair for a moment.
What comes to mind? How do you know what a chair is? Maybe it looks something like this:
Classical Theory would prompt us to define a chair like this: it has four legs, a backrest, and is something we sit on.
What about this design?
Now we have only a backrest and something that looks like we can sit on it.
These two chair pictures were easily generated with AI. The first image came from a simple prompt for a photo of a chair. The second one needed prompting for a "unique chair," like one you might see in the MoMA (and even then, the AI struggled to give me anything more unique in design than the one above).
Now, let's start playing with the boundaries even more:
This is the "S-Chair" from Tom Vaughan (credit: https://www.objectstudio.co.uk/oldgallery#/vii/). Does this challenge your definition of a chair?
Let's push it even further with "Knot," from the same studio:
Is it a chair? (I truly am not sure, though it appears to be able to function as a chair if I were so bold as to sit on it).
We just walked through Prototype Theory. We started with a prototypical chair, probably similar to the one you picture when you hear the word "chair."
Then we started playing with the boundaries—removing the "leg" requirement, changing the shape. Ultimately, a chair isn't its legs or its backrest or even its seat. Artists and designers can push the boundaries of our chair-conceptualization and make us question the object. Is it a chair? Is it art? Is it both? Is it neither?
How do you set up a database to properly categorize something like Knot? How do you reconcile one person's conceptualization of a chair with another person's entirely different conceptualization?
The Collision
The fuzziness of our named things can be frustrating and even dangerous. What we think is clear communication might be wildly misread by someone else who brings a different set of experiences to the table ;)
Misunderstandings in user interfaces have caused massive problems (like the false missile alert in Hawaii in 2018).
Simple names can create collisions when URLs, variables, or objects in code have the same name. It’s like having two houses with the same address—nothing will work right.
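Here's a minimal Python sketch (the names and records are made up) of how a collision quietly breaks things:

```python
# Two different people, one shared key: the second write silently
# overwrites the first, like two houses claiming the same address.
patients = {}

patients["Alex Smith"] = {"date_of_birth": "1990-04-12", "phone": "555-0101"}
patients["Alex Smith"] = {"date_of_birth": "1985-11-02", "phone": "555-0199"}

print(len(patients))           # 1 -- the first Alex Smith is gone
print(patients["Alex Smith"])  # only the second record survives
```

That's part of why systems lean on those randomly generated IDs from earlier: uniqueness is easier to guarantee when the name doesn't have to mean anything to a human.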
My own name causes massive problems with background checks, because my name is too common in my generation.
Names in systems are supposed to be unique, clear, and well-defined. But in practice, that's not so easy—especially when we get into flexible but ambiguous systems.
Even the exploration of Prototype Theory can make my head spin a little bit. Maybe nothing means anything at all…? Is it even possible to communicate well? Is what I know just an illusion that unravels under scrutiny?
Cyborg
Naming is exceptionally complex, even though it feels easy.
Ask a child to tell you the name of a new toy or stuffed animal. They’ll probably use some feature to define it: “Spot,” “Pinky,” “Princess,” “Laser-eyes.”
It is exceptionally easy to lean on a feature as a way to put a line around something, but life is more complex than just describing pieces that we perceive.
Objects, concepts, and people aren’t so easily categorized. And that’s simultaneously freeing and frustrating.
We will continue to play with our boundaries as people with different experiences, identities, and ideas look at the “definitions” others have made.
We employ our greatest creativity and problem-solving when we engage with those fuzzy boundaries. Whether that’s to name some piece of code or challenge oppressive systems by creating more space for more people.
To name something is to take on a responsibility: to allow enough flexibility not to break the system, but enough structure that the system can stand.