Consistency is almost always hailed as a virtue, and it provides real value in our social systems. We need consistency in our work ethic to keep things running, consistency in the application of laws to be able to trust the system, and consistency to form deep bonds with one another.
But is there a catch?
An Attitude Change
In 1959, Leon Festinger and J. Merrill Carlsmith ran an experiment. A participant coming to the lab would be given two tasks: one for the first half-hour and the other for the last. Refilling trays with spools and turning wooden pegs on a board were the mundane assignments.
We'll follow the experience of a single participant—we'll call him Jerry—so we can keep track of what's going on.
After doing the two tasks for an entire hour, Jerry emerges exceptionally bored and happy to not have to do tasks like those again.
The experimenter then comes in and “reveals” that the point of the study was to see whether performance on the tasks was affected if people were told beforehand that the tasks were fun and enjoyable.
There was also a problem, the experimenter confides. The research assistant didn’t show, so the experimenter asks if Jerry could tell the next person in line that he had just finished the task (truth) and that it was really fun (lie). There would be compensation, of course, for sticking around and helping prime the next person.
The person Jerry lied to then goes to do their tasks and once they are finished, Jerry gets interviewed to see how fun he really thought the tasks were.
The variable in this experiment is that some people were promised $20 to lie to the next participant, some were promised $1, and there was a control group where there was no payment and no need to lie.
We know that the tasks were very dull (and long), so by the end of the hour of tasks, would you be surprised if the attitude towards them was influenced by the amount of money promised?
Which group rated the tasks as more interesting: the people promised $1 or the people promised $20? Take a look at the results:
Surprisingly enough, the $1 group was staggeringly more convinced that the tasks were fun and enjoyable.
What caused this drastic attitude shift? Cognitive dissonance.
The non-control groups had to do something to justify or rationalize the lie they expressed. The $20 group still shows an attitude shift, but remains on the more honest—or at least consistent—side.
The $1 group appears to have shifted their attitude because the monetary reward was not enough to justify their actions. Changing your mind turns out to be an easy way to get rid of the discomfort of being out of alignment.
“It actually was really nice to do something meditative,” someone might have said. “It was a refreshing break,” or “I love playing with blocks!” others could have told themselves.
However you do it, cognitive dissonance is what drives you to find a reason for feeling, thinking, or believing one way while behaving in a way that contradicts it.
Leon Festinger’s cognitive dissonance theory suggests that “although we may appear logical in our thinking and behavior, we often engage in seemingly irrational behavior to maintain cognitive consistency. It also describes and predicts how we spend much of our time rationalizing our behavior rather than actually engaging in rational action.” (Social Psychology, Fifth Edition, Stephen L. Franzoi, p. 162)
Mirage
One of the most common virtues we seem to seek is consistency. We want to know that something will work when we use it. We want to know you’ll show up—and keep showing up. We want leaders who will be predictably consistent in their views and actions.
Variables are risky.
The paradox is not that consistency is both bad and good. It’s that consistency is developed through variability.
Take a look at defense attorneys. In the U.S., the Sixth Amendment to the Constitution guarantees the right to a trial with “the Assistance of Counsel” for the defense. It also guarantees an “impartial jury.”
There is something admirable about the ethics of defense lawyers. You don’t get to defend only the innocent. You will defend guilty people. What is admirable is how these lawyers do their best to remove their personal feelings and try to let the law reveal what needs to be revealed. Perhaps that is a guilty verdict. Perhaps that is a mistrial, or an affirmation of innocence.
One of the greatest demonstrations of impartiality is being able to rule against your own personal beliefs when doing so is the right thing to do. Judges must do this; Supreme Court Justices are supposed to do this.
Consistency in these cases is not about always taking the same stance in every situation. It’s actually about the flexibility to be able to apply the appropriate action in every situation. It’s a consistent inconsistency.
(It makes me wonder how cognitive dissonance theory maps onto those specific roles and careers.)
The Bias of Bias
I’m so often caught in binaries—thinking of the world in terms of two options: on or off. Yes or no. Good or bad. Black or white.
In theory, we can draw out models and make arguments for these binary systems, but in practice, there is so much more “gray” than we admit. We live as complex beings within complex systems, and although merisms and polarity can help us understand things more quickly at first, the more we explore, the more nuance we'll find.
The real crime of consistency is the false advertising. Consistency is a great factor in judging something’s value or trustworthiness, but it can’t be the only factor, because it hides so much.
We see this through technology all of the time, especially with social media. The “Instagram” life illustrates it well: always happy, beautiful, perfect. Filtered indeed.
Cognitive dissonance theory implies that though we may perceive this “consistency” as an observer, we are not perceiving all of the rationalization that is happening in order to appear consistent.
The more I explore topics like this in psychology and human behavior, the more I realize that the binary of “good or bad” just doesn’t serve me well. I feel inclined to shame the rationalization we use to rid ourselves of cognitive dissonance, but that judgment is overly self-righteous and hypocritical.
Instead, what is needed is an awareness of these tendencies, biases, logical fallacies, etc. Then we need an acceptance that these things will be reflected in ourselves. After all, there's a reason these things occur—they may not be helpful all of the time, but for whatever reason, they benefited the brain in some way. Now it's up to us to decide when to go deeper, when to take shortcuts, when to change.
Ironically, just knowing about biases or fallacies does not prevent us from falling into them. It does provide us with excellent prompts for introspection, though.
Even knowing that we are fallible, that we may be wrong, and that we’re still here and still trying is powerful. We learn, we adapt, we change over time. Change is the actual goal.
It's consistent inconsistency.