Bad Faith
The other day, a friend of mine was preaching that his country should allow illegal immigrants to stay. “They are just looking for a better life!” he said. He was appealing to an unstated moral assumption: that we owe charity to those in worse circumstances. He was arguing that his country should fulfill this moral obligation.
I said, “What if a homeless man broke into your house, ate food from your fridge, and went to sleep on your couch? Would you let him stay? After all, he is just looking for a better life.”
My friend seemed confused. He said, “That’s different…”, but then he stopped, because he couldn’t explain how it was different.
I knew what the difference was. There were no costs associated with his support of illegal immigration. He had no significant effect on the policies of his country. So, even if illegal immigration were harmful to him, he had no incentive to politically oppose it. Also, the costs of illegal immigration are paid by all members of society, while the benefits of virtue-signaling go to the signaler. His political virtue-signaling was a selfish attempt to claim moral status, at no cost to himself, and at the expense of his country. By contrast, if he were charitable in his personal life, he would pay the full cost himself. So, he only activated the moral principle of charity in a political context, where he didn’t have to pay the cost. He was politically altruistic, but personally selfish.
See Democracy is a Tragedy of the Commons.
He was not aware of his hypocrisy. He was genuinely confused when I pointed it out to him. He believed that he was altruistic, and that his political views were simply an expression of his character. He was self-deceived.
Dishonesty can be conscious, subconscious or somewhere in between.
A lie is consciously dishonest. The liar says something that he does not believe. In the moment of lying, he has the lie in his mind, as an idea, but he views it as false. The liar is fully aware that he is lying.
Imagine a man who has committed murder, and is now being questioned by the police. He knows that he has committed murder, but he denies it. To support that big lie, he generates many little lies. He presents a false alibi. As the evidence is shown to him, he incrementally changes his story to fit the evidence, but he continues to insist that he did not commit the crime. He makes a conscious effort to deceive the police.
Lying is difficult. It requires a lot of mental effort, because the liar has to keep two different models in his mind simultaneously: what he actually believes, and what he wants others to believe. Subconscious dishonesty avoids that problem, because only one idea is present in the mind.
Self-deception does not reside in ideas. It resides in how ideas come to mind. It is in the frame of thought, not thought itself. It is learned and applied subconsciously.
Hypocrisy is often subconscious. The hypocrite has learned to activate different moral frames in different situations. This happens below the level of conscious awareness.
That’s why my friend was not aware of his hypocrisy, and was confused when I pointed it out. His political beliefs had been subconsciously selected to serve his interests. They were only active in a context where virtue-signaling was socially useful. When his personal wealth or comfort was at stake, a different moral frame was activated.
We think inside frames. A frame is a subconscious view of a situation or issue. In conscious dishonesty, the dishonesty is inside the frame. It is in the awareness of the liar. In subconscious dishonesty, the dishonesty is in the frame itself, or in the selection of a frame. It is hidden from the self. The mind is deceived by the brain.
You can expose subconscious dishonesty (yours or someone else’s) by creating a new frame that includes the dishonesty. This new frame usually involves a detached perspective: viewing the self in a scientific or philosophical way. The self is brought into the frame as an object of inquiry. The dishonest frame is also brought into the new frame as an object of inquiry.
With my friend, I did this by pointing out his hypocrisy. Normally, the political and personal were compartmentalized. They were different frames, which were activated in different situations. By juxtaposing them, I exposed his dishonesty.
When subconscious dishonesty is exposed (usually by someone else), the dishonest person becomes confused. He experiences cognitive dissonance. This is caused by the conflict between conscious and subconscious belief.
My friend believed that he was altruistic, but he had never thought about it. He took it for granted, subconsciously. When I exposed this belief as false and self-serving, he was deeply confused. In the new frame, he could see his selfishness, but he still intuitively believed that he was altruistic.
Most beliefs are superficial and easily changed. If my friend believed that it was raining, and I showed him the blue sky, he would quickly change his mind. He might be slightly embarrassed at his mistake, but he would acknowledge it.
Other beliefs are harder to change, because they have deeper roots.
Subconscious dishonesty does not immediately disappear when it is exposed. It was not created by rational thought, so it cannot be easily destroyed by rational thought. It takes time and effort to overcome subconscious dishonesty. To unlearn it, you must expose it to honest, critical examination, from a detached perspective. This can be painful.
When subconscious dishonesty is exposed, people usually react defensively. They try to rationalize it, or they avoid discussing it. They often become hostile.
People often use dishonest tactics to defend their political, moral and religious beliefs. They do mental gymnastics. They quibble over terms. They use insults and personal attacks. Sometimes, they just assert the false belief over and over, emphatically. When all else fails, they simply run away from the debate, and from the issue. These tactics could be mostly subconscious: a learned response pattern. However, they seem to involve some degree of conscious dishonesty.
It is hard to change deeply entrenched beliefs, especially in the heat of the moment. An entrenched belief has a strong subconscious basis. It feels right, even if it is demonstrably wrong. This conflict creates cognitive dissonance. People often try to resolve such conflicts with post hoc rationalization.
The attempt to rationalize a dishonest belief is analogous to the confabulations of a murderer trying to deny his guilt. The murderer will change his story if the police present him with more evidence. Similarly, the rationalizer will change his argument if it fails. If one argument is defeated, he will employ another, and then another. He will eventually retreat into a rhetorical swamp, such as tactical nihilism, rather than admit his “guilt”. Over time, his bad faith becomes more and more obvious.
Political, moral and religious beliefs are usually dishonest. They are not used to model reality, or solve practical problems. They are used to signal virtue, define identity, and advocate for oneself. These social functions pull them away from the truth. They are not selected to be true. They are selected to be socially useful.
The distinction between delusion and deception is fuzzy, especially for self-deception.
In Social Delusions, I explained how social dynamics can generate mass delusions. Social delusions often involve some degree of deception. For example, most crowd manias involve status-signaling. People are not just fooled into adopting false beliefs. They adopt false beliefs to fool others: to create a pretense of virtue or intellect.
Self-deception is both a delusion and a deception. The self is deluded, subconsciously, into having false beliefs about itself. This delusion is selected to deceive others. The others are also engaged in deceiving themselves and others. Individual self-deception, such as my friend’s hypocrisy, is usually linked to a mass delusion/deception, such as morality. People collaborate to deceive themselves.
Subconscious dishonesty is not limited to political, moral and religious beliefs. It can occur whenever a belief has social functions. A scientist whose career is based on a certain theory will find it very hard to admit that the theory is wrong. He will find ways to avoid that conclusion. In doing so, he might not be aware of his dishonesty.
There is an important distinction between inquiry and advocacy. Through inquiry, we seek understanding. Through advocacy, we seek to advance our interests within society. Inquiry and advocacy are different modes of thought. They proceed in opposite directions. Inquiry seeks the best conclusion, given evidence and arguments. Advocacy begins with a conclusion, and then seeks evidence and arguments to support it.
Advocacy is a big part of ordinary life. Knowledge is shaped by how we use it. If knowledge is used to support advocacy, it will be shaped by that function. Advocacy distorts our knowledge of reality, especially regarding ourselves and society. We develop false knowledge, because it is socially useful.
Most of our social interactions are dishonest in some way. We mask our feelings and thoughts. We interact through a veil of politeness. We try to shape how others view us. We conform to the expectations and desires of others, while pretending that we are simply “being ourselves”. We pretend that we are intrinsically “good”, rather than controlled by incentives.
This outer layer of deception propagates inward. It would be too difficult to maintain if we were fully aware of it. So, we learn to be dishonest at a deep level.
Although self-deception is mostly subconscious, it always involves some degree of awareness and will. We might not always be aware of how we deceive others, but we know that we are not entirely honest, and we have the ability to critique ourselves. Yet, very few people look at themselves critically. To some extent, that is a choice.
Society creates a fog of delusion and deception. It not only puts us in bad faith with each other; it puts us in bad faith with ourselves.