By Guy Hochman
The Illusion of Rationality
We are all living examples of bias. We buy things we don’t need (or overpay for the things we do), defend decisions we don’t fully believe in, and convince ourselves that the cause justifies any means.
For decades, economists and psychologists have portrayed humans as Homo economicus—rational agents of calculation—or, more charitably, Homo heuristicus—efficient simplifiers navigating a complex world.
But neither description quite captures who we are. We are not cold calculators, nor purely intuitive minimalists. We are Homobiasos: the species that rationalizes with eyes wide shut, weaving stories that make our choices appear reasonable, moral, and coherent—even when they are not.
We do not simply fail to see reality; we interpret it through comforting filters. We make decisions, then justify them. We deceive ourselves not because we are “irrational,” but because we crave meaning and consistency. Our eyes are wide open physiologically, yet psychologically half-closed.
From Information Processing to Narrative Defense
Research on decision-making often assumes that analytic thinkers make fewer errors, while intuitive ones are biased.
Yet even when outcomes are suboptimal, the underlying process is often integrative. Process-tracing studies demonstrate that people consider multiple cues, testing competing explanations rather than relying on any single shortcut.
As Ayal and Hochman showed, participants in complex choice tasks often compared options across dimensions before settling on one, suggesting that bias arises not from laziness but from motivated coherence.
This integrative, story-driven reasoning lies at the heart of Homobiasos. We are not machines that process information; we are storytellers aligning facts with our preferred self-image.
Two Minds, One Motive
Dual-system theories, which portray human reasoning as a constant shift between System 1 (intuitive) and System 2 (deliberative), still dominate decision science. Yet this dichotomy oversimplifies the mind’s complexity. In a recent comprehensive review, I demonstrate how intuition can integrate information elegantly, while deliberation can distort it.
In line with these claims, Ayal et al. and Krava et al. found that deliberation often increases the weight of irrelevant cues, legitimizing poor judgments through eloquent reasoning. By contrast, in previous work, we showed that reliance on intuition can actually reduce biases.
Both systems serve the same master: the need to feel internally justified. We reason not to correct our instincts, but to defend them.
Rationalization as Moral Anesthesia
This defensive logic also governs our moral decisions.
Most people consider themselves honest, yet cheat just enough to profit without feeling corrupt, as a recent review shows. This balance is maintained mainly through justifications: the stories we tell ourselves to explain why doing bad is actually good. The mechanism is so powerful that participants in our studies who lied for altruistic reasons (to benefit a charity of their choice) cheated more yet displayed lower physiological arousal, even fooling an experienced lie-detection examiner.
Our findings suggest that justifications are effective because they alleviate the psychological tension we experience when our values conflict with our actions.
The mind, it seems, can reframe dishonesty as virtue and suppress its emotional cost. This tension between values and actions, known as ethical dissonance, motivates people to alter their perception rather than their behavior.
As Nietzsche warned, “We lie to ourselves more than we lie to others.” Moral rationalization, in that sense, is not corruption but self-protection through motivated reasoning and narratives.
This is Homobiasos at work: reason as psychological anesthesia, dulling the pain of contradiction.
The Comfort of Not Knowing
Sometimes, self-protection takes the form of selective blindness.
In research on financial decision-making, people often avoid information that could threaten their sense of competence or fairness. In a recent review, we demonstrate that motivated ignorance in pension decisions is not merely apathy or a cognitive bias, but rather a coping mechanism: a means of preserving dignity amid systemic mistrust.
This avoidance is not ignorance in the trivial sense; it is an existential strategy. To admit uncomfortable truths or uncertainty is to admit vulnerability, and few minds tolerate that for long. As Mark Twain so eloquently observed, “The truth has no defense against a fool determined to believe a lie.”
Ignorance serves as a moral justification to avoid uncomfortable truths while maintaining self-worth. We prefer coherence to accuracy, self-respect to insight. The refusal to know is itself a bias—one that feels perfectly rational and “correct” from the inside.
Anxiety and the Modern Mind
Even our relationship with technology follows the same logic.
In another paper, we identified two fears triggered by exposure to AI: anticipatory anxiety (the fear of change) and annihilation anxiety (the fear of self-annihilation). Our results revealed a U-shaped curve: moderate engagement with AI reduces anxiety, while both avoidance and overexposure heighten it.
We regulate our engagement with technology not for the sake of truth or progress, but for psychological equilibrium. We explain our fear as logic and our hesitation as prudence, yet both stem from the same desire to maintain emotional balance.
Why We Rationalize
Across all domains, from choice to morality to technology, the same pattern emerges.
Humans are not driven by logic but by the preservation of self-coherence. Biases are not simply errors based on gut feelings; they are emotional compromises between truth and identity.
We distort facts to stay whole, not to deceive. Biases become a form of meaning management: the narrative mechanism that keeps societies functional, relationships stable, and selves intact.
The danger lies not in having biases but in failing to recognize that they might be self-serving stories.
Eyes Wide Shut
To recognize ourselves as Homobiasos is not to despair of reason but to understand its function. Rationalization is the price of psychological stability, the mind’s way of keeping us whole amid a complex and contradictory reality.
Yet awareness matters.
If we can notice when our stories comfort rather than clarify, we can choose when to open our eyes.
The mirror of Homobiasos reflects not stupidity, irrationality, or bad morals, but the true nature of humanity. Our stories are not bugs in the system but features.
We may never be purely “rational,” but we can at least choose to be more honestly biased.
This article was edited by Lachezar Ivanov.

