Prologue: Where Is the Truth?
Late at night, Ella's phone vibrated again. In the family chat group, her aunt had forwarded an article about vaccine dangers, with the caption: "Did you see this? Too many children have problems after vaccination! Everyone must be vigilant!"
Ella took a deep breath and pulled up the authoritative medical study she had prepared in advance: detailed data, rigorous logic. She took screenshots, sent them to the group, and @-ed her aunt: "Auntie, this is the latest large-scale study. The data shows that vaccines are safe and effective."
Ten minutes later, her aunt replied: "These so-called experts have all been bought off by interest groups."
Ella was stunned. Not because of her aunt's stubbornness (she was used to that by now), but because she suddenly realized: in this era of information explosion, we have unprecedented access to the truth, yet we live inside truths of our own more than ever before.
This is not an isolated case. On social media, debates about climate change, genetically modified foods, and education policy are staged every day. Everyone holds "facts," everyone has "evidence and reason," but no one is persuaded. Debate is not a bridge to consensus, but an expressway to division.
This forces us to face an unsettling question: why do we remain stubborn even when the facts are right in front of us? Is there a problem with logic, or does human nature inherently resist change?
Perhaps the answer to this question is more complex, and more disturbing, than we imagine. For what truly prevents us from accepting the truth is not a lack of intelligence, but the very rationality we pride ourselves on as our weapon.
The Cognitive Level: How Rationality Becomes an Accomplice of Bias
Confirmation Bias: The Cycle of Self-Proof
In his late-night study, Thomas rubbed his tired eyes while searching for information online. As an experienced engineer, he considered himself a rational person. Yet at this moment he found himself clicking only on links that supported his views and skimming quickly past opposing voices, even when those opposing opinions came from authoritative journals. He knew what he was doing, but he couldn't stop. Evidence that matched his expectations was a sweet temptation; it made him feel reassured and correct.
This experience is not unique to Thomas. Every one of us lives in this paradox: self-proclaimed rational beings, yet often prisoners of irrational thinking.
In the 1960s, the psychologist Peter Wason documented a disturbing phenomenon: once people form an opinion, they unconsciously seek evidence that supports it while ignoring information that contradicts it. This tendency came to be called "confirmation bias."
In Nickerson's classic review, confirmation bias is described as the seeking or interpreting of evidence in ways that are partial to existing beliefs, expectations, or a hypothesis in hand. We are not impartial seekers of truth; we are more like lawyers carefully screening evidence to prove a predetermined conclusion.
This bias has deep evolutionary roots. When our ancestors on the savannah heard rustling in the grass, those who immediately assumed a predator and fled, even if nine times out of ten it was a false alarm, survived at far higher rates than those who optimistically concluded "it's just the wind." Suspicion and confirmation were once life-saving tools in our evolutionary history.
However, in modern society, this mechanism often backfires. Social media algorithms reinforce our existing views, wrapping us in comfortable echo chambers. We like content we agree with, block dissenting voices, and sink deeper into the cycle of self-confirmation.
Even more puzzling is that even when faced with solid counter-evidence, people still find it difficult to abandon false beliefs.
Belief Perseverance: When Errors Become Part of Self
In a 2023 study, Siebert and colleagues reported a striking finding: even after false information was explicitly retracted, 68.5% of participants still held on to the views they had formed on the basis of it. Just like Thomas, who knows that certain sources are unreliable yet still finds it hard to shake his inner certainty.
Why is this? Because our beliefs are not merely information storage in the mind, but components of self-identity. Abandoning a belief, especially those we have publicly expressed and argued for, is like denying a part of ourselves.
The philosopher William James observed that being forced to abandon a proposition one has long and firmly believed can hurt no less than physical pain. We would rather maintain a false belief than endure the psychological pain of cognitive dissonance.
How Rationality Becomes an Accomplice of Bias
Most ironically, our rational abilities, far from correcting these biases, often become their accomplices.
When we receive information that conflicts with existing beliefs, the brain doesn't simply accept or reject it. Instead, it activates a sophisticated defense mechanism: questioning the credibility of counter-evidence, finding loopholes in explanations, mobilizing all cognitive resources to maintain the original view. The smarter we are, the more we can build solid logical fortresses for our biases.
This is why highly intelligent people can also fall into superstition and conspiracy theories: they just use more complex reasoning to defend their biases.
In a 2020 study, Mischel and Peters found that confirmation bias exists not only in high-level reasoning but also in basic perception and confidence judgments. Even in simple visual judgment tasks, once people make an initial decision, their subsequent perceptual processing becomes biased toward evidence that supports it. The same bias appears in macaques and rats, suggesting it may be a core feature of how brains process information rather than a failure of rationality.
From "Anti-Vaccine" to Political Stances: Facts Don't Change Beliefs, Beliefs Filter Facts
Returning to the argument between Ella and her aunt: her aunt believes vaccines are harmful not because she lacks information. Quite the contrary: she has a great deal of "evidence," drawn from bloggers she trusts, from articles forwarded countless times, and from parents who report that their "children showed abnormalities after vaccination."
When Ella presented authoritative medical research, what did her aunt's mind do? It did not evaluate the data objectively; it immediately activated its defenses: these experts have been paid off, the sample size is too small, something must be hidden behind those numbers.
Psychologists have found that when evaluating evidence, people give higher weight to information supporting existing beliefs while tending to question the credibility of opposing evidence or simply ignore it. This is not deliberate deception but the automatic operation of the cognitive system.
In the political arena, this phenomenon is even more pronounced. The same economic data can be read by conservatives as "the success of market liberalization" and by progressives as "the widening of wealth gaps." Studies show that when people encounter information conflicting with their existing attitudes, they often not only fail to change their positions but become even more firmly entrenched in them, a phenomenon known as the "backfire effect."
Truth is no longer a conclusion reached through logical deduction but an endpoint predetermined by belief. We are not searching for truth, but seeking proof of what we already believe.
The Possibility of Breaking Bias?
Faced with such deep-rooted cognitive biases, are we helpless?
Research offers a glimmer of hope. Studies comparing three ways of correcting belief perseverance found that "counterstatement" (directly providing arguments for the opposing viewpoint) was the most effective, better at breaking the bias than simply asking people to think the issue through on their own. "Awareness training" (teaching people to recognize belief perseverance itself) was somewhat less effective, but its ease of implementation gives it broad practical value.
This means that while we cannot completely escape cognitive biases, through specific mental training and external intervention, we can reduce their impact to some extent.
But the key question is: are we willing to accept such training? Or will our pride tell us: "Others need to correct their biases, but I don't"?
The Emotional Level: To Be Persuaded Is to Have the Self Questioned
When Beliefs Become Part of Self
In 1956, the social psychologist Leon Festinger recorded a thought-provoking case: the leader of a religious group had prophesied that the world would be destroyed by a flood on a specific date, and that only believers would be rescued by aliens. The believers quit their jobs and gave up their possessions, waiting devoutly.
The prophesied date passed, and no flood occurred.
Yet, surprisingly, the believers' loyalty to the leader did not decrease; it increased. Their explanation: it was precisely because of their devout prayers that God had withdrawn the punishment.
Festinger explained this phenomenon with his theory of "cognitive dissonance": when facts collide violently with beliefs, people do not easily abandon those beliefs; they double down, because admitting error would mean the collapse of an entire worldview. This is not simple stubbornness but emotional self-protection.
When a person says "I was wrong," they are not merely correcting a cognitive error; they are taking an emotional risk, admitting that the worldview on which their sense of self was built may have been standing on quicksand.
The Emotional Cost of Cognitive Consistency
Indiana University professor Leah Savion, in "Clinging to Discredited Beliefs: The Larger Cognitive Story," points out that our brains naturally pursue cognitive consistency. This pursuit is not only a cognitive labor-saving strategy but also an emotional survival mechanism.
The "pet theories" we develop——worldviews formed based on personal experience——become our maps for navigating this complex world. Questioning these theories is like questioning our own navigation abilities.
Savion proposes a refined model: our cognitive machinery is governed by principles of economy and balance. To process vast amounts of information efficiently, the brain has developed an array of heuristics and cognitive shortcuts. These mechanisms are extremely effective in daily life, but they carry an unavoidable side effect: belief fixation.
When we encounter evidence conflicting with existing beliefs, emotional discomfort triggers a subtle defense mechanism: distorting information, devaluing it, or limiting its validity to specific contexts.
Why Is "I Was Wrong" So Difficult?
Modern neuroscience research provides new perspectives for understanding this phenomenon. A 2016 study published in Scientific Reports, a Nature journal, found that when people process information conflicting with their beliefs, brain regions associated with physical pain are activated.
This means cognitive dissonance is not merely psychological discomfort; it has a real physiological basis. The regions activated when we admit error partially overlap with those active when the body is injured, which may explain why being proven wrong feels so genuinely painful. The brain seems to be warning us in the most primitive way it knows: a threatened belief registers as a threat to survival.
Research on "self-concept" points in the same direction: people who view intelligence as a fixed trait find it harder to accept criticism and admit errors, because for them a mistake means "I'm stupid," whereas those who view intelligence as something that can be developed are more likely to treat errors as learning opportunities.
This indicates that our understanding of "self" profoundly affects our ability to accept new information.
The Emotional Ecosystem of Beliefs
Professor Savion views belief perseverance as part of a cognitive ecosystem, not a defect that needs to be eradicated. She believes these "stubborn" beliefs exist because they serve important psychological functions: maintaining a sense of control, providing a coherent worldview, protecting a fragile self.
From this perspective, attempting to change a person's beliefs is like attempting to change an ecosystem's balance. Simply providing more factual evidence often backfires because it threatens the entire system's stability. Real change requires understanding the system's operating rules and finding ways to introduce new elements without disrupting overall balance.
Each of us has a complex "psychological immune system" inside that is responsible for reducing cognitive dissonance, balancing resource management, maintaining default states, and generating and maintaining self. When we encounter information challenging our beliefs, this system quickly activates, using various strategies to minimize threats.
The Gentle Revolution: How to Coexist with Stubborn Beliefs?
If forceful persuasion often backfires, how should we promote real change?
Savion proposes the concept of "super active learning": not telling people the right answer, but creating experiences that let them discover the limitations of their existing beliefs for themselves.
The core of these methods is lowering defenses. When a person is not being "taught" but is "teaching others," their emotional defenses naturally lower. When they need to explain a concept to someone else, they must face holes in their own understanding; when they participate in role-playing, they can see problems from another perspective; when they draw concept maps, they can intuitively see contradictions in their belief system.
The subtlety of this approach is that it respects the emotional dimension of belief systems. It doesn't directly attack with "you're wrong" but lets individuals reach conclusions through experience. This process still involves discomfort, but this discomfort is part of self-discovery, not an attack from outside.
Embracing the Imperfect Knower
Understanding the emotional dimension of beliefs ultimately gives us more compassion for ourselves and others.
Behind seemingly "irrational" attachments to beliefs often lie deep needs for coherence, control, and self-worth. When we mock others for being "stubborn," perhaps we should recall our own inner struggle the last time we had to admit a major error.
True wisdom may not lie in always being right but in the ability to peacefully coexist with our errors; not in having an impeccable belief system but in maintaining the openness and flexibility of the system.
When we can say "I was wrong in the past," we demonstrate not only intellectual honesty but also a kind of emotional courage: the willingness to face that imperfect self and still cherish its capacity for growth.
In this sense, the purpose of education should not only be to instill correct knowledge but also to cultivate the ability to coexist with uncertainty, a resilience that does not collapse when beliefs are challenged, and the confidence to say "I was wrong" without feeling that one's self-worth is damaged.
This may be the deepest intersection of cognitive science and humanistic spirit——while understanding how we know, we also care for that fragile, complex person who is knowing.
The Social Level: We Believe the Ones We Belong To
The Imprint of Identity: Why We Belong to "Us"
Ella still remembers that when she was young, truth was the ink scent emanating from the thick encyclopedia in her grandfather's study. A fact, in black and white, beyond dispute.
Now, truth is like quicksand, constantly shifting under her fingertips on the screen. The family chat group forwards "health knowledge" that contradicts everything she knows; her college friends share political declarations she cannot comprehend. Every attempt to refute them with "facts" brings not rational discussion but deeper estrangement, as if she were challenging not just a viewpoint but the people themselves.
She feels an unprecedented loneliness. Not that there's no one around, but that she and those once close to her seem to be living in two completely different worlds.
This is not Ella's predicament alone. We live in an era of "cognitive tribalization," in which beliefs are often no longer a rational inquiry into "what is true" but an extension of identity, of "who I am." Our sense of group belonging shapes, with unprecedented force, where the boundaries of truth lie in our minds.
We Confirm "I" Within "Us"
Philosophy has never stopped exploring the "self." From Descartes' "I think, therefore I am" to the communitarian critique that a "self" detached from social relations is an illusion, one insight recurs: we do not first form a complete, independent "I" and then choose to join groups. On the contrary, we gradually confirm "I" within "us."
Family, region, party, class, even the teams we support and the stars we love: these identity labels, like nested Russian dolls, together constitute our coordinates in this world.
Psychologist Henri Tajfel's famous "minimal group paradigm" experiments revealed this instinct with almost brutal simplicity. He divided strangers into two groups on the basis of trivial, essentially arbitrary criteria, with no real interaction and no shared interests. The result: people immediately favored members of their own group and were willing to allocate more resources to them, while belittling members of the out-group.
This means that dividing the world into "us" and "them," and thereby producing in-group favoritism and out-group prejudice, is a deeply rooted mechanism of human psychology. It once helped our ancestors pool their strength and survive in harsh natural environments. But in today's flood of information, this ancient survival instinct may have become a shackle that keeps us from seeing reality clearly.
Echo Chambers and Belief Fortresses: When Identity Trumps Facts
Social media, the tool that was supposed to connect the world, has, under the precise operation of its algorithms, pushed this tribalization to the extreme. It is no longer an open square but countless carefully constructed "echo chambers" and "filter bubbles."
Here we hear, over and over, the echoes of our own views and see information that has been screened to reinforce what we already believe. Dissenting voices are quietly blocked, and the image of "them" grows ever stranger, even frightening, as it is passed along in ever more simplified and distorted form.
In this environment, the focus of discussion has fundamentally shifted. We are no longer debating in order to explore the facts; we are fighting a battle to defend our identity. Whether we support or oppose a view no longer depends on its logic and evidence but on what "our side" thinks and how "their side" opposes it.
Philosophers have long seen that cognition has never been pure. As Nietzsche said: "There are no facts, only interpretations." Our perspective, our preconceptions, determine what we can see. When our perspective is monopolized by a single group identity, the diversity of interpretation disappears, replaced by a forced consensus.
Motivated Reasoning: When Facts Pale Before Group Boundaries
At this point a thought-provoking psychological phenomenon appears: motivated reasoning. The brain is not a computer objectively processing information; it behaves more like a lawyer who has settled on a conclusion in advance (usually the one held by our identity group) and then does everything possible to gather supporting evidence and discredit contrary clues.
When iron-clad facts conflict with our group's core beliefs, what is sacrificed is often the facts.
This is not because people are stupid or unreasonable, but because the cost of accepting the facts might be psychological expulsion from the tribe of "us." That risk of social death is far more frightening than the cost of admitting a cognitive error.
Research shows that when people encounter information that conflicts with their group's beliefs, brain regions involved in emotional processing become markedly more active, while regions involved in rational analysis stay relatively quiet. We are not rationally weighing evidence; we are emotionally resisting a threat.
Therefore, in the face of the enormous gravitational pull of group boundaries, the light of facts and logic often dims. Defending "who we are" has more survival urgency than recognizing "what is true."
The Twilight of Rationality? Finding Human Light in Group Belonging
So, does this mean the twilight of rationality has arrived? Are we doomed to be trapped in fortresses built by our own identities, throwing verbal stones at "others" across high walls?
Looking at the long river of history, the answer doesn't seem entirely pessimistic.
The most brilliant moments in human civilization occurred precisely when individuals could temporarily step outside their own groups and examine the world from a broader vantage point. Socrates drank the hemlock in defense of a pursuit of truth that went beyond the prejudices of his city-state; Copernicus and Galileo challenged a picture of the universe that almost the entire "us" of their time believed.
Behind this is a deeper psychological dynamic: the desire for "belonging" and the desire for "reality" are equally rooted in human nature. We desire to belong to a family, a community, and likewise, we desire to establish connections with the real world. This desire for reality is the source from which science, art, and philosophy are born.
The key is: can we preserve a place for "my" independent judgment amid the siege of "us"? Can we, while defending identity, not completely close the window for dialogue with the outside world?
Multiple Identities: The Glimmer of Breaking Echo Chambers
Observing people who manage, to some degree, to resist extreme tribalization, one quality recurs: they hold multiple, intersecting identities.
A person can simultaneously be a supporter of a political party, a scientist, an environmentalist, and a parent. When these identities are activated in different situations, they check and balance one another, preventing any single identity from completely dominating the person's cognitive framework.
When the scientific facts about climate change (valued by the scientist identity) conflict with the mainstream view of one's party, concern, as a parent, for the next generation's future may become the fulcrum that cracks the wall of the echo chamber.
This glimmer comes not from cold chains of logic but from our common humanity: that primitive ability to perceive others' suffering, to be moved by beautiful stories, and to remain curious about the unknown.
When we talk about a person with different political views, if we don't reduce them to the label "enemy" but imagine that they, like us, would be moved by a child's first smile and would weep at the loss of a loved one, then a crack appears in the high wall.
Understanding doesn't equal agreement, but it is the starting point of any meaningful dialogue.
A Small, Fragile Bridge
In the end, Ella didn't send that long message full of data and citations to the family chat group.
She chose to privately message her aunt: "Auntie, I saw the article you forwarded and am worried about your health. Have you been sleeping well lately? Let's go to the park together this weekend."
She understands that she cannot instantly demolish, with one scientific report, a wall built over decades of emotion and identity. But she can try to step past the label of "someone who holds different views" and reactivate the older, more resilient bond between them: that of family.
This is not abandoning the pursuit of truth but practicing it in a more complex, more humane way: recognizing that "we believe the ones we belong to" is a tremendous social force, and still believing that, across the chasm between "us" and "them," a small, fragile, but precious bridge can be built.
When facts pale before group boundaries, what we need may not be louder voices but ears that know how to listen better, and a language with warmth that can cross identity barriers.
Epilogue: Wisdom of Coexisting with Bias
Thomas shut down his computer, stood up, and walked to the window. Outside, the night surged like a tide. The stars flickered coldly, indifferent to human disputes. And yet, for some reason, looking at those distant lights, he felt a strange calm.
He thought of his argument with Ella.
He thought of those data, those arguments, those moments when each side felt they held truth.
He also thought of what Ella had said before hanging up: "Maybe we're both right, just seeing different parts of the same elephant."
That sentence, like a small crack, let a ray of light into his solidified thinking.
Not "Either you're right or I'm right."
But "Maybe we're all only seeing part of the truth."
This recognition didn't make him immediately change his views. But it allowed him, for the first time, to truly consider the possibility: what if those who disagree with me aren't stupid or malicious, but just seeing from a different angle?
The Ultimate Wisdom in Dealing with Cognitive Bias
Perhaps the ultimate wisdom in dealing with cognitive bias lies not in eliminating it (a goal that may forever remain out of reach) but in learning to coexist with it, recognizing that each of us is a "biased rational being." Just as the deepest courage lies not in fearlessness but in pressing forward despite fear, and the deepest love lies not in never being hurt but in choosing to trust again after being hurt.
The deepest rationality lies in acknowledging rationality's limits.
As the philosopher Karl Popper argued, true rationality lies not in clinging to one's beliefs but in the willingness to question them.
The beauty of this statement lies not in providing an answer but in guarding a space: a space where one can say "I'm not sure," a space where one can say "maybe I was wrong," a space where one can say "let's think together."
Each of us is a "biased rational being." This is not a defect; it is the basic texture of being human. Our biases are the paths we have walked, the people we have loved, the wounds we have suffered, the proof that we have lived in this world.
The problem is not that we have biases. The problem is: do we have the courage, at a certain moment, to stop and whisper to ourselves:
"This may just be my bias." "I might be wrong." "Let me think again."
These few words, light as a feather, yet heavy as a mountain.
The Possibility of a Seed
When we understand why humans are so difficult to persuade, when we see clearly those mechanisms that prevent truth from arriving, when we admit we too are biased rational beings,
We might be able to start thinking about another question:
If fortresses under frontal attack cannot be breached, does another possibility exist?
Not bombarding the closed gates with cannon fire, but gently passing a seed over the wall and waiting for it, in its own time, to sprout and grow, until one day that solid wall is quietly pried open from within by soft, living growth, and a crack appears.
Light will come in.
Postscript: A Conclusion Without Conclusion
That weekend, Ella and her aunt went to the park.
They didn't talk about vaccines or politics, just chatted about family matters and watched autumn leaves fall.
But on the way home, her aunt suddenly asked: "That study you mentioned last time, can you send it to me again? I want to read it myself."
Ella smiled. She knew this was not the victory of "persuasion" but the beginning of "connection."
In this increasingly divided world, perhaps this is the best we can do: not changing each other but staying connected; not proving who is right or wrong but remembering that we are all equally fragile people, equally longing to be understood.
Persuasion is difficult, but understanding may still be possible.
And sometimes, understanding itself is already the deepest change.
The night is deep.
The stars are still twinkling.
Truth is still there, waiting for those willing to let go of prejudice and open their hearts.
No rush, take it slow.
After all, we're all on the road.
Even as strangers, we may still find our paths converging.
References
Primary Sources
- Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175-220.
- Wolf, I., & Schröder, T. (2024). The critical role of emotional communication for motivated reasoning. Scientific Reports, 14, 31681.
- Festinger, L. (1957). A theory of cognitive dissonance. Stanford University Press.
- Albarracín, D., & Wyer, R. S. (2001). Elaborative and nonelaborative processing of a behavior-related communication. Personality and Social Psychology Bulletin, 27(6), 691-705.
- Savion, L. (2009). Clinging to discredited beliefs: The larger cognitive story. Journal of the Scholarship of Teaching and Learning, 9(1), 81-92.
- Kunda, Z. (1990). The case for motivated reasoning. Psychological Bulletin, 108(3), 480-498.
- Tajfel, H., Billig, M. G., Bundy, R. P., & Flament, C. (1971). Social categorization and intergroup behaviour. European Journal of Social Psychology, 1(2), 149-178.

Cognitive Bias and Belief Perseverance

- Ross, L., Lepper, M. R., & Hubbard, M. (1975). Perseverance in self-perception and social perception: Biased attributional processes in the debriefing paradigm. Journal of Personality and Social Psychology, 32(5), 880-892.
- Lord, C. G., Ross, L., & Lepper, M. R. (1979). Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology, 37(11), 2098-2109.
- Taber, C. S., & Lodge, M. (2006). Motivated skepticism in the evaluation of political beliefs. American Journal of Political Science, 50(3), 755-769.
- Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32(2), 303-330.

Cognitive Dissonance and Self-Concept

- Harmon-Jones, E., & Mills, J. (Eds.). (1999). Cognitive dissonance: Progress on a pivotal theory in social psychology. American Psychological Association.
- Dweck, C. S. (2006). Mindset: The new psychology of success. Random House.
- Harris, S., Kaplan, J. T., Curiel, A., Bookheimer, S. Y., Iacoboni, M., & Cohen, M. S. (2009). The neural correlates of religious and nonreligious belief. PLoS ONE, 4(10), e7272.
- Sharot, T., & Garrett, N. (2016). Forming beliefs: Why valence matters. Trends in Cognitive Sciences, 20(1), 25-33.

Social Identity and Group Dynamics

- Tajfel, H., & Turner, J. C. (1979). An integrative theory of intergroup conflict. In W. G. Austin & S. Worchel (Eds.), The social psychology of intergroup relations (pp. 33-47). Brooks/Cole.
- Kahan, D. M., Peters, E., Wittlin, M., Slovic, P., Ouellette, L. L., Braman, D., & Mandel, G. (2012). The polarizing impact of science literacy and numeracy on perceived climate change risks. Nature Climate Change, 2(10), 732-735.
- Van Bavel, J. J., & Pereira, A. (2018). The partisan brain: An identity-based model of political belief. Trends in Cognitive Sciences, 22(3), 213-224.
- Bail, C. A., Argyle, L. P., Brown, T. W., Bumpus, J. P., Chen, H., Hunzaker, M. B. F., ... & Volfovsky, A. (2018). Exposure to opposing views on social media can increase political polarization. Proceedings of the National Academy of Sciences, 115(37), 9216-9221.

Persuasion and Attitude Change

- Petty, R. E., & Cacioppo, J. T. (1986). The elaboration likelihood model of persuasion. Advances in Experimental Social Psychology, 19, 123-205.
- Cialdini, R. B. (2006). Influence: The psychology of persuasion (Rev. ed.). Harper Business.
- Druckman, J. N., & McGrath, M. C. (2019). The evidence for motivated reasoning in climate change preference formation. Nature Climate Change, 9(2), 111-119.

Neuroscience of Belief

- Westen, D., Blagov, P. S., Harenski, K., Kilts, C., & Hamann, S. (2006). Neural bases of motivated reasoning: An fMRI study of emotional constraints on partisan political judgment in the 2004 U.S. presidential election. Journal of Cognitive Neuroscience, 18(11), 1947-1958.
- Kaplan, J. T., Gimbel, S. I., & Harris, S. (2016). Neural correlates of maintaining one's political beliefs in the face of counterevidence. Scientific Reports, 6, 39589.

Philosophical Foundations

- Popper, K. R. (1959). The logic of scientific discovery. Hutchinson.
- Popper, K. R. (1962). Conjectures and refutations: The growth of scientific knowledge. Basic Books.
- James, W. (1890). The principles of psychology (Vol. 1). Henry Holt and Company.
- Nietzsche, F. (1886/1968). Beyond good and evil: Prelude to a philosophy of the future (W. Kaufmann, Trans.). Vintage Books.

Additional References

- Kahneman, D. (2011). Thinking, fast and slow. Farrar, Straus and Giroux.
- Mercier, H., & Sperber, D. (2017). The enigma of reason. Harvard University Press.
- Haidt, J. (2012). The righteous mind: Why good people are divided by politics and religion. Pantheon Books.
- Sunstein, C. R. (2017). #Republic: Divided democracy in the age of social media. Princeton University Press.
- Wilson, T. D. (2002). Strangers to ourselves: Discovering the adaptive unconscious. Harvard University Press.
- Nisbett, R. E., & Ross, L. (1980). Human inference: Strategies and shortcomings of social judgment. Prentice-Hall.
- Gilovich, T. (1991). How we know what isn't so: The fallibility of human reason in everyday life. Free Press.

Research on Educational Interventions

- Lewandowsky, S., Ecker, U. K., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106-131.
- Cook, J., & Lewandowsky, S. (2011). The debunking handbook. St. Lucia, Australia: University of Queensland.
- Lilienfeld, S. O., Ammirati, R., & Landfield, K. (2009). Giving debiasing away: Can psychological research on correcting cognitive errors promote human welfare? Perspectives on Psychological Science, 4(4), 390-398.