Anchoring in a Social Context: How the Possibility of Being Misinformed by Others Impacts One's Judgment

Contextual Debiasing and Critical Thinking: Reasons for Optimism

Topoi, 2016

In this article I argue that most biases in argumentation and decision-making can and should be counteracted. Although biases can prove beneficial in certain contexts, I contend that they are generally maladaptive and need correction. Yet critical thinking alone seems insufficient to mitigate biases in everyday contexts. I develop a contextualist approach, according to which cognitive debiasing strategies need to be supplemented by extra-psychic devices that rely on social and environmental constraints to promote rational reasoning. Finally, I examine several examples of contextual debiasing strategies and show how they can help enhance critical thinking at the cognitive level.

Constructive biases in social judgment: Experiments on the self-verification of question contents

Journal of Personality and Social Psychology, 1996

Merely thinking about a proposition can increase its subjective truth, even when it is initially denied. Propositions may trigger inferences that depend not on evidence for their truth but only on their semantic match with relevant knowledge. In a series of experiments, participants were presented with questions implying positive or negative judgments of discussants in a videotaped talk show. Subsequent ratings were biased toward the question contents, even when the judges themselves had initially denied the questions. However, this constructive bias is subject to epistemic constraints: judgments were biased only when knowledge about the target's role (active agent vs. passive recipient) matched the semantic-linguistic implications of the propositions (action verbs vs. state verbs).

Peering Into the Bias Blind Spot: People's Assessments of Bias in Themselves and Others

Personality and Social Psychology Bulletin, 2005

People tend to believe that their own judgments are less prone to bias than those of others, in part because they tend to rely on introspection for evidence of bias in themselves but on their lay theories in assessing bias in others. Two empirical consequences of this asymmetry are explored. Studies 1 and 2 document that people are more inclined to think they are guilty of bias in the abstract than in any specific instance. Studies 3 and 4 demonstrate that people tend to believe that their own personal connection to a given issue is a source of accuracy and enlightenment but that such personal connections in the case of others who hold different views are a source of bias. The implications of this asymmetry in assessing objectivity and bias in the self versus others are discussed.

Reasoning about others' reasoning

Journal of Experimental Social Psychology, 2013

► Participants were shown intuitive but incorrect solutions to reasoning problems.
► Some participants were told that these were the responses of other people.
► Others were not told anything about the source of those solutions.
► Participants detected more biases and reasoned better when judging others' responses.
► But this was the case only for participants who showed the bias blind spot.

Collectively jumping to conclusions: Social information amplifies the tendency to gather insufficient data

Journal of Experimental Psychology: General, 2021

False beliefs can spread within societies even when they are costly and when individuals share access to the same objective reality. Research on the cultural evolution of misbeliefs has demonstrated that a social context can explain what people think, but not whether it also explains how people think. We shift the focus from the diffusion of false beliefs to the diffusion of suboptimal belief-formation strategies, and identify a novel mechanism whereby misbeliefs arise and spread. We show that, when individual decision-makers have access to the data-gathering behaviour of others, the tendency to make decisions on the basis of insufficient evidence is amplified, increasing the rate of incorrect, costly decisions. We argue that this mechanism fills a gap in current explanations of problematic, widespread misbeliefs such as climate change denial.
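To make the amplification mechanism described above concrete, here is a minimal simulation sketch. It assumes a simple majority-guessing task; the copying rule (agents adopting the smaller sample sizes they see others use), all parameters, and all function names are hypothetical illustrations, not the study's actual design:

```python
# Hypothetical sketch: does copying others' (small) sample sizes raise error rates?
# The copying rule and parameters are illustrative assumptions, not the study's design.
import random

def decide(n_draws: int, p_true: float = 0.6) -> bool:
    """Guess an urn's majority colour after n_draws samples; True if correct."""
    hits = sum(random.random() < p_true for _ in range(n_draws))
    if hits * 2 == n_draws:           # tie: guess at random
        return random.random() < 0.5
    return hits * 2 > n_draws         # majority of observed draws decides the guess

def trial(social: bool, base_draws: int = 10) -> bool:
    """One decision-maker. With social information, the agent anchors on the
    (often small) sample size it observes others using and gathers less data."""
    if social:
        observed = max(1, round(random.gauss(base_draws * 0.5, 1.0)))
        n = min(base_draws, observed)  # never sample more than the solo baseline
    else:
        n = base_draws
    return decide(n)

random.seed(0)
TRIALS = 20_000
for social in (False, True):
    acc = sum(trial(social) for _ in range(TRIALS)) / TRIALS
    print(f"social={social}: accuracy = {acc:.3f}")
# Expected pattern: lower accuracy (more incorrect, costly decisions) in the
# social condition, because decisions rest on less evidence.
```

Under these assumptions the social condition samples roughly half as much evidence and accuracy drops accordingly; the point is only to illustrate how copying data-gathering behaviour, rather than beliefs themselves, can spread error.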

False polarization: Debiasing as applied social epistemology

Synthese, 2014

False polarization (FP) is an interpersonal bias on judgement, the effect of which is to lead people in contexts of disagreement to overestimate the differences between their respective views. I propose to treat FP as a problem of applied social epistemology—a barrier to reliable belief-formation in certain social domains—and to ask how best one may debias for FP. This inquiry leads more generally into questions about effective debiasing strategies; on this front, considerable empirical evidence suggests that intuitively attractive strategies for debiasing are not very effective, while more effective strategies are neither intuitive nor likely to be easily implemented. The supports for more effective debiasing seem either to be inherently social and cooperative, or at least to presuppose social efforts to create physical or decision-making infrastructure for mitigating bias. The upshot, I argue, is that becoming a less biased epistemic agent is a thoroughly socialized project.

Feeling Validated Versus Being Correct: A Meta-Analysis of Selective Exposure to Information

Psychological Bulletin, 2009

A meta-analysis assessed whether exposure to information is guided by defense or accuracy motives. The studies examined information preferences in relation to attitudes, beliefs, and behaviors in situations that provided choices between congenial information, which supported participants' pre-existing attitudes, beliefs, or behaviors, and uncongenial information, which challenged these tendencies. Analyses indicated a moderate preference for congenial over uncongenial information (d = 0.36). As predicted, this congeniality bias was moderated by variables that affect the strength of participants' defense motivation and accuracy motivation. In support of the importance of defense motivation, the congeniality bias was weaker when participants' attitudes, beliefs, or behaviors were supported prior to information selection; when participants' attitudes, beliefs, or behaviors were not relevant to their values or not held with conviction; when the available information was low in quality; when participants' closed-mindedness was low; and when their confidence in the attitude, belief, or behavior was high. In support of the importance of accuracy motivation, an uncongeniality bias emerged when uncongenial information was relevant to accomplishing a current goal.
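For reference, the d reported above is Cohen's standardized mean difference; the abstract does not spell out the formula, but the standard definition is:

```latex
d = \frac{\bar{X}_1 - \bar{X}_2}{s_{\text{pooled}}},
\qquad
s_{\text{pooled}} = \sqrt{\frac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2}}
```

By the conventional benchmarks (0.2 small, 0.5 medium, 0.8 large), d = 0.36 falls between a small and a medium effect, consistent with the "moderate preference" for congenial information described above.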

When Sources Honestly Provide Their Biased Opinion: Bias as a Distinct Source Perception With Independent Effects on Credibility and Persuasion

Personality and Social Psychology Bulletin, 2019

Anecdotally, attributions that others are biased pervade many domains. Yet research examining the effects of perceptions of bias is sparse, possibly because some prior researchers conflated bias with untrustworthiness. We sought to demonstrate that perceptions of bias and untrustworthiness are separable and have independent effects. The current work examines these differences in the persuasion domain, but the distinction has implications for other domains as well. Two experiments clarify the conceptual distinction between bias (skewed perception) and untrustworthiness (dishonesty), and three studies demonstrate that source bias can have a negative effect on persuasion and source credibility beyond any parallel effects of untrustworthiness, lack of expertise, and dislikability. The current work suggests that bias is an independent but understudied source characteristic.