How Does Reasoning (Fail to) Contribute to Moral Judgment? Dumbfounding and Disengagement

The psychology of moral reasoning

2008

This article presents a theory of reasoning about moral propositions that is based on four fundamental principles. First, no simple criterion picks out propositions about morality from within the larger set of deontic propositions concerning what is permissible and impermissible in social relations, the law, games, and manners. Second, the mechanisms underlying emotions and deontic evaluations are independent and operate in parallel, and so some scenarios elicit emotions prior to moral evaluations, some elicit moral evaluations prior to emotions, and some elicit them at the same time. Third, deontic evaluations depend on inferences, either unconscious intuitions or conscious reasoning. Fourth, human beliefs about what is, and isn't, moral are neither complete nor consistent. The article marshals evidence, including new studies, that corroborates these principles, and discusses the relations between them and other current theories of moral reasoning.

How (and where) does moral judgment work?

Why do we care so strongly about what other people do, even when their actions won't affect us? And how do we decide that someone else has done something wrong? These questions are at the heart of moral psychology, and psychologists' answers to these questions have changed with intellectual fashion. Historically, psychologists have disagreed about whether moral judgments are primarily products of emotional and non-rational processes (such as Freudian internalization or behaviorist reinforcement) or of reasoning and 'higher' cognition (as in Piaget's and Kohlberg's post-conventional reasoning). Recently, however, findings from several areas of cognitive neuroscience have begun to converge on an answer: emotions and reasoning both matter, but automatic emotional processes tend to dominate.

Emotion and deliberative reasoning in moral judgment

According to an influential dual-process model, a moral judgment is the outcome of a rapid, affect-laden process and a slower, deliberative process. If these outputs conflict, decision time increases in order to resolve the conflict. Violations of deontological principles proscribing the use of personal force to inflict intentional harm are presumed to elicit negative affect, which biases judgments early in the decision-making process. This model was tested in three experiments. Moral dilemmas were classified using (a) decision time and consensus as measures of system conflict and (b) the aforementioned deontological criteria. In Experiment 1, decision time was either unlimited or reduced. The dilemmas asked whether it was appropriate to take a morally questionable action to produce a "greater good" outcome. Limiting decision time reduced the proportion of utilitarian ("yes") decisions, but contrary to the model's predictions, (a) vignettes that involved more deontological violations logged faster decision times, and (b) violation of deontological principles was not predictive of decisional conflict profiles. Experiment 2 ruled out the possibility that time pressure simply makes people more likely to say "no." Participants made a first decision under time constraints and a second decision under no time constraints. One group was asked whether it was appropriate to take the morally questionable action, while a second group was asked whether it was appropriate to refuse to take the action. The results replicated those of Experiment 1 regardless of whether "yes" or "no" constituted a utilitarian decision. In Experiment 3, participants rated the pleasantness of positive visual stimuli prior to making a decision. Contrary to the model's predictions, the number of deontological decisions increased in the positive affect rating group compared to a group that engaged in a cognitive task or a control group that engaged in neither task.
These results are consistent with the view that early moral judgments are influenced by affect. But they are inconsistent with the views that (a) violations of deontological principles are predictive of differences in early, affect-based judgment and that (b) engaging in tasks that are incompatible with the negative emotional responses elicited by such violations diminishes their impact.

Emotion’s influence on judgment-formation: Breaking down the concept of moral intuition

Philosophical Psychology, 2019

Recent discussions in the field of moral cognition suggest that the relationship between emotion and judgment formation can be described in three separate ways: firstly, emotion narrows our attention to a few salient features of a situation to which we ascribe significance; secondly, it opens up the possibility for critical reflection on the moral ends we've set for ourselves; and thirdly, it has no significant effect on moral cognition. It has not yet been explained why emotion plays these apparently contradictory roles in moral decision-making, nor what conditions are necessary to prompt an individual to deliberate on how to realize her moral ends. I argue that emotion is one of multiple, potentially conflicting impulses that occur simultaneously in any given experience of moral intuition. If there is conflict between the action toward which emotion draws us and impulses toward other types of behavior, we are prompted to pause and reflect on how to act, whereas if there is no such conflict, extensive deliberation is deemed unnecessary and is forgone. I explore the implications of my internal conflict model for moral cognition by proposing an explanation for Greene's experimental results which differs from that offered by Greene himself.

Moral Judgment and Decision Making

1. Causal Models: The Representational Infrastructure for Moral Judgment (Steven A. Sloman, Philip Fernbach, & Scott Ewing—Brown University)
2. Moral Grammar and Intuitive Jurisprudence: A Formal Model of Unconscious Moral and Legal Knowledge (John Mikhail—Georgetown University)
3. Law, Psychology & Morality (Kenworthey Bilz & Janice Nadler—Northwestern University)
4. Protected Values and Omission Bias as Deontological Judgments (Jonathan Baron & Ilana Ritov—University of Pennsylvania & Hebrew University of Jerusalem)
5. Attending to Moral Values (Rumen Iliev, Sonya Sachdeva, Daniel M. Bartels, Craig M. Joseph, Satoru Suzuki, & Douglas L. Medin—Northwestern University & University of Chicago)
6. Instrumental Reasoning Over Sacred Values: An Indonesian Case Study (Jeremy Ginges & Scott Atran—New School for Social Research & University of Michigan)
7. Development and Dual Processes in Moral Reasoning: A Fuzzy-Trace Theory Approach (Valerie F. Reyna & Wanda Casillas—Cornell University)
8. Moral Identity, Moral Functioning, and the Development of Moral Character (Daniel Lapsley & Darcia Narvaez—University of Notre Dame)
9. "Fools Rush In": A JDM Perspective on the Role of Emotions in Decisions, Moral and Otherwise (Terry Connolly & David Hardman—University of Arizona & London Metropolitan University)
10. Motivated Moral Reasoning (Peter Ditto, David Pizarro, & David Tannenbaum—University of California-Irvine & Cornell University)
11. In the Mind of the Perceiver: Psychological Implications of Moral Conviction (Christopher W. Baumann & Linda J. Skitka—University of Washington & University of Illinois-Chicago)

Moral Reasoning and Emotion

The Routledge Handbook of Moral Epistemology, 2018

This chapter discusses contemporary scientific research on the role of reason and emotion in moral judgment. The literature suggests that moral judgment is influenced by both reasoning and emotion separately, but there is also emerging evidence of the interaction between the two. While there are clear implications for the rationalism-sentimentalism debate, we conclude that important questions remain open about how central emotion is to moral judgment. We also suggest ways in which moral philosophy is not only guided by empirical research but continues to guide it.

Moral reasoning: Hints and allegations

Topics in Cognitive Science, 2010

Recent research in moral psychology highlights the role of emotion and intuition in moral judgment. In the wake of these findings, the role and significance of moral reasoning remain uncertain. In this article, we distinguish among different kinds of moral reasoning and review evidence suggesting that at least some kinds of moral reasoning play significant roles in moral judgment, including roles in abandoning moral intuitions in the absence of justifying reasons, applying both deontological and utilitarian moral principles, and counteracting automatic tendencies toward bias that would otherwise dominate behavior. We argue that little is known about the psychology of moral reasoning and that it may yet prove to be a potent social force.