Moral judgment and emotions

Cognitivism about moral judgement

Oxford Studies in Metaethics, vol. 10, forthcoming

What is it to make a moral judgement? There are two standard views, cognitivist and non-cognitivist, plus hybrid options according to which moral judgements have both cognitivist and non-cognitivist components. In this context, cognitivism is typically defined as the theory that moral judgements are beliefs. This paper is not a defence of cognitivism (or of non-cognitivism). Rather, the aim is to get clearer about what it means for a moral judgement to be a belief. I begin by setting out a tension between three claims: cognitivism, an account of belief, and an account of moral judgement. I think all three claims are plausible, but I will not be defending them here. Rather, my interest is in whether they can be reconciled. In order to do so, I distinguish between broad and narrow belief. I give an example of a mental state that is a broad belief but not a narrow belief, a moral “ulief”. Finally, I set out new definitions of cognitivism and non-cognitivism and draw out some further implications of the argument.

The emotional basis of moral judgments

Philosophical Explorations, 2006

Recent work in cognitive science provides overwhelming evidence for a link between emotion and moral judgment. I review findings from psychology, cognitive neuroscience, and research on psychopathology and conclude that emotions are not merely correlated with moral judgments but are also, in some sense, both necessary and sufficient for them. I then use these findings, along with some anthropological observations, to support several philosophical theories. First, I argue that sentimentalism is true: to judge that something is wrong is to have a sentiment of disapprobation towards it. Second, I argue that moral facts are response-dependent: the bad just is that which causes disapprobation in a community of moralizers. Third, I argue that a form of motivational internalism is true: ordinary moral judgments are intrinsically motivating, and all non-motivating moral judgments are parasitic on these.

Moral Judgment and Decision Making

1. Causal Models: The Representational Infrastructure for Moral Judgment (Steven A. Sloman, Philip Fernbach, & Scott Ewing—Brown University)
2. Moral Grammar and Intuitive Jurisprudence: A Formal Model of Unconscious Moral and Legal Knowledge (John Mikhail—Georgetown University)
3. Law, Psychology & Morality (Kenworthey Bilz & Janice Nadler—Northwestern University)
4. Protected Values and Omission Bias as Deontological Judgments (Jonathan Baron & Ilana Ritov—University of Pennsylvania & Hebrew University of Jerusalem)
5. Attending to Moral Values (Rumen Iliev, Sonya Sachdeva, Daniel M. Bartels, Craig M. Joseph, Satoru Suzuki, & Douglas L. Medin—Northwestern University & University of Chicago)
6. Instrumental Reasoning Over Sacred Values: An Indonesian Case Study (Jeremy Ginges & Scott Atran—New School for Social Research & University of Michigan)
7. Development and Dual Processes in Moral Reasoning: A Fuzzy-Trace Theory Approach (Valerie F. Reyna & Wanda Casillas—Cornell University)
8. Moral Identity, Moral Functioning, and the Development of Moral Character (Daniel Lapsley & Darcia Narvaez—University of Notre Dame)
9. “Fools Rush In”: A JDM Perspective on the Role of Emotions in Decisions, Moral and Otherwise (Terry Connolly & David Hardman—University of Arizona & London Metropolitan University)
10. Motivated Moral Reasoning (Peter Ditto, David Pizarro, & David Tannenbaum—University of California-Irvine & Cornell University)
11. In the Mind of the Perceiver: Psychological Implications of Moral Conviction (Christopher W. Baumann & Linda J. Skitka—University of Washington & University of Illinois-Chicago)

The Limits of Emotion in Moral Judgment

The Many Moral Rationalisms (eds. K. Jones & F. Schroeter), 2018

I argue that our best science supports the rationalist idea that, independent of reasoning, emotions aren't integral to moral judgment. There's ample evidence that ordinary moral cognition often involves conscious and unconscious reasoning about an action's outcomes and the agent's role in bringing them about. Emotions can aid in moral reasoning by, for example, drawing one's attention to such information. However, there is no compelling evidence for the decidedly sentimentalist claim that mere feelings are causally necessary or sufficient for making a moral judgment or for treating norms as distinctively moral. I conclude that, even if moral cognition is largely driven by automatic intuitions, these shouldn't be mistaken for emotions or their non-cognitive components. Non-cognitive elements in our psychology may be required for normal moral development and motivation but not necessarily for mature moral judgment.

Moral Judgments as Shared Intentions

The Ethics of Wilfrid Sellars, 2018

Chapter 3: Moral Judgments as Shared Intentions
3.1. Why Intentions?
3.1.1. Moral judgments, evidence and motivation
David Solomon situates Sellars's ethical project within "a common view of the problematic of ethical theory which is largely shared by Anglo-American moral philosophers from Moore to Hare" (1977, 151). The chief difficulty facing classical metaethics (CM, as Solomon calls it) is a problem that should prove instantly familiar to anyone working in contemporary metaethics. The problem is how to reconcile two different aspects of moral judgment. First, we present evidence for our moral judgments, which entails (a) that moral judgments are belief-like or cognitive, and (b) that moral judgments can be involved in reasoning and logical …

How (and where) does moral judgment work?

Why do we care so strongly about what other people do, even when their actions won't affect us? And how do we decide that someone else has done something wrong? These questions are at the heart of moral psychology, and psychologists' answers to these questions have changed with intellectual fashion. Historically, psychologists have disagreed about whether moral judgments are primarily products of emotional and non-rational processes (such as Freudian internalization or behaviorist reinforcement) or of reasoning and 'higher' cognition (as in Piaget's and Kohlberg's post-conventional reasoning). Recently, however, findings from several areas of cognitive neuroscience have begun to converge on an answer: emotions and reasoning both matter, but automatic emotional processes tend to dominate.

Affective Intuition and Rule Deployment: The Dénouement of Moral Judgment

What faculty of our mind is best suited to endow us with all that is required to carry forth our moral enterprise? In other words, what are the cognitive resources that subserve the moral mind? This is a core empirical question, and one that has attracted the investigative inquisitiveness of moral psychologists. But its philosophical roots can be traced back at least as far as Plato, whose tripartite theory of the soul held that the rational element is like a charioteer who holds sway over two horses: the manageable one, the spirited element, and the unwieldy one, the vegetative, emotionally unruly element of the soul. Thus began the reason-emotion debate, which percolates into the moral beliefs we inculcate and the judgments we pronounce. The mainstay of this short paper is a comparative analysis of two recently emerging theoretical frameworks claimed to underlie moral judgment: one, espoused by moral psychologist Jonathan Haidt, claims that moral judgment is primarily elicited unconsciously by affect-driven intuition; the other, put forward by philosopher Shaun Nichols, highlights conscious deliberation about moral rules. After critical analysis of both views, this work suggests that a syncretic approach to theorizing about the aetiology of moral judgment may provide some silver lining. Keywords: moral judgment, moral dilemma, moral dumbfounding, moral intuition, moral rule.

The Intelligibility of Moral Intransigence: A Dilemma for Cognitivism about Moral Judgment

Analysis, 2018

Many have argued that various features of moral disagreements create problems for cognitivism about moral judgment, but these arguments have been shown to fail. In this paper I articulate a new problem for cognitivism that derives from features of our responses to moral disagreement. I argue that cognitivism entails that one of the following two claims is false: (1) a mental state is a belief only if it tracks changes in perceived evidence; (2) it is intelligible to make moral judgments that do not track changes in perceived evidence. I explain that there is a good case that (1) holds, and so we should prefer theories that do not entail its negation. And I argue that the seeming intelligibility of entirely intransigent responses to peer disagreement about moral issues shows that there is a good case that (2) holds.