Moral dilemmas and moral principles: When emotion and cognition unite

This article was downloaded by: [VUL Vanderbilt University] On: 29 April 2013, At: 01:41
Publisher: Routledge
Informa Ltd Registered in England and Wales Registered Number: 1072954 Registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK

Cognition & Emotion
Publication details, including instructions for authors and subscription information: http://www.tandfonline.com/loi/pcem20

Moral dilemmas and moral principles: When emotion and cognition unite
Andrea Manfrinati a, Lorella Lotto a, Michela Sarlo b, Daniela Palomba b & Rino Rumiati a
a Department of Developmental Psychology and Socialisation (DPSS), University of Padova, Padova, Italy
b Department of General Psychology (DPG), University of Padova, Padova, Italy
Version of record first published: 24 Apr 2013.

To cite this article: Andrea Manfrinati, Lorella Lotto, Michela Sarlo, Daniela Palomba & Rino Rumiati (2013): Moral dilemmas and moral principles: When emotion and cognition unite, Cognition & Emotion, DOI: 10.1080/02699931.2013.785388

To link to this article: http://dx.doi.org/10.1080/02699931.2013.785388

Full terms and conditions of use: http://www.tandfonline.com/page/terms-and-conditions

This article may be used for research, teaching, and private study purposes. Any substantial or systematic reproduction, redistribution, reselling, loan, sub-licensing, systematic supply, or distribution in any form to anyone is expressly forbidden. The publisher does not give any warranty express or implied or make any representation that the contents will be complete or accurate or up to date. The accuracy of any instructions, formulae, and drug doses should be independently verified with primary sources. The publisher shall not be liable for any loss, actions, claims, proceedings, demand, or costs or damages whatsoever or howsoever caused arising directly or indirectly in connection with or arising out of the use of this material.



Moral dilemmas and moral principles: When emotion and cognition unite

Andrea Manfrinati1, Lorella Lotto1, Michela Sarlo2, Daniela Palomba2, and Rino Rumiati1

1Department of Developmental Psychology and Socialisation (DPSS), University of Padova, Padova, Italy; 2Department of General Psychology (DPG), University of Padova, Padova, Italy

Traditional studies on moral judgement used resolutions of moral dilemmas that were framed in terms of acceptability of the consequentialist action promoting a greater good, thus overlooking the deontological implications (choices cannot be justified by their consequences). Recently, some authors have suggested a parallelism between automatic, unreflective emotional responses and deontological moral judgements. In this study, we developed a novel experimental paradigm in which participants were required to choose between two resolutions of a moral dilemma (consequentialist and deontological). To assess whether emotions are engaged in each of the two resolutions, we asked participants to evaluate their emotional experience through ratings of valence and arousal. Results showed that emotion is involved not only in deontological but also in consequentialist resolutions. Moreover, response times pointed out a different interplay between emotion and cognition in determining a conflict in the dilemma's resolution. In particular, when people were faced with trolley-like dilemmas we found that decisions leading to deontological resolutions were slower than decisions leading to consequentialist resolutions. We propose that this finding reflects the special (but not accepted) permission provided by the doctrine of the double effect for incidentally causing death for the sake of a good end.

Keywords: Moral dilemmas; Judgement and decision making; Emotions; Intentionality; Doctrine of the double effect.

The fMRI studies that Greene and colleagues have conducted on moral judgement (Greene, Nystrom, Engell, Darley, & Cohen, 2004; Greene, Sommerville, Nystrom, Darley, & Cohen, 2001) deserve the considerable attention they have received from the community of ethicists and psychologists, because they have opened a rich discussion on the neural activity associated with different types of moral reasoning. The main point of Greene's studies was the identification of two different brain processes, each associated with a particular type of moral thinking. In the fMRI studies (Greene et al., 2001, 2004), participants were presented with a set of moral dilemmas that was classified as personal and another set that was classified as impersonal. Moral dilemmas were considered personal if the moral violation met three criteria: first, the violation must be likely to cause serious bodily harm or death; second, this harm must happen to a person or group of persons; and third, the harm must not result from the deflection of an existing threat onto another person (the footbridge dilemma meets these criteria). Otherwise, if dilemmas caused non-serious physical harm to a person or a set of people, or only required diverting some pre-existing threat onto a different party rather than initiating the harm oneself, the dilemmas were classified as impersonal (the trolley dilemma meets these criteria). The results revealed that when participants considered personal dilemmas, they showed increased activity in brain areas associated with emotion and social cognition and decreased activity in areas associated with working memory and abstract reasoning, compared with when they considered impersonal dilemmas or non-moral dilemmas (i.e., dilemmas where there are no moral implications for the agent). On the contrary, impersonal dilemmas elicited less emotional activity and more activity in the cognitive areas of the brain, as compared to personal dilemmas and non-moral dilemmas.

Correspondence should be addressed to: Andrea Manfrinati, Department of Developmental Psychology and Socialisation (DPSS), University of Padova, Via Venezia 8, I-35131, Padova, Italy. E-mail: [email protected]

COGNITION AND EMOTION, 2013
http://dx.doi.org/10.1080/02699931.2013.785388
© 2013 Taylor & Francis

The behavioural and neuroimaging results allowed Greene et al. (2001, 2004) to propose a dual process theory in which two different patterns of neural activity are involved in moral judgement: a fast, unconscious, and effortless emotional system, and a slow, conscious, and effortful cognitive system. In this view, the emotional system refers to a class of processes that are valenced, quick, and automatic, though not necessarily conscious. The cognitive system, in contrast, refers to controlled processes that are important for reasoning, planning, manipulating information in working memory, controlling impulses or, more generally, to higher executive functions (Greene, 2008).

On these bases, Greene (2008) suggested a parallelism between automatic, unreflective emotional responses and deontological moral judgements. More precisely, Greene claimed that the prepotent negative emotional response that drives people to disapprove the personally harmful actions proposed in cases like the footbridge dilemma is characteristic of deontology but not of consequentialism, and that the consequentialist judgements are those most closely associated with higher cognitive functions, such as executive control (Greene, 2008). From a consequentialist point of view, actions are wrong because of their harmful consequences, but the harm could be acceptable for the sake of promoting a greater good. In other terms, whether an act is morally right depends only on the consequences of that act or of something related to that act. On the contrary, from a deontological point of view, it is wrong to perform harmful actions even though performing them will maximise good consequences, i.e., some choices are morally forbidden no matter how morally good their consequences are. From this perspective, what makes a choice right is its conformity with a moral norm: the Right has priority over the Good. Greene (2008) recognised deontology as a natural cognitive expression of our deepest moral emotions. He believed that the deontological theory is a post hoc rationalisation of emotional reactions: "essentially, [deontology] is an attempt to produce rational justifications for emotionally driven moral judgements, and not an attempt to reach moral conclusions on the basis of moral reasoning" (Greene, 2008, p. 39).

In our view, Greene's account of a deontological moral theory is controversial, because it is unclear whether his argument counts against deontology on the whole or only against some aspects of a deontological theory (see also Dean, 2010). Furthermore, even though there are many studies targeting the non-consequentialist elements of moral behaviour (Baron, 1994; Damasio, 1994; Haidt, 2001; Shweder & Haidt, 1993), in the traditional studies on moral judgement that have used moral dilemmas participants were asked to evaluate the moral acceptability of a particular situation that was always framed as the consequentialist resolution of the moral dilemma, overlooking the deontological implications. Indeed, when people evaluated the situation as inappropriate or morally unacceptable (especially in the personal dilemmas), the experimenter inferred that people were endorsing a deontological ethic and that they had chosen a deontological resolution of the moral dilemma. Here, we would like to turn to some evidence that this conclusion is too strong. First, if people judge as inappropriate or morally unacceptable a state of affairs that subtends a consequentialist ethic, we cannot infer that people would choose the deontological resolution of the moral dilemma or that they considered a deontological perspective acceptable. For example, people could consider pushing one person onto train tracks in order to save the five workmen in the footbridge dilemma as inappropriate, because they would evaluate the trade-off 1 versus 5 as "unsatisfactory" from a utilitarian point of view, and not because of a deontological ethic. In fact, it is possible that the same participants could evaluate the trade-off 1 versus 1,000 as satisfactory and, accordingly, the consequentialist resolution as appropriate (Nichols & Mallon, 2006).

Second, although Greene (2008) claims that deontology is more emotional while consequentialism is more cognitive, it is plausible to assume that deontological judgements, in addition to emotional responses, also implicate a set of cognitive processes, such as representations of rules and cost-benefit analyses (Nichols & Mallon, 2006). As stated by Dean (2010), not all deontological duties are associated with strong emotional reactions. Indeed, Borg, Hynes, Van Horn, Grafton, and Sinnott-Armstrong (2006) found that the activation of the emotion-related areas of the paralimbic system was not associated with a higher frequency of characteristically deontological resolutions, and that some deontological responses were associated with both cognitive and emotion-related neural activity.

In the most recent formulation of his theory (Cushman, Young, & Greene, 2010), Greene recognises that the association between emotion and deontology, on the one hand, and consequentialism and controlled cognition, on the other, is overly simple. In order to capture the difference between deontological intuitions and consequentialist reasoning, Cushman et al. (2010) have proposed a distinction between alarm-bell emotions and currency emotions. The former are designed to bypass reasoning and are aimed to dominate the decision rather than to merely influence it: such an emotional response is like an alarm bell because it makes a clear demand that is extremely difficult to ignore. The currency emotions are designed to participate in the process of practical reasoning and are aimed to provide a more "cognitive" way to decide, perhaps contributing to a cost-benefit analysis. The "currency versus alarm-bell" proposal seems an elaboration of what was asserted in Greene (2008) and, although this is just one of the hypotheses considered by Cushman et al. (2010), the authors are committed to the view that these emotional processes are fundamentally different and that the emotions underlying deontological judgements are stronger. However, as far as we know, there are no studies investigating the cognitive/emotional responses associated with consequentialist as compared with deontological judgements.

In the present study, we developed a novel experimental paradigm in which participants were explicitly required to choose between two possible resolutions of a moral dilemma, one deontological and the other consequentialist. As another novel feature of the present study, we asked participants to rate their emotional experience during moral decision making. Past studies have inferred emotion from the activation of brain areas commonly associated with emotional processing (Borg et al., 2006; Greene et al., 2001, 2004) or by using an a priori criterion for considering some dilemmas as "putatively more emotional" than others (Greene et al., 2001, 2004). In the present study, we assessed self-reported emotional experience by collecting valence and arousal ratings throughout the process of the dilemma's resolution. According to the circumplex model of affect (Posner, Russell, & Peterson, 2005; Russell, 1980; Russell, Weiss, & Mendelsohn, 1989), all emotional experiences derive from a combination of these two basic underlying dimensions. Valence refers to the hedonic tone of the experienced emotion, which may range from highly unpleasant to extremely pleasant. Arousal reflects a subjective state ranging from activated to deactivated, thus referring to a sense of mobilisation or energy. This conceptual framework has been adopted in a variety of research contexts, with increasing evidence across different domains of psychology, psychophysiology and neuroscience supporting the hypothesis that "core" affect, as defined by the degrees of valence and arousal, is a basic component of emotional experience and responding (Duncan & Barrett, 2007; Lang, Greenwald, Bradley, & Hamm, 1993; Russell, 2003). In other words, "core affect is what makes any event 'hot' (i.e., emotional)" (Russell, 2003, p. 148). On these bases, we believed that such an approach to the study of emotion would be particularly suited for capturing the subjective affective state associated with the emotional responses invoked by Greene and colleagues in the context of moral judgement, as no discrete categories of emotional states have been hypothesised. In this way, we could assess whether and to what extent conscious emotion is engaged during the process of decision that leads to the choice of one of the two resolutions (consequentialist or deontological).

Furthermore, as in previous studies on moral judgement, we measured participants' response times to dilemmas. The new paradigm allowed us to collect response times for both consequentialist and deontological resolutions, thus clarifying the interplay between cognitive and emotional processes. In the literature, slower response times have been interpreted as reflecting a conflict between negative emotional responses and cognitive processes (Greene et al., 2001, 2004). Indeed, Greene et al. (2001) showed that judgements approving of "personal" harmful actions took longer than judgements disapproving of those actions, arguing that the utilitarian resolution requires the engagement of cognitive control in order to inhibit the intuitive emotional response. However, McGuire, Langdon, Coltheart, and Mackenzie (2009) have cast doubt on this interpretation based on their reanalysis of Greene et al.'s (2001) data, showing that the effect reported by Greene et al. (2001) was an artefact, driven only by a few particular dilemmas. For these reasons, McGuire et al. (2009) recommended the use of a more rigorously controlled set of stimuli.

In light of this suggestion, we designed a new set of dilemmas in which the role of intentionality was highlighted. The ascription of intentionality to an agent and its relation with moral judgement has been extensively debated in ethics and moral psychology (Borg et al., 2006; Cushman, Young, & Hauser, 2006; Hauser, Cushman, Young, Jin, & Mikhail, 2007; Mikhail, 2002; Sinnott-Armstrong, Mallon, McCoy, & Hull, 2008). One of the traditional issues is whether moral judgements of an act are affected by whether the act involves intentional harm rather than only foreseen harm, and whether it is permissible to bring about as a merely foreseen side effect a harmful event that it would be impermissible to bring about intentionally. The doctrine of double effect (DDE) has often been invoked to explain these issues (Aquinas, 1265–1272/1947). In the trolley versus footbridge dilemmas, the DDE integrates a "permission" for incidentally causing death for the sake of a good end (when it occurs as a side effect of one's pursuit of that end) with a prohibition of instrumentally causing death for the sake of a good end (when it occurs as a part of one's means to pursue that end). Many experimental studies have shown that the DDE affects people's moral judgements (Borg et al., 2006; Cushman et al., 2006; Greene et al., 2009; Hauser et al., 2007; Mikhail, 2002; but see also Moore, Clark, & Kane, 2008; Scanlon, 2008; Waldmann & Dieterich, 2007, for a different perspective).

To sum up, the aims of the present study were twofold. First, we aimed to investigate to what extent emotional engagement is involved in consequentialist and deontological resolutions. By measuring emotional experience through its core independent dimensions (valence and arousal), it was possible for us to carefully explore the subjective feeling side of the emotional state associated with decision making leading to both types of resolutions. In footbridge-like dilemmas, we expected that decisions leading to consequentialist resolutions would be rated as more unpleasant and arousing than those leading to deontological resolutions. Indeed, on the basis of Greene's (2008) and Cushman et al.'s (2010) proposals, the consequentialist resolutions have to override the intuitive emotional response that says "No! This action is wrong!", thus evoking higher emotional responses associated with decisions leading to consequentialist judgements in comparison with deontological ones. By contrast, we expected no differences between consequentialist and deontological resolutions in trolley-like dilemmas. In these cases, in fact, the DDE allows people to cause the death of a human being as a side effect of promoting some good end. Therefore, whatever the decision, emotional responses associated with either consequentialist or deontological resolutions should not differ. Finally, we expected higher emotional responses associated with footbridge-like dilemmas in comparison with trolley-like dilemmas.

Our second aim was to clarify the interplay between emotion and cognition in determining a conflict in the dilemmas' resolutions, as indexed by the participants' slower response times. According to Greene's (2008) perspective, when decision making is characterised by a higher emotional engagement (as in footbridge-like dilemmas), slower response times would be expected for consequentialist resolutions as compared to deontological resolutions. Indeed, if Greene's conclusion is correct, utilitarian judgements would be driven by controlled cognitive processes, the engagement of which would result in longer response times. As for trolley-like dilemmas, although Greene (2008) stated that there are no reasons to predict differences in response times between deontological and consequentialist choices because there is no emotional response to override in such cases, we hypothesised slower response times for deontological resolutions as a result of the interfering effect of the DDE. In cases like the trolley dilemma, in fact, it is permissible to cause serious harm to a human being as a side effect, but in those cases in which people do not avail themselves of this special permission, a slowdown in response times should be observed when compared with consequentialist judgements. By collecting response times for both consequentialist and deontological resolutions, it was also possible to clarify the role of the DDE within a deontological perspective. Indeed, previous studies (Borg et al., 2006; Cushman et al., 2006; Greene et al., 2009; Hauser et al., 2007; Mikhail, 2002) have not sufficiently considered that the DDE concerns only a deontological perspective and has nothing to do with consequentialism. In fact, if the permissibility of an action depended only on the consequences of the action itself (as in consequentialism), then the distinction that grounds double effect would not have the moral significance claimed for it. For these reasons, we hypothesised no significant difference in response times between incidental and instrumental dilemmas for consequentialist resolutions, because consequentialism considers only the consequences and does not take into account the way in which these consequences are achieved.

METHOD

Subjects

Thirty-six undergraduates (16 males) were recruited from the University of Padova. Participants were aged 19–28 years (Mage = 23.7 years, SD = 1.9). The study was approved by the local Ethics Committee and all volunteers gave written consent prior to participation.

Design, materials and procedure

We created dilemmas based on the "inner" structure exemplified by the trolley and footbridge scenarios that have been discussed in contemporary moral philosophy (Foot, 1967; Thomson, 1986) and in the well-known psychological studies on moral judgement (Greene et al., 2001, 2009). In this study, we adapted some dilemmas from Greene et al. (2001) and Cushman et al. (2006), but for the most part, we invented new dilemmas in order to overcome some confounds that affected the original material used by Greene and colleagues (e.g., several dilemmas were "non-dilemmas" because there were no conflicts between two actions or two obligations). All the scenarios captured the distinction between instrumental dilemmas and incidental dilemmas. In particular, the instrumental dilemmas required the decision to kill a person as a means to save more people, whereas in the incidental dilemmas, killing one person to save more people was a foreseen but unintended consequence of the action (for examples, see Table 1).

Participants were presented with 60 experimental dilemmas: 30 instrumental dilemmas and 30 incidental dilemmas. The subject was the protagonist in all dilemmas. Instrumental and incidental dilemmas were matched for numerical consequences (i.e., the number of people to save or let die) and self-involvement (i.e., to save or let oneself die besides other people). Furthermore, in order to avoid automaticity in responding to conceptually similar issues, participants were presented with 12 filler dilemmas, which were similar to the experimental dilemmas except that they involved no deaths. The mean number of words and number of text characters were fully balanced between instrumental and incidental dilemmas (see Table 1).

Table 1. Text of sample dilemmas (text translated from Italian)

Condition: Incidental (self-involvement)
Scenario: You are the commander of a group of five astronauts in a space station orbiting the earth. Because of a breakdown, you have discovered a serious loss of pressurisation which in a short time will lead to the explosion of the space station. The emergency system is broken down and cannot be repaired immediately.
Resolution A: You let the loss of pressurisation lead in a short time to the explosion of the entire station, causing all six to die.
Resolution B: You activate the bulkheads manually to isolate the depressurisation to just one cabin. You know that there is one astronaut there, and he will suffocate from a lack of oxygen, but you and the other four will be saved.

Condition: Incidental (no self-involvement)
Scenario: You are the late-night watchman of a hospital. Due to an accident in the laboratory, there are deadly fumes flowing through the ventilation system into a room containing five patients. In another room, there is only one patient.
Resolution A: You let the deadly fumes flow into the room containing the five patients, which causes their deaths.
Resolution B: You hit a switch that permits the fumes to bypass the room with five patients. You know that the fumes will enter the room with the single patient, causing his death, but the other five patients will be saved.

Condition: Instrumental (self-involvement)
Scenario: You are the pilot of a helicopter taking four people on a sightseeing trip in the French Alps, at an altitude of about 2,000 metres. At a certain point, the helicopter loses altitude and no longer responds to the commands. You send out a mayday signal and try various manoeuvres, but the helicopter continues to fall and risks crashing into a rock face.
Resolution A: Not recovering altitude due to the failure of the controls, you let the helicopter fall. You and the four people will die on impact.
Resolution B: To gain altitude, you push the person next to you out of the helicopter. You know that this person will plummet to the ground and will die, but you and the other three will be saved.

Condition: Instrumental (no self-involvement)
Scenario: You are a young doctor working in the emergency surgery ward of your local hospital. Four of the patients under your care in your ward are dying from grave injury to vital organs as a result of a serious car accident. You have another patient in the ward, who is now in good health and ready to be sent home.
Resolution A: Lacking a list of compatible donors, you let the four patients die.
Resolution B: You anaesthetise the patient in good health and remove his vital organs to transplant them in the patients who are dying. You know that he will die, but the other four can be saved.

Notes: Mean number of words and number of text characters were fully balanced between Instrumental and Incidental dilemmas: scenarios (Mwords = 59.13 and 59.13), t(58) = 0.00, p = 1.00; (Mcharacters = 352.10 and 352.57), t(58) = −0.09, p = .93; options A (Mwords = 19.33 and 19.13), t(58) = 0.89, p = .38; (Mcharacters = 115.17 and 113.17), t(58) = 0.98, p = .33; options B (Mwords = 30.87 and 30.87), t(58) = 0.00, p = 1.00; (Mcharacters = 172.77 and 178.97), t(58) = −1.52, p = .13.


We tested all participants individually. Dilemmas were presented on a computer monitor and consisted of an introductory paragraph that appeared alone until subjects pressed the spacebar, which then revealed to them two successive slides with the two possible resolution sentences: a deontological (DEO) resolution of the dilemma (labelled with the letter A) and a consequentialist (CON) resolution of the dilemma (labelled with the letter B). After option B offset, the letters A and B were presented vertically aligned at the centre of the screen, separated by a fixation cross (decision slide). Participants' task was to choose one of the two resolutions for each moral dilemma by pressing one of the two keys (A or B) using the same hand. We recorded participants' response choices and response times at the onset of the decision slide (see Figure 1). In order to discriminate the reading time of the two options from the time of the subjects' decision, each slide remained on the screen for a fixed time determined by the length (in words) of the two options and by the average reading times of the two options recorded in a pilot study. Specifically, option A remained on the screen for 4,500 ms, and option B for 6,500 ms. Note that options A were always shorter than options B. After each response, we collected ratings of valence and arousal experienced during decision making by using two bipolar scales (ranging from 1 to 9) of the Self-Assessment Manikin (SAM; Lang, 1980). Participants were required to rate how they felt while they were deciding. Two practice dilemmas were completed before beginning the experimental trials. The order in which dilemmas were presented to each participant was randomised. After the experiment, participants completed a debriefing questionnaire and were informed about the hypotheses of the study.

Data analyses

Since we used many novel stimuli in the set of experimental dilemmas, we ran an item analysis (F2) in addition to a subject analysis (F1), as suggested by McGuire et al. (2009), in order to ensure that the results obtained in the subject analysis were generalisable to the populations of incidental and instrumental dilemmas under investigation.

Figure 1. Sequence of events in the experiment. Participants had to decide between options A and B by pressing the corresponding key during the presentation of the decision slide (in grey). SAM = Self-Assessment Manikin; ITI = inter-trial interval.

First, we analysed the distribution of deontological and consequentialist resolutions for the two types of dilemmas, incidental and instrumental. Next, for both the subject and item analyses, we ran three separate analyses of variance (ANOVAs) on response times (RTs), valence, and arousal ratings. Type of Resolution (deontological vs. consequentialist) and Type of Dilemma (incidental vs. instrumental) were within-subjects factors in the analysis by subjects (F1), while in the analysis by items (F2) Type of Resolution was a within-items factor and Type of Dilemma was a between-items factor. In the subject analysis, we considered only those subjects who gave at least one response in all cells of the experimental design (N = 34).
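The difference between the two analyses lies only in the unit over which trials are averaged before running the ANOVA: subjects for F1, items for F2. The aggregation step can be illustrated as follows (a minimal sketch with hypothetical trial records; the names and values are ours, not the study's data):

```python
from statistics import mean

# Hypothetical trial records: (subject, dilemma item, type of resolution, RT in ms).
trials = [
    ("s1", "d1", "DEO", 2900), ("s1", "d2", "CON", 2500),
    ("s2", "d1", "DEO", 3100), ("s2", "d2", "CON", 2600),
]

def aggregate(trials, unit_index):
    """Average RTs per analysis unit (0 = subject for F1, 1 = item for F2)
    within each type of resolution, yielding the cell means the ANOVA runs on."""
    cells = {}
    for record in trials:
        key = (record[unit_index], record[2])
        cells.setdefault(key, []).append(record[3])
    return {key: mean(rts) for key, rts in cells.items()}

by_subject = aggregate(trials, 0)  # input to the F1 (by-subjects) ANOVA
by_item = aggregate(trials, 1)     # input to the F2 (by-items) ANOVA
```

Each dilemma item belongs to exactly one dilemma type, which is why Type of Dilemma is a between-items factor in F2 while remaining within-subjects in F1.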

RESULTS

Distribution of deontological and consequentialist responses

As expected (see Figure 2), the incidental dilemmas elicited a significantly higher number of CON responses than DEO responses, χ²(1) = 379.259, p < .001 (860 vs. 220, respectively). On the contrary, in the instrumental dilemmas the number of DEO responses was higher than the number of CON responses, χ²(1) = 43.200, p < .001 (648 vs. 432, respectively).
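These chi-square values follow directly from the response counts: with two response categories and uniform expected frequencies (df = 1), a minimal goodness-of-fit computation in plain Python reproduces the reported statistics:

```python
# Goodness-of-fit chi-square with uniform expected frequencies (df = 1 for two
# response categories), applied to the response counts reported above.
def chi_square_uniform(counts):
    expected = sum(counts) / len(counts)
    return sum((observed - expected) ** 2 / expected for observed in counts)

incidental = [860, 220]    # CON vs. DEO responses
instrumental = [432, 648]  # CON vs. DEO responses

print(round(chi_square_uniform(incidental), 3))    # 379.259
print(round(chi_square_uniform(instrumental), 3))  # 43.2
```

Each participant contributed 30 responses per dilemma type (36 × 30 = 1,080 total), so the expected count per cell under a 50/50 split is 540.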

Response times

Type of Dilemma was significant, with deciding on incidental dilemmas slower than on instrumental dilemmas, F1(1, 33) = 5.451, p = .026, η² = .142; F2(1, 58) = 7.567, p = .008, η² = .115. Type of Resolution was significant only in the item analysis, F1(1, 33) = 0.693, p = .411, η² = .021; F2(1, 58) = 6.261, p = .015, η² = .097, with decision times for deontological resolutions slower than for consequentialist resolutions. The analyses also showed a significant interaction between the two factors, F1(1, 33) = 8.283, p = .007, η² = .201; F2(1, 58) = 16.818, p < .001, η² = .225 (see Figure 3). Response times to incidental dilemmas were slower when participants chose the deontological resolution than when they chose the consequentialist resolution (Ms = 2,978 ms vs. 2,574 ms, respectively; Tukey

Figure 2. Distribution of response choices as a function of type of resolution (consequentialist/CON vs. deontological/DEO) and type of dilemma (instrumental vs. incidental).


post hoc, p = .05). In contrast, instrumental dilemmas yielded no significant difference in RTs between deontological and consequentialist resolutions (Ms = 2,676 ms vs. 2,460 ms, respectively). When focusing on type of resolution, we found that deontological resolutions were faster with the instrumental dilemmas than with the incidental dilemmas (Ms = 2,460 ms vs. 2,978 ms, respectively; Tukey post hoc, p = .009), whereas consequentialist resolutions yielded no significant difference in RTs between instrumental and incidental dilemmas (Ms = 2,676 ms vs. 2,574 ms, respectively).

Valence and arousal

The analysis of valence showed a significant effect of Type of Dilemma, F1(1, 33) = 11.355, p = .002, η² = .256; F2(1, 58) = 8.914, p = .004, η² = .133, with decisions for the instrumental dilemmas rated as more unpleasant than decisions for the incidental dilemmas (2.32 vs. 2.53, respectively). Type of Resolution did not reach significance in either the subject or the item analysis (see Table 2), F1(1, 33) = 0.111, p = .741, η² = .003; F2(1, 58) = 0.459, p = .501, η² = .008, with valence ratings not differing significantly between consequentialist and deontological resolutions (2.41 vs. 2.44, respectively). It is worth noting that all mean values were clearly at the unpleasantness end of the scale (i.e., they were largely lower than the neutral midpoint, ranging from 4.5 to 5.5; e.g., Bradley & Lang, 2000).

The analysis of arousal showed a significant effect of Type of Resolution only in the item analysis (see Table 2), F1(1, 33) = 1.666, p = .206, η² = .048; F2(1, 58) = 5.253, p = .026, η² = .083, with decisions for consequentialist resolutions rated as more arousing than decisions for deontological resolutions (6.20 vs. 5.95, respectively).

Figure 3. Mean response times (in milliseconds) for instrumental and incidental dilemmas as a function of type of resolution (consequentialist/CON vs. deontological/DEO).

Table 2. Means (and standard errors) of valence and arousal ratings for deontological and consequentialist resolutions

                   Deontological     Consequentialist
                   resolutions       resolutions
Valence ratings    2.44 (0.22)       2.41 (0.22)
Arousal ratings    5.95 (0.10)       6.20 (0.06)


DISCUSSION

The aim of the present study was to introduce a different experimental paradigm in which participants had the opportunity to choose between a consequentialist and a deontological resolution of a moral dilemma, in order to investigate to what extent emotional engagement is involved in both resolutions and to clarify the interplay between emotion and cognition in moral conflicts.

At a more general level, our results showed that instrumental dilemmas elicited a lower number of consequentialist responses than incidental dilemmas, indicating that participants found it less permissible to kill one individual as an intended means to save others than as a foreseen but unintended consequence of saving others. Conversely, instrumental dilemmas elicited a higher number of deontological responses than incidental dilemmas. Instrumental and incidental dilemmas differ in the agent's intention, which allows people to evaluate the causes and consequences of their actions. Such intentions define what people set out to achieve through their actions, and their intended ends and intended means are the factors that principally define moral actions (Mikhail, 2002).

At a more specific level, the results of the present study generated two central conclusions: (1) emotion, as measured by its core affective feelings, is involved in the decision making leading not only to the deontological but also to the consequentialist resolutions of moral dilemmas; and (2) the results on response times pointed to a different interplay between emotion and cognition in determining a conflict in the dilemma's resolution.

Regarding the first conclusion, an interesting novel feature of the present study was the measurement of the self-reported emotional state experienced when deciding between the two dilemma resolutions. We assessed the emotional experience along the two dimensions of valence (pleasantness/unpleasantness) and arousal (activation/deactivation), which, according to the circumplex model of affect, represent the core affective features defining subjective emotional states, typically accounting for most of the variance in emotional judgements (Bradley & Lang, 1994; Lang et al., 1993; Russell, 2003). As for type of dilemma, our results showed that: (i) both instrumental and incidental dilemmas elicited negative affect; and (ii) decisions on instrumental dilemmas were rated as significantly more unpleasant than those on incidental dilemmas. We could surmise that participants perceived making decisions on instrumental dilemmas as particularly aversive because these dilemmas required the use of a person as a means to an end, in contradiction with the Kantian categorical imperative ("Act in such a way that you treat humanity, whether in your own person or in the person of any other, always at the same time as an end and never merely as a means to an end"; Kant, 1785/1959). On the contrary, we could infer that participants perceived decisions on incidental dilemmas as less unpleasant than decisions on instrumental dilemmas, probably because they perceived the sacrifice of a person as an unintended consequence of their actions. These results seem in accordance with Cushman et al.'s (2010) alarm bell hypothesis, in which the primary motivation not to "intentionally" harm is ultimately derived from the alarm bell emotional system that objects to actions like using an individual's death as a means to an end. These findings are also consistent with the evidence provided by Koenigs et al. (2007) on the rated emotional salience of personal versus impersonal moral dilemmas.

The comparison between the affective ratings obtained as a function of the two types of resolutions showed that both consequentialist and deontological resolutions were associated with negative affect experienced during decision making. Therefore, our predictions were partially confirmed. Unexpectedly, decisions leading to consequentialist resolutions were rated as unpleasant as those leading to deontological resolutions not only in incidental but also in instrumental dilemmas. We expected a different pattern of results for instrumental and incidental dilemmas, with decisions leading to consequentialist resolutions being rated as more unpleasant than those leading to deontological resolutions in the instrumental dilemmas. It could be argued that a floor


effect was operating for both types of dilemmas, given that both their mean values were markedly at the unpleasantness end of the scale. It could also be suggested that, whatever decision is taken, instrumental dilemmas elicit a strong negative emotional response, as in these dilemmas a person is intentionally used as a means to an end. Finally, independently of type of dilemma, decisions leading to consequentialist resolutions were rated as more arousing than those leading to deontological resolutions. As this finding was significant only in the item analysis, it should be viewed with caution. However, as the lack of significance in the subject analysis could be due to the lower number of observations as compared with the item analysis, we believe that this finding deserves attention.

Taken together, the results on valence and arousal provide further evidence on the role of emotions in moral judgement. Although Greene (2008) and Cushman et al. (2010) claim that deontological judgement is affective at its core while consequentialist judgement is essentially cognitive, they are also inclined to agree, in a Humean vein, that consequentialist moral judgements must have some affective components, and that the consequentialist weighing of costs and benefits is an emotional process (Greene, 2008). Our results provide empirical support for this claim, showing that decisions leading to consequentialist resolutions were rated as more arousing than those leading to deontological resolutions, with both kinds of decision rated as highly unpleasant. Thus, affective processes seem to play a relevant role in defining not only deontological but also consequentialist perspectives. Further investigations are needed to better characterise the emotional processes involved in both deontological and consequentialist decisions. Here, we are not in the position to provide an unambiguous explanation for these interesting results regarding emotional experience, or to suggest a causal relationship to resolution choices. However, we would like to suggest a possible interpretation. Given that moral dilemmas are formally undecidable by definition, an agent "forced" to choose between two possible resolutions perceives them

as highly conflicting and unpleasant, independently of which decision s/he will choose and independently of the trade-off between cost and benefit associated with the two resolutions (Brink, 1994; Macintyre, 1990). The agent experiences the formal impossibility of resolving the dilemma, and s/he recognises that the two obligations (not killing and helping others) are both "right", but s/he is forced to choose one while still considering the rejected alternative as valuable, thus reaffirming the impossibility of resolving the dilemma (Brink, 1994). On the other hand, when people choose a consequentialist resolution they might consider the consequences of an act as relevant in determining its morality, but they most likely feel that this resolution could undermine their moral integrity, thus evoking an unpleasant emotional feeling characterised by a sense of heightened mobilisation and energy. If consequentialist judgement engages controlled reasoning processes to construct a set of practical principles for our moral behaviour (Cushman et al., 2010), then the whole process might have a high emotional cost that yields a feeling of displeasure and high energy expenditure. As an alternative explanation, we might suggest that decisions leading to consequentialist judgements evoke higher emotional arousal than decisions leading to deontological judgements because in the former a person is required to actively intervene by performing a difficult action, whereas in the latter the events follow their natural course.

Regarding the second conclusion, our results may clarify the interaction between emotional and cognitive processes in determining a conflict in the dilemma's resolution. Greene (2008) and Cushman et al. (2010) claimed that people's moral judgements appear to be the result of at least two different kinds of psychological processes. In addition to brain imaging results, Greene et al. (2001) provided data on response times, showing a slowdown in response times when people considered as morally acceptable personal, intentionally harmful violations characterised by a higher emotional engagement (i.e., instrumental dilemmas). However, Greene et al.'s (2001) results on response times were based on the


YES/NO responses to a question that only investigated the moral appropriateness of a consequentialist resolution. They deduced the conflict between emotion and cognition in resolving the moral dilemma only from the response times of those people who chose, with a large deployment of cognitive resources and time, to sacrifice a person in order to save many other lives. By making the two dilemma resolutions explicit (deontological and consequentialist) in our study, however, a different pattern of cognitive–emotion interplay did emerge. In fact, our results showed no significant differences in decision times between consequentialist and deontological resolutions when people were faced with instrumental dilemmas. This result is somewhat unexpected and contrary to Greene's (2008) predictions. What we expected (and what Greene et al., 2001, and Greene, Morelli, Lowenberg, Nystrom, & Cohen, 2008, found) was significantly slower response times for consequentialist resolutions, because people are required to perform an extremely affectively difficult action and have to override the intuitive emotional response that says "No!". However, although our data did trend in this direction, no significant differences in response times emerged between consequentialist and deontological resolutions. These results provide contrary evidence for the hypothesised asymmetry (Greene et al., 2008) between consequentialist and deontological judgements, with the former driven by controlled cognitive processes and the latter driven by more automatic processes. On the other hand, our results are in line with McGuire et al.'s (2009) study, which showed that the interaction between dilemma type and response in the subject analysis of Greene et al. (2001) was due to some specific personal dilemmas. In other words, once those dilemmas were removed, personal dilemmas showed the same pattern as impersonal dilemmas, with no difference between "appropriate" and "inappropriate" responses. As stated by McGuire and colleagues (2009), rather than longer RTs for "appropriate" responses to personal dilemmas, it was the extremely fast "inappropriate" responses for a small set of personal dilemmas that produced the interaction.

Further evidence against Greene's (2008) predictions also came from the pattern of results obtained in incidental dilemmas. As in such cases there is no emotional response to override, Greene (2008) had hypothesised no difference in response times between deontological and consequentialist choices. By contrast, our results showed slower response times when participants chose the deontological resolution. Furthermore, we found a significant difference in response times between incidental and instrumental dilemmas when the deontological resolution was chosen. In fact, people were slower when deciding in incidental than in instrumental dilemmas. Taken together, these results could suggest that, contrary to Greene's (2008) and Greene et al.'s (2008) predictions, controlled reasoning is required to account also for deontological judgements, specifically in circumstances where it is possible to apply the DDE, as in the incidental dilemmas.

To explain these findings, it is relevant to highlight the differences between consequentialist and deontological perspectives. For a consequentialist, what counts is the greatest happiness for the greatest number, and whether some consequences are better than others does not depend on the way in which those consequences are achieved. In fact, no significant differences in response times were found between incidental and instrumental dilemmas for consequentialist resolutions, possibly because the distinction between intended and merely foreseen consequences has no moral significance for consequentialism. Instead, on a deontological account of morality, agents cannot make certain wrongful choices. Roughly speaking, deontologists hold that it is our intended ends and intended means that define our agency. Such intentions mark out what we set out to achieve through our actions. If we intend something bad as an end, or even as a means to some more beneficent end, we are said to have set ourselves at evil (Aquinas, 1265–1272/1947), something we are categorically forbidden to do. But the DDE provides a "special permission" for incidentally causing death for the sake of a good end. This might


suggest that, for people who continue to have a strong deontological attitude even when faced with the incidental dilemmas, the special permission for incidentally causing death provided by the DDE is probably taken into account but is believed, all things considered, to be a wrongful choice. For this reason, we suggest that for those people who consider the incidental death a wrongful and forbidden choice that cannot be justified by its effects, considering (but not accepting) the special permission provided by the DDE slows the choice of the deontological resolution of the dilemma. One of the principal characteristics of the deontological perspective is the idea that morality is agent-centred and that intentions constitute the morally relevant agency of people. For a deontologist, to invoke double effect is to make a comparative judgement: it is to assert that a harm that might permissibly be brought about as a side effect in promoting a good end could not permissibly be brought about as a means to the same good end. In other words, we propose that this comparative judgement has a cognitive cost, and that the slowdown in response times when participants chose a deontological resolution in the incidental dilemmas is the result of an interfering effect of the special permission, provided by the DDE but not accepted, for incidentally causing death for the sake of a good end. For this reason, we believe that the DDE concerns only the deontological perspective, and that an engagement of cognitive processes is plausible also for non-consequentialist moral deliberation. Although further studies are needed to substantiate this finding, we believe that our results may contribute to a more comprehensive understanding of the mechanisms involved in moral judgement.

It is worth pointing out that a different relationship between moral judgement and intentionality could be considered. In this paper, we are committed to the DDE, which is supposed to show that there is a moral difference between effects that are brought about intentionally and those that are merely foreseen. Actions often have consequences that draw forth moral judgement, and whether an action is judged intentional or not influences that moral judgement. However, in a recent study

Knobe (2003a) has shown that this connection can also run in the opposite direction, and that people's moral judgement can affect their intuitions as to whether or not an action or a behaviour was performed intentionally (Knobe, 2003a, 2003b, 2006; Pettit & Knobe, 2009). This phenomenon has come to be known as the "Knobe effect": people determine intentionality based on the moral consideration of whether a side effect is good or bad. The Knobe effect represents an obvious challenge for the DDE. In fact, the judgement of moral permissibility of an action does not seem to be the output of the doctrine; rather, the DDE seems to generate a judgement of moral permissibility only because of a prior assessment of the moral acceptability of the action. What appears to be controversial is our process of attributing intentions. According to the DDE, an action is permissible if the bad side effects are foreseen but not intended. According to the Knobe effect, a foreseen side effect is judged to be unintended if the action is judged to be permissible. Therefore, it seems that the DDE reflects the moral intuitions of people who believe in the DDE. It should be noted that there are many controversies about the Knobe effect. The main question is in what sense we could say that an unintended side effect is intentional. For example, Guglielmo and Malle (2010) have shown that people rarely see an unintended side effect as intentional when they have a chance to express their interpretation of the events with multiple descriptions to choose from, and their results cast serious doubt on the hypothesis that judgements of intentionality are guided by moral considerations. Although it would be premature to regard the Knobe effect as a refutation of the DDE, further studies will be required to settle this controversy.

Some limitations of the present study are worth mentioning. First, by measuring "core" affective feelings we chose to focus on the basic conscious experience that can be described by the two psychological properties of hedonic valence and arousal. Thus, we acknowledge that this approach does not fully account for all the various components of emotion and the


complexity of the phenomenon. Second, in our task participants were instructed to report how they actually felt while they were deciding, i.e., before the behavioural choice between the deontological and the consequentialist resolution was made. Therefore, we cannot disentangle which of the processes unfolding during decision making the participants' affective ratings referred to. During the different stages of decision making, emotion might be caused by the conflict in choosing between the two undesirable resolutions, by a differential assessment of the available resolutions, or by the formation of a preference and the selection of one of the two resolutions. It is also possible that the reported emotional experience reflected the global affective feeling emerging from the whole process. However, despite these limitations, we believe that our data might contribute useful information on the role of emotional processes in moral judgement. Our paradigm might also be applied to better characterise moral judgement in patients with ventromedial prefrontal lesions and, in turn, to shed light on the possible causal role of emotion. When presented with Greene et al.'s (2001, 2004) personal dilemmas, these patients have been found to provide a higher number of utilitarian judgements than healthy controls, suggesting that emotional processing depending on the integrity of the ventromedial prefrontal cortex is necessary for deontological resolutions to be provided (Ciaramelli, Muccioli, Ladavas, & di Pellegrino, 2007; Koenigs et al., 2007). In the light of our findings, using a paradigm allowing the assessment of response choices, affective ratings, and response times as a function of the two types of resolutions would make it possible to directly test the differential effects of neural emotional impairment on deontological and consequentialist resolutions.

In conclusion, our results support the view that cognitive and emotional processes participate in both deontological and consequentialist moral judgements. More importantly, our results suggest that, contrary to Greene's (2008) and Greene et al.'s (2008) predictions, controlled reasoning is required to account not only for consequentialist judgements, but also for deontological judgements, specifically in circumstances where it is possible to apply the DDE, as in the incidental dilemmas. Indeed, as stated by Cushman et al. (2010), it no longer makes sense to engage in debate over whether moral judgement is accomplished exclusively by reason as opposed to emotion. Rather, moral judgement is the product of complex interactions between emotional and cognitive mechanisms.

Manuscript received 4 October 2011

Revised manuscript received 22 February 2013

Manuscript accepted 10 March 2013

First published online 23 April 2013

REFERENCES

Aquinas, T. (1947). Summa theologiae. New York: Benzinger Brothers. (Originally published in 1265–1272)

Baron, J. (1994). Nonconsequentialist decisions. Behavioral and Brain Sciences, 17, 1–10. doi:10.1017/S0140525X0003301X

Borg, J. S., Hynes, C., Van Horn, J., Grafton, S., & Sinnott-Armstrong, W. (2006). Consequences, action, and intention as factors in moral judgments: An fMRI investigation. Journal of Cognitive Neuroscience, 18, 803–817. doi:10.1162/jocn.2006.18.5.803

Bradley, M. M., & Lang, P. J. (1994). Measuring emotion: The self-assessment manikin and the semantic differential. Journal of Behavior Therapy and Experimental Psychiatry, 25, 49–59. doi:10.1016/0005-7916(94)90063-9

Bradley, M. M., & Lang, P. J. (2000). Measuring emotion: Behavior, feeling and physiology. In R. Lane & L. Nadel (Eds.), Cognitive neuroscience of emotion (pp. 242–276). New York: Oxford University Press.

Brink, D. O. (1994). Moral conflict and its structure. The Philosophical Review, 103, 215–247. doi:10.2307/2185737

Ciaramelli, E., Muccioli, M., Ladavas, E., & di Pellegrino, G. (2007). Selective deficit in personal moral judgment following damage to ventromedial prefrontal cortex. Social Cognitive and Affective Neuroscience, 2, 84–92. doi:10.1093/scan/nsm001


Cushman, F., Young, L., & Greene, J. D. (2010). Our multi-system moral psychology: Towards a consensus view. In J. Doris, G. Harman, S. Nichols, J. Prinz, W. Sinnott-Armstrong, & S. Stich (Eds.), The Oxford handbook of moral psychology (pp. 47–71). Oxford: Oxford University Press.

Cushman, F., Young, L., & Hauser, M. (2006). The role of conscious reasoning and intuition in moral judgment: Testing three principles of harm. Psychological Science, 17, 1082–1089. doi:10.1111/j.1467-9280.2006.01834.x

Damasio, A. (1994). Descartes' error. Boston, MA: Norton.

Dean, R. (2010). Does neuroscience undermine deontological theory? Neuroethics, 3, 43–60. doi:10.1007/s12152-009-9052-x

Duncan, S., & Barrett, L. F. (2007). Affect as a form of cognition: A neurobiological analysis. Cognition and Emotion, 21, 1184–1211. doi:10.1080/02699930701437931

Foot, P. (1967). The problem of abortion and the doctrine of the double effect. Oxford Review, 5. Oxford: Oxford University Press.

Greene, J. D. (2008). The secret joke of Kant's soul. In W. Sinnott-Armstrong (Ed.), Moral psychology. Vol. 3: The neuroscience of morality: Emotion, brain disorders, and development (pp. 35–79). Cambridge, MA: MIT Press.

Greene, J. D., Cushman, F. A., Stewart, L. E., Lowenberg, K., Nystrom, L. E., & Cohen, J. D. (2009). Pushing moral buttons: The interaction between personal force and intention in moral judgment. Cognition, 111, 364–371. doi:10.1016/j.cognition.2009.02.001

Greene, J. D., Nystrom, L. E., Engell, A. D., Darley, J. M., & Cohen, J. D. (2004). The neural bases of cognitive conflict and control in moral judgment. Neuron, 44, 389–400. doi:10.1016/j.neuron.2004.09.027

Greene, J. D., Sommerville, R. B., Nystrom, L. E., Darley, J. M., & Cohen, J. D. (2001). An fMRI investigation of emotional engagement in moral judgment. Science, 293, 2105–2108. doi:10.1126/science.1062872

Greene, J. D., Morelli, S. A., Lowenberg, K., Nystrom, L. E., & Cohen, J. D. (2008). Cognitive load selectively interferes with utilitarian moral judgment. Cognition, 107, 1144–1154. doi:10.1016/j.cognition.2007.11.004

Guglielmo, S., & Malle, B. F. (2010). Can unintended side effects be intentional? Resolving a controversy over intentionality and morality. Personality and Social Psychology Bulletin, 36, 1635–1647. doi:10.1177/0146167210386733

Haidt, J. (2001). The emotional dog and its rational tail: A social intuitionist approach to moral judgment. Psychological Review, 108, 813–834. doi:10.1037/0033-295X.108.4.814

Hauser, M., Cushman, F., Young, L., Jin, R. K., & Mikhail, J. (2007). A dissociation between moral judgments and justification. Mind & Language, 22, 1–21. doi:10.1111/j.1468-0017.2006.00297.x

Kant, I. (1959). Grundlegung zur Metaphysik der Sitten [Foundation of the metaphysics of morals] (English trans.). Indianapolis, IN: Bobbs-Merrill. (Originally published in 1785)

Knobe, J. (2003a). Intentional action and side effects in ordinary language. Analysis, 63, 190–194. doi:10.1093/analys/63.3.190

Knobe, J. (2003b). Intentional action in folk psychology: An experimental investigation. Philosophical Psychology, 16, 309–324. doi:10.1080/09515080307771

Knobe, J. (2006). The concept of intentional action: A case study in the uses of folk psychology. Philosophical Studies, 130, 203–231. doi:10.1007/s11098-004-4510-0

Koenigs, M., Young, L., Adolphs, R., Tranel, D., Cushman, F., Hauser, M., & Damasio, A. (2007). Damage to the prefrontal cortex increases utilitarian moral judgements. Nature, 446, 908–911. doi:10.1038/nature05631

Lang, P. J. (1980). Behavioral treatment and bio-behavioral assessment: Computer applications. In J. B. Sidowski, J. H. Johnson, & T. A. Williams (Eds.), Technology in mental health care delivery systems (pp. 119–137). Norwood, NJ: Ablex.

Lang, P. J., Greenwald, M. K., Bradley, M. M., & Hamm, A. O. (1993). Looking at pictures: Affective, facial, visceral, and behavioral reactions. Psychophysiology, 30, 261–273. doi:10.1111/j.1469-8986.1993.tb03352.x

Macintyre, A. (1990). Moral dilemmas. Philosophy and Phenomenological Research, 50, 367–382. doi:10.2307/2108048

McGuire, J., Langdon, R., Coltheart, M., & Mackenzie, C. (2009). A reanalysis of the personal/impersonal distinction in moral psychology research. Journal of Experimental Social Psychology, 45, 577–580. doi:10.1016/j.jesp.2009.01.002

Mikhail, J. (2002). Aspects of the theory of moral cognition: Investigating intuitive knowledge of the prohibition of intentional battery and the principle of double effect (Georgetown University Law Center Public Law & Legal Theory Working Paper No. 762385). Retrieved from http://ssrn.com/abstract=762385

Moore, A. B., Clark, B. A., & Kane, M. J. (2008). Who shalt not kill? Individual differences in working memory capacity, executive control, and moral judgment. Psychological Science, 19, 549–557. doi:10.1111/j.1467-9280.2008.02122.x

Nichols, S., & Mallon, R. (2006). Moral dilemmas and moral rules. Cognition, 100, 530–542. doi:10.1016/j.cognition.2005.07.005

Pettit, D., & Knobe, J. (2009). The pervasive impact of moral judgment. Mind & Language, 24, 586–604. doi:10.1111/j.1468-0017.2009.01375.x

Posner, J., Russell, J. A., & Peterson, B. S. (2005). The circumplex model of affect: An integrative approach to affective neuroscience, cognitive development, and psychopathology. Development and Psychopathology, 17, 715–734. doi:10.1017/S0954579405050340

Russell, J. A. (1980). A circumplex model of affect. Journal of Personality and Social Psychology, 39, 1161–1178. doi:10.1037/h0077714

Russell, J. A. (2003). Core affect and the psychological construction of emotion. Psychological Review, 110, 145–172. doi:10.1037/0033-295X.110.1.145

Russell, J. A., Weiss, A., & Mendelsohn, G. A. (1989). Affect grid: A single-item scale of pleasure and arousal. Journal of Personality and Social Psychology, 57, 493–502. doi:10.1037/0022-3514.57.3.493

Scanlon, T. M. (2008). Moral dimensions: Permissibility, meaning, blame. Cambridge: Basic Books.

Schweder, D., & Haidt, J. (1993). The future of moral psychology: Truth, intuition, and the pluralist way. Psychological Science, 4, 360–365. doi:10.1111/j.1467-9280.1993.tb00582.x

Sinnott-Armstrong, W., Mallon, R., McCoy, T., & Hull, J. G. (2008). Intention, temporal order, and moral judgments. Mind & Language, 23, 90–106. doi:10.1111/j.1468-0017.2007.00330.x

Thomson, J. J. (1986). Rights, restitution, and risk: Essays in moral theory (W. Parent, Ed.). Cambridge, MA: Harvard University Press.

Waldmann, M. R., & Dieterich, J. H. (2007). Throwing a bomb on a person versus throwing a person on a bomb: Intervention myopia in moral intuitions. Psychological Science, 18, 247–253.
