
AJOB Neuroscience, 1(2): 4–12, 2010

Copyright © Taylor & Francis Group, LLC

    ISSN: 2150-7740 print / 2150-7759 online

    DOI: 10.1080/21507741003699256

    Target Article

A Neuroskeptic's Guide to Neuroethics and National Security

Jonathan H. Marks, The Pennsylvania State University and Harvard University

This article, informed by science studies scholarship and consonant with the emerging enterprise of critical neuroscience, critiques recent neuroscience research and its current and potential applications in the national security context. The author expresses concern about the subtle interplay between the national security and neuroscience communities, and the hazards of the "mutual enchantment" that may ensue. The Bush Administration's "war on terror" has provided numerous examples of the abuse of medicine, behavioral psychology, polygraphy, and satellite imagery. The defense and national security communities have an ongoing interest in neuroscience too, in particular, neuroimaging and psychoactive drugs (including oxytocin) as aids to interrogation. Given the seductive allure of neuroscientific explanations and colorful brain images, neuroscience in a national security context is particularly vulnerable to abuse. The author calls for an urgent reevaluation of national security neuroscience as part of a broader public discussion about neuroscience's nontherapeutic goals.

    Keywords: critical neuroscience, fMRI, national security, neuroimaging, neuroskepticism, oxytocin, psychoactive drugs

It is hard to imagine anywhere darker, more esoteric, and, to be frank, more thrilling than the domain of national security neuroscience. In this article, I explore that intriguing place where neuroscience and national security intersect, each enchanting to the initiates of the other, and both somewhat mysterious to the rest of us. I confess that my aim here is to puncture that aura of mystery and enchantment, to defuse the understandable thrill, and to offer some words of caution, in particular to scientists, ethicists, research funding bodies, policymakers, and anyone else who may play a significant role in shaping the kinds of neuroscience research that will be conducted in the years ahead. Before proceeding, however, I should make two things clear.

First, I readily acknowledge that neuroscience offers unparalleled opportunities to transform our lives, and (for some) it has already done so. Few of these opportunities are more dramatic than the potential use of functional magnetic resonance imaging (fMRI) to identify patients with impaired consciousness who might be candidates for rehabilitation (Owen and Coleman 2008), and of deep brain stimulation to release them from imprisonment in hitherto unresponsive bodies (Schiff et al. 2007). However, my thesis here is premised on what might be called neuroskepticism, that is, a perspective informed by science studies scholarship that views with some healthy skepticism claims about the practical implications and real-world applications of recent developments in neuroscience. The need to probe and question is, I contend, especially acute in the context of national security neuroscience, where the translation from research laboratory to real life may involve great leaps, among them the troubling jump from brain scanning to terrorist screening.

This article is based on a plenary lecture of the same title delivered at the Novel Tech Ethics Conference in Halifax, Nova Scotia, in September 2009. The author is indebted to the conference organizers, in particular, Jocelyn Downie and Francoise Baylis, for extending this invitation and providing him with the opportunity to develop his views. Address correspondence to Jonathan H. Marks, Edmond J. Safra Faculty Fellow in Ethics, Harvard University, 79 JFK Street, Taubman, Cambridge, MA 02139, USA. E-mail: Jonathan [email protected]

The approach I adopt here is consonant with and sympathetic to the goals of critical neuroscience, a multidisciplinary project recently defined as "a reflexive scientific practice that responds to the social and cultural challenges posed both to the field of science and to society in general by recent advances in the behavioural and brain sciences" (Choudhury et al. 2009). The proponents of critical neuroscience aim to bridge the gap between science studies and empirical neuroscience by engaging scholars and practitioners from the social sciences, humanities, and empirical neuroscience to explore neglected issues: among them, the economic and political drivers of neuroscience research, the limitations of the methodological approaches employed in neuroscience, and the manner in which findings are disseminated. The project's avowed and worthy goals include "maintaining good neuroscience, improving representations of neuroscience, and . . . creating an awareness of its social and historical context in order to assess its implications" (Choudhury et al. 2009, 66).

Second, I acknowledge the legitimate aims and objectives of the national security enterprise and of the officials solemnly charged with its pursuit. However, sometimes national security threats may be overstated or invoked for political ends, and the means employed in the pursuit of these objectives are often fundamentally violative of the human rights of others (for a more detailed explication, see Marks 2006). In addition, as I outline later, there are many examples from the Bush Administration's "war on terror" of medicine, other health sciences (including behavioral psychology), and polygraphy being abused in the name of national security. So, while there are risks that the national security community may be misled about what neuroscience can offer, I am also concerned about the ways in which national security may pervert neuroscience.

NEUROSCIENCE NARRATIVES AND SECURITY SEMANTICS

Neuroscience and national security both jealously guard their own argot. In the case of neuroscience, the lexicon is replete with Latin and Greek and innumerable portmanteau constructions that fuse (or confuse) both classical languages. Consider, for example, the subthalamic nucleus, a neuroanatomical term that sandwiches a Greek derivative between two Latin ones, and is (infelicitously) susceptible to the translation "the nut under the bedroom." For some cognoscenti, there may be a familiar poetry in the classical language of the brain's anatomy, the sulci and gyri becoming, perhaps, the neuro-topographical analogs of William Wordsworth's vales and hills. But those who do not possess either training in neuroscience or an anatomical dictionary are lost.

National security too has its special (albeit less colorful) language, consisting, for the greater part, of cryptic and somewhat intimidating initials, acronyms, and not-quite-acronyms, such as HUMINT (human intelligence) and BSCT (standing for behavioral science consultation team, and pronounced "biscuit"). For readers unfamiliar with both these terms, the former is commonly defined as a category of intelligence derived from information collected and provided by human sources. This includes interrogations (as well as other forms of overt or clandestine conversations with sources that may be considered neutral, friendly, or hostile). BSCTs are teams of psychologists and/or psychiatrists, and their assistants (often mental health technicians), tasked by the Department of Defense with advising interrogators how to ramp up interrogation stressors at Guantanamo Bay, Abu Ghraib, and elsewhere (see, e.g., Mayer 2008).

My intention here is not to take an easy shot at the neuroscience and national security communities, and the linguistic practices in each of these domains. Rather, I wish to express concern about the naming of things in these contexts and, in particular, about the hazards, practical and ethical, that arise from the deployment of opaque terminology. It is easy for outsiders to the world of neuroscience to believe that, because there is a polysyllabic name for some part of our brain, we have a deep understanding of what it does and how it does it. This is, of course, not necessarily the case. To give just one example, physicians can sometimes achieve dramatic improvements in the motor symptoms of some Parkinson's patients by using small electrical pulses to stimulate the subthalamic nucleus (the "nut" mentioned earlier). However, the mechanism by which this effect is achieved is still being explored. In addition, as one neurosurgeon colleague recently made clear to me, our stimulators are not smart. They do not monitor what is occurring in the subthalamic nucleus and respond to it. Nor do they monitor the response of the nucleus to their stimuli.

Some might argue that, in this clinical example, the intervention works, and while we should seek to refine the technique and our understanding of the efficacy of the intervention, the limited nature of our current understanding should not bar its use. I do not intend to address that claim here. However, the nonclinical application of neuroscience in the murkier national security context creates serious potential hazards, and these risks are amplified in the absence of solid theoretical models and robust empirical data. Not least, there is a real danger that pseudo-neuroscience will become a vehicle for the abuse of those who are perceived as a threat to national security.

Although I will substantiate this point shortly, allow me briefly to explore the foundations of neuroscience's linguistic hazards. When the language of neuroscience is used to construct explanatory narratives, the results can be unduly persuasive due to a phenomenon the philosopher J. D. Trout has termed "explanatory neurophilia" (Trout 2008). A recent study indicates that nonexperts, including college students taking a cognitive neuroscience class, are not very good at critiquing neuroscience narratives. Deena Skolnick Weisberg and colleagues explored the hypothesis that even irrelevant neuroscience information in an explanation of a psychological phenomenon may interfere with a person's ability to consider critically the underlying logic of that explanation (Weisberg et al. 2008). Weisberg found that nonexperts judged explanations with logically irrelevant neuroscience information to be more satisfying, particularly in the case of bad explanations. The authors try to explain the results in a number of ways. They suggest that the "seductive details effect" may be in play. According to this theory, seductive details that are related, but logically irrelevant, make it more difficult for subjects to code and later recall the main argument of a text. They also hypothesize that lower level explanations, in particular, may make bad explanations seem connected to a larger explanatory system and therefore more insightful.

Contemplating the implications of the Weisberg study, J. D. Trout has argued that placebic neuroscientific information may promote the feeling of intellectual fluency and that all too often humans interpret the positive hedonic experience of fluency as a mark of genuine understanding (Trout 2008). Trout suggests that neurophilic fluency flourishes wherever heuristics in psychology are reductionist, that is, where they focus on a small number of local factors with apparent causal significance in order to explain a complex problem. Although more work may be required to provide a full account of explanatory neurophilia, that is, our blind (or at the very least blinkered) love for neuroscientific explanations, there is little doubt that the phenomenon has been persuasively demonstrated.


In the national security domain, there is also a temptation to believe that a claim is true because it carries the label HUMINT, or is similarly packaged in the specialist language of national security. Many senior administration officials appear to have believed (or wanted to believe) that EITs, so-called "enhanced interrogation techniques," were, as the name suggested, enhanced. But, as experienced interrogators have repeatedly asserted, the products of aggressive interrogation tactics such as waterboarding, exposure to temperature extremes, stress positions, and the like tend not to be reliable, whatever one calls them (see, for example, Bennett 2008; Soufan 2009). This is because interrogatees under pressure tend to say whatever it is they believe their captors want to hear, or anything just to stop their abuse. Numerous detainees in the Bush Administration's "war on terror" retracted claims they had made during torturous interrogations, once they were removed from their high-pressure interrogation environments, most notably, Khalid Sheikh Mohammed, who was waterboarded 183 times in March 2003 (CIA Inspector General 2004), and later told the Red Cross:

During the harshest period of my interrogation I gave a lot of false information in order to satisfy what I believed the interrogators wished to hear in order to make the ill-treatment stop. . . . I'm sure that the false information I was forced to invent in order to make the ill-treatment stop wasted a lot of their time. (ICRC 2007)

Psychologists James Mitchell and Bruce Jessen were the principal architects of the post-9/11 interrogation regime to which Khalid Sheikh Mohammed and others were exposed (Mayer 2008; SASC 2008). But as Scott Shane observed in the New York Times:

[Mitchell and Jessen] had never carried out a real interrogation, only mock sessions in the military training they had overseen. They had no relevant scholarship; their Ph.D. dissertations were on high blood pressure and family therapy. They had no language skills and no expertise on Al Qaeda.

But they had psychology credentials and an intimate knowledge of a brutal treatment regimen used decades ago by Chinese Communists. For an administration eager to get tough on those who had killed 3,000 Americans, that was enough. (Shane 2009)

Mitchell and Jessen drew on their experience of the SERE training program, a program designed to inoculate U.S. service personnel against abusive treatment at the hands of enemy captors by exposing them to the kinds of treatment that they had historically received, for example, during the Korean War. Mitchell and Jessen reverse-engineered those techniques and used them as the basis for a new aggressive interrogation regime in the "war on terror" (Mayer 2008). But the reverse engineering of SERE tactics not only violated fundamental human rights norms, and the baseline protections for detainees found in Common Article III of the Geneva Conventions (Marks 2007a); it was also premised on a fundamental strategic error. As several experienced interrogators have repeatedly made clear (see, for example, Bennett 2008; Soufan 2009), and as some psychologists tried to warn the Bush Administration (Fink 2009), these techniques are not reliable methods for the extraction of intelligence. On the contrary, as the North Koreans demonstrated in the 1950s (Margulies 2006) and the British government discovered in the wake of several questionable convictions for terrorism in the 1970s (Gudjonsson 2003), these techniques tend to be excellent methods for extracting sham confessions and getting detainees to say whatever they believe their captors want to hear. While this was the intended effect in the former case, in the latter it was not. As a result, several Irish Republican Army (IRA) suspects were falsely convicted, while those responsible for the mainland terror attacks in Britain continued to roam free. However, this vital element was lost in the translation of stress tactics from the SERE training program to the U.S. detention and interrogation operations.

In my view, this account demonstrates the perils arising from a lack of critical engagement with purported scientific expertise, in this case behavioral psychology, in a national security context. These perils arise, in part, from the seductive nature of national security terminology, such as "enhanced interrogation techniques," that exaggerates or misrepresents the scientific foundations of a particular practice. Two related examples from the interrogation context are "truth serum" and its even more troublesome cousin, "lie detector." These terms reflect a profound lack of precision and tend to reinforce the operation of mental heuristics that deprive us of the opportunity to think critically.

The U.S. military and intelligence communities have long had a fascination for psychoactive drugs as interrogation aids (see, for example, ISB 2006, 73–74). The term "truth serum" is loosely used to describe a variety of psychoactive drugs including scopolamine, sodium pentothal, and sodium amytal. The colloquialism seems to promise the Holy Grail: detainees in a drug-induced state of compliance, unable to resist imparting explosive nuggets of actionable intelligence. However, there is, to date, no drug that can live up to this title. These drugs may make some people more talkative, but there is no guarantee that what they say is either accurate or useful. It appears that this did not prevent the reported use of psychoactive drugs as interrogation aids in the "war on terror." Several detainees have claimed that they were drugged prior to interrogation (Warrick 2008). The CIA and Defense Department dispute these claims. However, the Bush Administration commissioned and received legal opinions that took a permissive approach to the use of drugs in interrogation.1 The CIA has acknowledged that detainee Abu Zubaydah was waterboarded 83 times in August 2002 (CIA Inspector General 2004). It has also been reported that Zubaydah was drugged with sodium pentothal (see, for example, Follmann 2003). If this allegation is true, Abu Zubaydah's abusive interrogation may speak as much to the efficacy of so-called truth serums as to the utility of waterboarding.

1. See, for example, Jay Bybee's Memorandum for Alberto Gonzales, dated August 1, 2002, available at http://www.washingtonpost.com/wp-srv/politics/documents/cheney/torture memo aug2002.pdf (last accessed January 4, 2010), and John Yoo's Memorandum for William Haynes of March 14, 2003, at http://www.aclu.org/files/pdfs/safefree/yoo army torturememo.pdf (last accessed January 4, 2010).

Not surprisingly, the search for the Holy Grail continues, and neuroscience cannot resist stepping up to the plate. There has been much discussion in both academic journals and the media recently about oxytocin. This hormone is released in the bodies of pregnant women during labor (see, for example, Lee et al. 2009), and there is some evidence to suggest that intranasal administration of oxytocin causes "a substantial increase in trusting behaviour" (Kosfeld et al. 2005; see also Baumgartner et al. 2008). As a recent report of the National Research Council acknowledged, the drug is of particular interest to the defense and national security communities, not simply because of its implications for soldier performance, but also because it might allow for new insights into adversary response (NRC 2009). Although the report does not expressly discuss this, one potential use is its administration as an aid to interrogation, perhaps covertly prior to interrogation in aerosolized form. This may sound fanciful. However, aerosolized oxytocin is already being marketed by one corporation, Verolabs, as "Liquid Trust," promising to deliver "the world at your fingertips," whether you are single, in sales, or an unhappy employee who wants to get ahead2, or perhaps all three of these. So we should expect interrogators to be tempted to use it if they have not already done so.

2. See http://www.verolabs.com/news.asp (last visited September 20, 2009).

If there were such a thing as a truth serum, of course, there would be little need to direct intelligence efforts toward the detection of lies. But that too is an enterprise with a long and colorful history, discussed in more detail than is possible here in a 2003 report of the National Research Council (NRC 2003). For the greater part of the last century, "lie detector" was the moniker associated with the polygraph, although the device does nothing of the kind suggested by the term. The polygraph does not detect lies; on the contrary, it only measures physiological changes that tend to be associated with anxiety. This is problematic because for many polygraph subjects the experience of being tested is itself sufficient to cause considerable anxiety. As a result, the NRC concluded, the polygraph is "intrinsically susceptible to producing erroneous results," in particular, false positives (NRC 2003, 2). In addition, countermeasures are possible: People can be trained to beat the polygraph by reducing external manifestations of anxiety, creating false negatives. In spite of these limitations, the "lie detector" label has stuck, reinforced by countless television series and movies, and figurative labels, like their literal counterparts, are often hard to peel away.
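The NRC's concern about false positives is, at bottom, a base-rate problem, and a rough worked example shows why it bites hardest in screening settings. The figures below are hypothetical, chosen only for illustration and not drawn from the NRC report. Suppose a polygraph-style screen correctly flags 85% of deceptive subjects and correctly clears 85% of truthful ones, and suppose that 1 in 100 people screened is actually deceptive. Then, by Bayes' theorem,

P(\text{deceptive} \mid \text{flagged}) = \frac{0.85 \times 0.01}{0.85 \times 0.01 + 0.15 \times 0.99} \approx 0.05,

so roughly 19 of every 20 people flagged would in fact be truthful. The rarer the behavior being screened for, the more the flagged group is dominated by false positives, whatever the label on the machine.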

There is strong evidence that polygraphy was abused in the "war on terror" and, in my view, this misuse is attributable to misunderstandings of the technology reinforced by the "lie detector" moniker. Documents obtained by M. Gregg Bloche and me pursuant to Freedom of Information Act (FOIA) requests reveal that between August 2004 and October 2006, the U.S. Air Force Polygraph Program conducted 768 polygraphs in Iraq.3 According to an internal summary, 47% of the polygraph tests indicated no deception, while 46% purported to indicate some form of deception. This was interpreted by the drafter of the summary in the following way: "Detainee personnel are just as likely to have committed the suspected act as not." (Of course, this might equally have been interpreted to mean: Detainee personnel are just as likely not to have committed the suspected act.) But further reading suggests that the attribution of guilt, defined as involvement in multiple acts of anti-coalition force activities, on the basis of these results requires an unjustifiable leap of faith. The summary itself offers one important reason why the report's conclusion is unwarranted. It states (without acknowledging the implications of this) that only 10% of requests for polygraph support contain sufficiently detailed information for specific issue exams. The remainder of the polygraph tests were (according to the summary) by definition screening examinations "wherein the examiner is called to resolve numerous and divergent issues based on extremely generic, anonymous and perishable reporting."

3. Documents on file with author.

In polygraphers' feedback forms accompanying this summary, many respondents complained that the polygraph technology was either over utilized or not utilized properly. Two described the use of a failed polygraph test as a "hammer" to be used against the detainee. One said this never resulted in anything positive, while another said that, having participated in 240 polygraph examinations in Iraq, on only one occasion did he witness this approach produce anything of value. Despite this, detainees were regularly hammered with polygraph results, even when deception was not indicated (that is, even when they had passed the polygraph test). In such cases, the only clear evidence of dishonesty was on the part of the interrogators.

Not surprisingly, many polygraphers complained about the way their services had been deployed. One noted that interrogators did not fully understand how to use our services despite multiple briefings and pretest coordination discussions. The polygraph was often used as "a crutch to avoid unnecessary interrogations," one polygrapher claimed. Another complained about the use of what s/he considered to be worthless questions, and estimated that in 70% of cases interrogators asked: "Have you ever been involved in attacking coalition forces?" Others described larger issues that the military failed to address. Most notably, one concluded:

I encountered nothing but difficulties with the exams and have no reason to have any confidence the results were valid. I attribute these problems to a host of reasons: bad environment, problems with interpreters [who were used in most interrogations], and cultural differences.

Even in its traditional use in the United States, it is clear that the polygraph does not merit the moniker "lie detector." (The National Research Council has noted that while the technology performs better than chance, it is far from perfect [NRC 2003].) But in the national security context, where most interrogations are mediated through an interpreter, and cultural issues are often ignored, the label is surely even more problematic.

The language of lie detection is more worrisome still when it is deployed to describe the use of brain imaging and related technologies that do not patently rely on external manifestations of anxiety. Newer technologies purport to show us what is going on in someone else's brain and, if one were to believe much of the press coverage, in their mind. As the British experimental psychologist Dr. Richard Henson has succinctly observed, "There is a real danger that pictures of blobs on brains seduce one into thinking that we can now directly observe psychological processes" (Henson 2005, 228). It is to the seductive power of brain images that I now turn.

NEUROIMAGING AND NEUROSCIENTIFIC IMAGINARIES

Brain images are ubiquitous. They are no longer the sole province of medical and scientific journals. Viewers of cable news and print media alike are frequently shown brain images. They usually accompany features breathlessly reporting that brain imaging has heralded the end of lies or finally lifted the shroud to reveal how the brain handles love and pain.4 But functional neuroimages are not images insofar as that word is used to connote optical counterparts of an object produced by an optical device.5 Brain activity does not have an optical component. We cannot literally see people think, although these images may suggest as much to those with little or no understanding of how they are produced. Rather, neuroimages are carefully constructed representations of the brain and its functions. When the results of fMRI-based cognitive neuroimaging studies are presented to us in image form (as is almost invariably the case), tiny changes in blood oxygenation levels (less than 3%) are represented by bright colors (usually reds, yellows, and blues). These changes are the product of a comparison between levels of blood oxygenation for a chosen cognitive task and those for an activity considered a suitable baseline. These changes are interpreted as markers of local activation or inhibition in the regions of the brain in which they occur.

4. See, for example, http://www.msnbc.msn.com/id/4313263/ns/technology and science-science/ (last visited September 20, 2009).
5. I draw here from the definition of "image" at www.m-w.com.

Many science studies scholars and bioethicists have critiqued the manner in which brain images are produced, constructed, and interpreted (see, for example, Dumit 2003; Joyce 2008; Wolpe et al. 2005; Marks 2007b). I do not review all these critiques here. Instead, I wish to highlight a simple methodological point that is often not appreciated. A functional MRI image, the kind so frequently reproduced in glossy magazines for lay readers, is usually not a single image of one person's brain. There are two reasons for this. First, the changes represented in brain images are often not those of a single experimental subject. More commonly, they are representations of composite data from a small experimental cohort. Second, these color images are superimposed on a higher resolution structural brain image, just as Doppler weather radar images are superimposed on geographic maps. The higher resolution image is intended to reveal the topography of the brain, just as the geographic map (on to which the constructed Doppler image is superimposed) is intended to show that, for example, the latest hurricane is 50 miles off the coast of Georgia. However, the structural image of the brain need not be taken (and is often not taken) from any of the subjects of the experiment. This is important because the neurological analog of the state of Georgia in my brain is (like the lyrical "Georgia on my mind") not necessarily the same as yours. Put another way, there is considerable variation in the anatomical structure of the human brain, variations that occur even as between identical twins, and the right and left hemispheres of the same brain (Weiss and Aldridge 2003). So the colored area of activation or inhibition may not correlate precisely with the area represented in the structural image.
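For readers who want to see this methodological point in concrete terms, the construction described above, a thresholded group statistic painted over a separate anatomical image, can be sketched in a few lines of Python. This is a toy illustration with made-up arrays, an arbitrary threshold, and arbitrary color choices; it is not the pipeline of any study cited here, only a sketch of the kind of layering involved.

    import numpy as np
    import matplotlib.pyplot as plt

    # Toy stand-ins: a grayscale "structural" slice (anatomical template) and a
    # composite "statistical" map. In a real study these come from different
    # acquisitions, and the template is often not from any experimental subject.
    rng = np.random.default_rng(0)
    structural = rng.normal(0.5, 0.1, size=(64, 64))   # anatomical template slice
    stat_map = rng.normal(0.0, 1.0, size=(64, 64))     # group-average statistic
    stat_map[20:30, 35:45] += 3.0                      # a region of apparent "activation"

    threshold = 2.3  # arbitrary cutoff; changing it changes which "blobs" appear

    fig, ax = plt.subplots()
    ax.imshow(structural, cmap="gray")                 # anatomy underneath
    overlay = np.ma.masked_where(np.abs(stat_map) < threshold, stat_map)
    ax.imshow(overlay, cmap="hot", alpha=0.8)          # colored statistic on top
    ax.set_title("Thresholded statistic over a separate structural image")
    plt.show()

The colors encode a thresholded comparison, the gray background is a different image entirely, and both the threshold and the alignment between the two layers are choices that leave no trace in the finished picture.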

This reinforces two points recently made by the neuroscientist and philosopher Adina Roskies. First, as Dr. Roskies notes, the conventions of the brain image are representational translations of certain nonvisual properties related to neural activity in the brain, that is, the comparative magnetic properties of oxygenated and deoxygenated hemoglobin; and second, the choices that are made [in the construction of a brain image] are not visible in or recoverable from the image itself (Roskies 2008). In my view, functional brain images might properly be considered the product of neuroscientific imaginaries comprising the values, beliefs, and practices of the neuroimaging community.6 This is not to say that functional neuroimages are complete fabrications, entirely divorced from reality (in this case, brain functions). Rather, brain images are the product of decisions about scanning parameters and the criteria for statistical significance, in conjunction with acts of interpretation and representation that may tell us as much about the imagers as they do about those who are imaged (although we must look beyond the images to learn much about either of them).

6. The term is my own, but it draws some inspiration from the notion of technoscientific imaginaries. See Marcus (2005).

This analysis is important because it highlights the chasm between the manner in which brain images are constructed and the way lay viewers in particular comprehend them. Many of us appear to assume that visually arresting brain images share the evidential characteristics and epistemic status of photographs (see Roskies 2008). Empirical support for this view has been provided by David McCabe and Alan Castel, who have shown that readers attribute greater scientific value to articles summarizing cognitive neuroscience research when those articles include brain images than they do when the articles include no image, a bar graph, or a topographical map of the brain. They found that this effect (albeit not large) was demonstrable regardless of whether the article included errors in reasoning (McCabe and Castel 2008). One explanation for this may be, as Roskies contends, that neuroimages are inferentially distant from brain activity, yet they appear not to be. Put another way, she argues, brain images are seemingly revelatory. In my view, this latter claim needs to be probed a little further. To whom is what revealed, and by what means? The answer to this question (one I next endeavor to provide) is vital to a nuanced understanding of some of the potential hazards of brain imaging in the arsenal of national security neuroscience.

In spite of their ubiquity, brain images are essentially meaningless to the uninitiated. When we look at these images in isolation, there is no "aha!" moment, no epiphany. The images require the explanation of experts. However, at the same time, the images also tend to reinforce the expert narrative. It is hard not to be impressed by the expert (real or apparent) who can guide us through the images, who can bring us to the point of revelation. And if the expert is compelling enough, it can be hard to look at the image again without relying heavily on the expert's interpretive framework. In this way, brain images and neuroscientific narratives rely on each other to work their persuasive and pervasive magic. This point may be illustrated by a notable recent analog regarding the use of a different kind of image in a national security context (the run-up to the invasion of Iraq in 2003). I have chosen this example not simply to illustrate how images and narratives work together, but also to demonstrate the potential hazards when science is purportedly deployed in a national security context.

In February 2003, then U.S. Secretary of State Colin Powell made a presentation to the United Nations Security Council that was intended to substantiate the United States' argument that Iraq presented an imminent threat to regional and global security.7 Most people recall Powell's presentation, which he subsequently described as a permanent blot on his record (Weisman 2005), because of a theatrical flourish: He held up a small vial of white powder while describing the threat that a similar quantity of weaponized anthrax might present. Fewer may now recall some of the other visual elements of his presentation, in particular, his reliance on IMINT (or image intelligence, the term used in national-securitese to denote aerial and satellite photography). Powell showed time-sequenced satellite images of buildings whose functions were described in yellow text boxes, among them a "chemical munitions bunker."8

7. The full text of Colin Powell's presentation to the United Nations is available at http://www.cnn.com/2003/US/02/05/sprj.irq.powell.transcript/ (last accessed September 20, 2009).
8. The images are available at http://www.state.gov/secretary/former/powell/remarks/2003/17300.htm (last accessed September 20, 2009).

He also showed computer-generated images of trucks and railway carriages that were described as mobile production facilities for biological agents. Before displaying the images, Powell warned that we could not understand them, but that imaging experts had shed light where, otherwise, there would only be darkness:

The photos that I am about to show you are sometimes hard for the average person to interpret, hard for me. The painstaking work of photo analysis takes experts with years and years of experience, poring for hours and hours over light tables. But as I show you these images, I will try to capture and explain what they mean, what they indicate, to our imagery specialists.

As the filmmaker Errol Morris subsequently noted in a New York Times blog:

I don't know what these buildings were really used for. I don't know whether they were used for chemical weapons at one time, and then transformed into something relatively innocuous, in order to hide the reality of what was going on from weapons inspectors. But I do know that the yellow captions influence how we see the pictures. "Chemical Munitions Bunker" is different from "Empty Warehouse," which is different from "International House of Pancakes." The image remains the same but we see it differently.9

The interpretations of these photographs offered so convincingly by Powell have, of course, failed to stand up to scrutiny. Embarrassingly for the Bush Administration, the Iraq Survey Group failed to find any evidence of an active program for the development of weapons of mass destruction.10 But, by that time, the images had done their work. According to Morris, the captions did the heavy lifting, while the pictures merely "provide[d] the window dressing." But this account does not give full credit to the powerful way in which the images and the narrative work together, each reinforcing the other. The image, requiring interpretive expertise, validates the expert interpreter. And the act of interpretation gives meaning to the image that is otherwise incomprehensible to the lay viewer. In this deceptively attractive circularity (that I call the image-expert bootstrap), the image tells us how important the expert is, while the expert tells us how important the image is. Since nonexperts are hardly well placed to provide an alternative interpretation of the image, or to challenge the interpreter's expertise (or his expert interpretation), it is extremely hard for them to break the circle.

9. See http://morris.blogs.nytimes.com/2008/08/11/photography-as-a-weapon/?ref=opinion (last accessed September 20, 2009).
10. See the Report of the Iraq Survey Group (also known as the Duelfer Report), available at https://www.cia.gov/library/reports/general-reports-1/iraq wmd 2004/index.html (last accessed January 10, 2010).


NATIONAL SECURITY NEUROSCIENCE: PERILOUS TO THE VULNERABLE?

A National Research Council report on Emerging Cognitive Neuroscience and Related Technologies issued in August 2008 cautioned that when using neurophysiology data to determine psychological states, it is important to recognize that acceptable levels of error depend on the differential consequences of a false positive or a misidentification (NRC 2008, 19). The authors expressed particular concern that the neural correlates of deception could be construed as incontrovertible evidence of deception and therefore (mistakenly) used as the sole evidence for making critical legal decisions with lasting consequences, and noted that "insufficient high-quality research using an appropriate research model and controls has been conducted on new modalities of credibility assessment to make a firm, data-driven decision on their accuracy" (34).

These concerns were exacerbated the following month when reports emerged that a woman named Aditi Sharma had been convicted in India of killing her former fiancé with arsenic, and that her conviction relied principally on evidence from brain electrical oscillation signature testing (BEOS), purportedly a variation on the electroencephalograph (EEG)-based technique of so-called "brain fingerprinting" developed and aggressively marketed by Lawrence Farwell (Giridharadas 2008). Many commentators have been understandably appalled that a judge would permit such a travesty of justice, noting hopefully that such an outcome should not be possible in the United States. Judicial gatekeeping may well prevent such an incident from recurring in Europe or North America. A more immediate concern in the United States, however, is the use of neuro-technologies in the national security context, where there is no judicial gatekeeper.

In particular, an fMRI test result could be used to label a detainee as a terrorist, a troubling prospect described more fully elsewhere (see Marks 2007b). One can imagine without great difficulty that an intelligence operative, seduced by colorful brain scans, pseudo-scientific explanatory narrative, and media hype, might say the fMRI "picked him out" as a liar or, worse still, as a terrorist. It is not difficult to see how that could influence the subsequent treatment and interrogation of such a detainee. Labels such as "the worst of the worst" (used to describe detainees at Guantanamo Bay), and a fortiori specific labels such as "the mastermind of 9/11" and "the 20th hijacker" (ascribed to detainees Khalid Sheikh Mohammed and Mohamed Al Qahtani, respectively), led to abusive and, in the most extreme cases, life-threatening treatment. But a label invoking a much-hyped and little-understood technology, such as fMRI, is likely to be all the more powerful. These concerns are highlighted by two factors.

The first is the aggressive marketing of these technologies, which may further exacerbate the undue confidence in and uncritical assessment of these technologies fueled by the specialist language and images I have already described. We have seen this in the case of so-called fMRI-based lie detection, the worst offender in the field being No Lie MRI, a corporation that claims accuracy rates of 90% or more and asserts (without a solid evidential basis) that its proprietary technology is insensitive to countermeasures.11 A similar point can be made about aerosolized oxytocin too, and the colorful marketing strategies described earlier. The second factor is the interest of the national security community in neuroscience. Although it has been difficult to corroborate the claim that fMRI has already been used, in conjunction with EEG, to screen suspected terrorists (see Marks 2007b), there is clearly interest in its use for this purpose (see ISB 2006). In January 2007, the Department of Defense's Polygraph Institute was renamed the Defense Academy for Credibility Assessment. There appear to have been two rationales for the new title: first, an expansion of the portfolio of the institute to encompass the use of newer technologies including but not limited to fMRI and, second, a shifting of the institute's mandate to address counterterrorism.

11. See http://noliemri.com/products/Overview.htm (last accessed September 20, 2009).

The National Research Council's recent report Opportunities in Neuroscience for Future Army Applications acknowledged that there were challenges to the detection of deception due to individual variability and cultural differences in attitudes to deception (NRC 2009, 96). But the report continues to pose the question: "is there some kind of monitoring that could detect if a subject being interrogated is responding in a contrary to truth manner?" (NRC 2009, 96). This should serve as a potent reminder. LSD and, increasingly, the polygraph may seem consigned to the annals of U.S. national security. But no one should doubt that the concerns that have motivated their use are alive and well in the neuroscience and national security communities.

    THE WAY FORWARD

In the limited space available, I cannot attempt a comprehensive ethical critique of national security neuroscience, a project to which Canli and colleagues (2007) and their commentators make important contributions. Nor can I address all the potential perils of national security neuroscience. I have focused on the population I consider most vulnerable, detainees, and I have discussed two core examples in relation to that population. But there are clearly other vulnerable populations, most notably, soldiers (or warfighters, as they are now called), whose freedom is constrained not simply by their enlisted status, but also by their natural desire for recognition and advancement in the military. A fuller discussion of the potential hazards in relation to that population can be found elsewhere, in particular in Jonathan Moreno's book Mind Wars (Moreno 2006).

In his book, Moreno calls for the creation of a national advisory committee on neurosecurity, staffed by professionals who possess the relevant scientific, ethical, and legal expertise. The committee would be analogous to the National Science Advisory Board for Biosecurity established in 2004, which is administered by the National Institutes of Health (NIH) but advises all cabinet departments on how to minimize the misuse of biological research. Such a committee could certainly play a role in the oversight of national security neuroscience if it had the authority to monitor the misuse of neuroscience within, as well as outside, government. But in my view, the time has also come for a broader public debate about the legitimate nonclinical applications of neuroscience, one that takes into account the concerns addressed here, and seeks to learn from the abuse of medicine, behavioral psychology, and polygraphy in the national security context.

If we are to have a meaningful discussion, we will have to ask ourselves some difficult questions that explore the kinds of neuroscience research that are being funded and address the broader context in which that research takes place. For example, many of us do not blink at the use of brain imaging to detect lies in detainees. In contrast, no one advocates the use of the technology during U.S. Senate hearings for nominees to the federal judiciary (although it did briefly occur to me to write an op-ed during the confirmation hearings of Justice John Roberts making this suggestion, tongue firmly in cheek). Surely, the detection of an answer "contrary to truth" in such a context might be of some interest to the senators charged with the confirmation of federal judges.

I am, of course, not the first to make this kind of point. Commenting on fMRI-based research to screen children for potentially violent behavior, the sociologist Troy Duster has noted that studies are not designed to capture the kind of "diffuse, anonymous violence reflected in the behavior of unscrupulous executives, traders, subprime lenders and so on" (Duster 2008). It is tempting to add to the list the arms-length architects of torturous interrogation, and the legal and health professionals who, purportedly exercising their professional skill and judgment, approved their use.

    Duster continues with more than a hint of sarcasm:

But for the sake of argument, suppose we could monitor children and determine that greater activity in the prefrontal cortex means that they are likely to exhibit violent behavior. Surely, then, we should scan preteens to intervene in the lives of potential Enron-style sociopaths before they gut the pensions of the elderly, right? Oops, I guess I have the wrong target group in mind. (B4)

In this piece, Duster applies to recent neuroscience research some of the criticism that social activist Martin Nicolaus directed at sociologists and criminologists in 1968, roughly paraphrased as "you people have your eyes down and your hands up, while you should have your eyes up and your hands down." He explains:

"Eyes down" meant that almost all the research on deviance and crime was focused on the poor and their behavior, while "hands up" meant that the support for such research was coming from the rich and powerful, from foundations, the government, and corporations. Conversely, of course, "eyes up" meant turning one's research focus to the study of the pathological behavior of the elite and privileged, and "hands down" meant giving more of a helping hand to the excluded, impoverished, and disenfranchised. (B4–B5)

Although Duster does not address how neuroscience might help the disenfranchised, it is not difficult to conjure other uses of MRI technologies that might redound to their benefit (including perhaps the provision of government-funded diagnostic services to the uninsured and the underinsured, or a broader use of the technology to rescue and rehabilitate patients trapped in minimally conscious states). But how we use these technologies, the "wicked problems," to use the language of Frank Fischer (Fischer 2003, 128–129), that we call on them to address, should be a matter of informed debate in which experts engage with and listen closely to the public. The neuroscience and neuroethics communities must be frank with the public about the potential, the limitations, and the perils of neuroscience. We should empower the public to challenge decisions regarding the development and application of neuroscience (see Dickson 2000), and engage with them in figuring out the road ahead. The educational and communication challenges in this exercise should not be underestimated. But we should rise to them. If we fail to reconsider and refocus the gaze of neuroscience, we risk abandoning or, worse still, imperiling the vulnerable. And if we do that, tomorrow's historians and science studies scholars will, rightly, not look kindly on us.

    REFERENCES

Baumgartner, T., M. Heinrichs, A. Vonlanthen, U. Fischbacher, and E. Fehr. 2008. Oxytocin shapes the neural circuitry of trust and trust adaptation in humans. Neuron 58(4): 639–650.

Bennett, R. 2007. Endorsement by senior military interrogators. Peace and Conflict: Journal of Peace Psychology 13(4): 391–392.

Canli, T., S. Brandon, W. Casebeer, et al. 2007. Neuroethics and national security. American Journal of Bioethics 7(5): 3–13.

Central Intelligence Agency Inspector General. 2004. Counterterrorism, detention and interrogation activities (September 2001–September 2003). Available at: http://media.washingtonpost.com/wp-srv/nation/documents/cia report.pdf (accessed December 21, 2009) (paragraphs 223–225, pp. 90–91).

Choudhury, S., S. K. Nagel, and J. Slaby. 2009. Critical neuroscience: Linking neuroscience and society through critical practice. BioSocieties 4: 61–77.

Dickson, D. 2000. The science and its public: The need for a third way. Social Studies of Science 30(6): 917–923.

Dumit, J. 2003. Picturing personhood: Brain scans and biomedical identity. Princeton, NJ: Princeton University Press.

Duster, T. 2008. What were you thinking? The ethical hazards of brain imaging studies. Chronicle Review October 10: B4–B5.

Fischer, F. 2003. Citizens, experts, and the environment: The politics of local knowledge. Durham, NC: Duke University Press.

Follmann, M. 2003. Did the Saudis know about 9/11? Salon.com, October 18. Available at: http://dir.salon.com/story/news/feature/2003/10/18/saudis/print.html (accessed January 3, 2010).

Giridharadas, A. 2008. India's novel use of brain scans in courts is debated. New York Times September 15. Available at: http://www.nytimes.com/2008/09/15/world/asia/15brainscan.html? r=3&pagewanted=print (accessed January 4, 2010).

Gudjonsson, G. 2003. The psychology of interrogations and confessions: A handbook. Chichester, England: John Wiley & Sons.

Henson, R. 2005. What can functional imaging tell the experimental psychologist? Quarterly Journal of Experimental Psychology 58A(2): 193–233. Available at: http://www.ai.uga.edu/tonysnod/neuroscience/Henson%282005%29.pdf (accessed January 4, 2010).

Intelligence Studies Board (ISB). 2006. Educing information: Interrogation: Science and art, foundations for the future. Washington, DC: NDIC. Available at: http://www.fas.org/irp/dni/educing.pdf (accessed December 31, 2009).

International Committee of the Red Cross. 2007. Report on the treatment of fourteen high-value detainees in C.I.A. custody. February 14. Available at: http://www.nybooks.com/icrc-report.pdf (accessed December 21, 2009).

Joyce, K. 2008. Magnetic appeal: MRI and the myth of transparency. Ithaca, NY: Cornell University Press.

Kosfeld, M., M. Heinrichs, P. J. Zak, U. Fischbacher, and E. Fehr. 2005. Oxytocin increases trust in humans. Nature 435(7042): 673–676. Available at: http://www.socialbehavior.unizh.ch/static/home/heinrichs/downloads/Oxytocin Increases Trust in Humans Nature.pdf (accessed January 3, 2010).

Lee, H., A. H. Macbeth, J. H. Pagani, and W. S. Young, III. 2009. Oxytocin: The great facilitator of life. Progress in Neurobiology 88(2): 127–151.

Marcus, G., ed. 2005. Technoscientific imaginaries: Conversations, profiles, and memoirs. Chicago: University of Chicago Press.

Margulies, J. 2006. The more subtle kind of torment. Washington Post October 2: A19.

Marks, J. H. 2006. 9/11 + 3/11 + 7/7 = ? What counts in counterterrorism. Columbia Human Rights Law Review 37: 559–626.

Marks, J. H. 2007a. Doctors as pawns? Law and medical ethics at Guantanamo Bay. Seton Hall Law Review 37: 711–731.

Marks, J. H. 2007b. Interrogational neuroimaging in counterterrorism: A no-brainer or a human rights hazard? American Journal of Law & Medicine 33: 483–500.

Mayer, J. 2008. The dark side: The inside story of how the war on terror turned into a war on American ideals. New York: Doubleday.

McCabe, D., and A. Castel. 2008. Seeing is believing: The effect of brain images on judgments of scientific reasoning. Cognition 107: 343–352.

Moreno, J. D. 2006. Mind wars: Brain research and national defense. Washington, DC: Dana Press.

National Research Council. 2003. The polygraph and lie detection. Washington, DC: National Academies Press.

National Research Council. 2008. Emerging cognitive neuroscience and related technologies. Washington, DC: National Academies Press.

National Research Council. 2009. Opportunities in neuroscience for future Army applications. Washington, DC: National Academies Press.

Owen, A. M., and M. R. Coleman. 2008. Functional neuroimaging of the vegetative state. Nature Reviews Neuroscience 9: 235–243. Available at: http://www.wbic.cam.ac.uk/mrc30/Owen Coleman NNR 2008.pdf (accessed September 20, 2009).

Roskies, A. 2008. Neuroimaging and inferential distance. Neuroethics 1: 19–30.

Schiff, N. D., J. T. Giacino, K. Kalmar, et al. 2007. Behavioural improvements with thalamic stimulation after severe traumatic brain injury. Nature 448: 600–603.

Shane, S. 2009. Interrogation Inc.: 2 U.S. architects of harsh tactics in 9/11's wake. New York Times August 12. Available at: http://www.nytimes.com/2009/08/12/us/12psychs.html? r=5&hp=&pagewanted=print (accessed September 20, 2009).

Soufan, A. 2009. What torture never told us. New York Times September 5. Available at: http://www.nytimes.com/2009/09/06/opinion/06soufan.html? r=2 (accessed December 21, 2009).

Trout, J. D. 2008. Seduction without cause: Uncovering explanatory neurophilia. Trends in Cognitive Sciences 12(8): 281–282. Available at: http://www.jdtrout.com/sites/default/files/ExplanatoryNeurophilia 0.pdf (accessed September 20, 2009).

Warrick, J. 2008. Detainees allege being drugged, questioned. Washington Post April 22. Available at: http://www.washingtonpost.com/wp-dyn/content/article/2008/04/21/AR2008042103399 pf.html (accessed December 21, 2009).

Weisberg, D. S., F. C. Keil, J. Goodstein, E. Rawson, and J. R. Gray. 2008. The seductive allure of neuroscience explanations. Journal of Cognitive Neuroscience 20(3): 470–477.

Weisman, S. R. 2005. Powell calls his U.N. speech a lasting blot on his record. New York Times September 9. Available at: http://www.nytimes.com/2005/09/09/politics/09powell.html (accessed January 4, 2010).

Weiss, K., and K. Aldridge. 2003. What stamps the wrinkle deeper on the brow? Evolutionary Anthropology 12: 205–210.

Wolpe, P., K. R. Foster, and D. D. Langleben. 2005. Emerging neurotechnologies for lie-detection: Promises and perils. American Journal of Bioethics 5(2): 39–49.


AJOB Neuroscience, 1(2): W1–W3, 2010

Copyright © Taylor & Francis Group, LLC

    ISSN: 2150-7740 print / 2150-7759 online

    DOI: 10.1080/21507741003712430

    Correspondence

Neuroconcerns: Some Responses to My Critics

Jonathan H. Marks, The Pennsylvania State University and Harvard University

Address correspondence to Jonathan H. Marks, Edmond J. Safra Faculty Fellow in Ethics, Harvard University, 79 JFK Street, Taubman, Cambridge, MA 02139, USA. E-mail: Jonathan [email protected]

Neuroscience has its limits. Perhaps oxytocin can diminish neuroskepticism. But no psychoactive drug or neuro-technological innovation can enable me to engage meaningfully in around a thousand words with more than a dozen commentaries, embracing the work of Foucault (Thomsen 2010) and Spinoza (Bentwich 2010), in addition to the latest scholarship in neuroscience and neuroethics, some of which appeared after my article was submitted for publication (e.g., Illes et al. 2010). Although my responses are shaped by these thoughtful commentaries, the latter raise many important points to which I cannot do justice here.

    NEUROSKEPTICISM

Let me begin by saying something about the neuroskepticism that informed my piece (Marks 2010). It is certainly not an ideology, as one of my harshest critics alleges (Keane 2010). Nor is it meant to be construed as some kind of ethical theory. I am even hesitant to elevate it to the status of method, as the members of the Stanford Interdisciplinary Group in Neuroscience and Law (SIGNAL) suggest in their insightful commentary (Lowenberg et al. 2010). As I said in the article, neuroskepticism is a perspective. One might also call it a sensibility or an orientation. Since I did not expect all my readers to share the perspective, I wanted to declare it up front. I am glad to acknowledge, as the SIGNAL authors suggest, that I am also "neuroconcerned": concerned about whether even scientifically reliable neuroscience could be used to cause harm or to further harmful ends (Lowenberg et al. 2010). This acknowledgment should help address the views of some commentators who thought I placed too great an emphasis on bad science as the source of concern (Thomsen 2010; Strous 2010). There are clearly serious ethical issues, whether the science is sound or, as Fisher (2010) puts it, "absolute junk."

    ETHICS AND SCIENCE

The purpose of my article was not to provide a comprehensive ethical assessment of neuroscience and national security, an endeavor to which Canli and colleagues (2007) and their commentators make significant contributions. Rather, it was to offer an explanatory account that might


inform neuroethical debate. Such a debate must obviously address the legitimacy of the means by which and the ends for which neuroscience is applied (Lowenberg et al. 2010). However, I do not (as Benanti [2010] suggests) consider that ethical questions remain, in a certain way, external to neuroscience, because they are related only to practical use of neuroscience. Ethics must infuse science from inception, from the first flicker of an idea, through testing and development, to myriad actual and potential applications. As Schienke and colleagues (2009) argue, a comprehensive account of scientific research ethics needs to address three distinct but related spheres of concern: procedural ethics (that is, the responsible conduct of research, which includes issues of falsification, fabrication, plagiarism, care of human and animal subjects, and conflicts of interest), intrinsic ethics (comprising issues internal to or embedded in the production of a given inquiry or mode of analysis), and extrinsic ethics (that is, issues arising from the application of science to policy or from the impact of science and technology on society).

I agree that the slippage from military into civilian applications of neuro-technologies presents ethical issues (see Marchant and Gulley 2010). But, of course, applications that remain within the national security domain still raise important ethical concerns. For example, as Kirmayer, Choudhury, and Gold (2010) note, there is a risk that neuroscience can become a screen on which to project our prejudices and stereotypes, and that it may shift the focus away from the origins of terrorism as social and political processes. This has implications for national security too. As John Horgan has argued, recognition of these social processes is vital to our understanding of terrorism and to the crafting of effective counterterrorism policies (e.g., Horgan 2008).

    ETHICS AND NEUROHYPE

My paper did not focus on the problem of neurohype and the related ethical responsibilities of the scientific community. However, Rippon and Senior take umbrage at "the impression [my article] gives that the neuroscientific community is not policing its own house" (Rippon and Senior 2010), a phrase my article did not employ. At the same time,


Rippon and Senior acknowledge that neuroscientists need to be more aware of "the potential dangers of overselling our wares" and that the brain imaging community needs to be more vigorous in communicating its concerns to the public at large and, more importantly, to the policy makers and funders. I am grateful for these acknowledgments and would commend to Rippon and Senior the discussion of neurohype in Caulfield, Rachul, and Zarzeczny (2010; and the works cited therein). I would also endorse Nagel's plea for responsibility in science, a plea that recognizes that some scientists live up to such responsibilities better than others and, more importantly, that there are systemic factors that can promote problematic behaviors (Nagel 2010).

EMOTION, COGNITIVE BIASES, AND COUNTERTERRORISM

Bentwich (2010), drawing on the work of Spinoza, argues I pay too little attention to the pernicious linkage among fear, superstition, and prejudice in the national security context. Although I did not explore these issues in my target article, I have done so at substantial length elsewhere. In Marks (2006), I argue that our emotional responses to counterterrorism and associated cognitive biases have an impact not just on individual behavior but also (through a variety of social mechanisms) on counterterrorism policy and practice. Resulting policies tend to violate human rights, and often fail to address real threats: see Marks (2006), where my debt to Spinoza is mediated by the work of neuroscientist Antonio Damasio, on which I draw (Damasio 2003).

    ETHICS AND HUMAN RIGHTS

Bentwich (2010) concludes that the answers to these problems are not found in the jurisdiction of neuroethics, but rather by reasserting human rights and constitutional civil liberties. In other work on neuroscience and national security, I too have emphasized the importance of human rights (see, e.g., Marks 2007b). More broadly, I have also advocated human rights impact assessments of counterterrorism policies (Marks 2006) and used international human rights law to critique health professionals who are complicit in detainee abuses (Marks 2007a). However, I do not believe that professional ethics is an entirely autonomous enterprise. In forthcoming work, I intend to elaborate more fully on the relationship between human rights and professional ethics. For now, I note that the American Association for the Advancement of Science (AAAS) Science and Human Rights Program recognizes the centrality of human rights to the ethics of science, and its new coalition will explore how greater attention to human rights might also result in improvements in scientific process and practice (AAAS 2010).

THE PROSPECTS FOR AND LIMITS OF FURTHER DISCOURSE

I am heartened by the lively debate that my article has provoked in the pages of the American Journal of Bioethics: Neuroscience. Consistent with the objectives of my piece, I encourage my colleagues to work with me to take this discussion beyond the pages of this journal and into the larger public domain. I acknowledge, as Lowenberg and colleagues (2010) and Giordano (2010) point out, that national security considerations may place limits on what may be discussed in public. But the presumption should be in favor of openness, and there is, in any event, more than enough information in the public domain about which to conduct meaningful discussions.

    REFERENCES

American Association for the Advancement of Science (AAAS). 2010. AAAS science and human rights coalition. Available at: http://shr.aaas.org/coalition/index.shtml (accessed February 9, 2010).

Benanti, P. 2010. From neuroskepticism to neuroethics: Role morality in neuroscience that becomes neurotechnology. AJOB Neuroscience 1(2): 39–40.

Bentwich, M. 2010. Is it really about neuroethics? On national security, fear, and superstition. AJOB Neuroscience 1(2): 30–32.

Canli, T., S. Brandon, W. Casebeer, et al. 2007. Neuroethics and national security. American Journal of Bioethics 7(5): 3–13.

Caulfield, T., C. Rachul, and A. Zarzeczny. 2010. Neurohype and the name game: Who's to blame? AJOB Neuroscience 1(2): 13–15.

Damasio, A. 2003. Looking for Spinoza: Joy, sorrow and the feeling brain. Orlando, FL: Harcourt Press.

Fisher, C. E. 2010. Brain stimulation and national security: Considering the narratives of neuromodulation. AJOB Neuroscience 1(2): 22–24.

Giordano, J. 2010. Neuroscience, neurotechnology and national security: The need for preparedness and an ethics of responsible action. AJOB Neuroscience 1(2): 35–36.

Horgan, J. 2008. From profiles to pathways and roots to routes: Perspectives from psychology on radicalization into terrorism. AAPSS Annals 618: 80–94.

Illes, J., M. A. Moser, J. B. McCormick, et al. 2010. Neurotalk: Improving the communication of neuroscience research. Nature Reviews Neuroscience 11: 51–69.

Keane, J. 2010. Neuroskeptic or neuro-ideologue? AJOB Neuroscience 1(2): 33–34.

Kirmayer, L., S. Choudhury, and I. Gold. 2010. From brain image to the Bush doctrine: Critical neuroscience and the political uses of neurotechnology. AJOB Neuroscience 1(2): 17–19.

Lowenberg, K., B. M. Simon, A. Burns, L. Greismann, J. M. Halbleib, G. Persad, D. L. M. Preston, H. Rhodes, and E. R. Murphy. 2010. Misuse made plain: Evaluating concerns about neuroscience in national security. AJOB Neuroscience 1(2): 15–17.

Marchant, G., and L. Gulley. 2010. Neuroscience and the reverse dual-use dilemma. AJOB Neuroscience 1(2): 20–22.

Marks, J. H. 2006. 9/11 + 3/11 + 7/7 = ?: What counts in counterterrorism. Columbia Human Rights Law Review 37(2): 559–626.

Marks, J. H. 2007a. Doctors as pawns? Law and medical ethics at Guantanamo Bay. Seton Hall Law Review 37: 711–731.


Marks, J. H. 2007b. Interrogational neuroimaging in counterterrorism: A no-brainer or a human rights hazard? American Journal of Law & Medicine 33: 483–500.

Marks, J. H. 2010. A neuroskeptic's guide to neuroethics and national security. AJOB Neuroscience 1(2): 4–12.

Nagel, S. 2010. Critical perspective on dual-use technologies and a plea for responsibility in science. AJOB Neuroscience 1(2): 27–28.

Rippon, G., and C. Senior. 2010. Neuroscience has no place in national security. AJOB Neuroscience 1(2): 37–38.

Schienke, E. W., N. Tuana, D. A. Brown, et al. 2009. The role of the NSF broader impacts criterion in enhancing research ethics pedagogy. Social Epistemology 23(3–4): 317–336.

Strous, R. 2010. Response to a neuroskeptic: Neuroscientific endeavor versus clinical boundary violation. AJOB Neuroscience 1(2): 24–26.

Thomsen, K. 2010. A Foucauldian analysis of "A neuroskeptic's guide to neuroethics and national security." AJOB Neuroscience 1(2): 29–30.


    Copyright of AJOB Neuroscience is the property of Taylor & Francis Ltd and its content may not be copied or

    emailed to multiple sites or posted to a listserv without the copyright holder's express written permission.

    However, users may print, download, or email articles for individual use.