ORIGINAL ARTICLE
Brain–computer interfaces and dualism: a problem of brain,mind, and body
Joseph Lee
Received: 28 January 2013 / Accepted: 23 May 2014
© Springer-Verlag London 2014
Abstract The brain–computer interface (BCI) has made
remarkable progress in bridging the divide between the
brain and the external environment to assist persons with
severe disabilities caused by brain impairments. There is
also continuing philosophical interest in BCIs which
emerges from thoughtful reflection on computers,
machines, and artificial intelligence. This article seeks to
apply BCI perspectives to examine, challenge, and work
towards a possible resolution to a persistent problem in the
mind–body relationship, namely dualism. The original
humanitarian goals of BCIs and the technological inven-
tiveness result in BCIs being surprisingly useful. We begin
from the neurologically impaired person, the problems
encountered, and some pioneering responses from com-
puters and machines. Secondly, the interface of mind and
brain is explored via two points of clarification: direct and
indirect BCIs, and the nature of thoughts. Thirdly, dualism
is beset by mind–body interaction difficulties and is further
questioned by the phenomena of intentions, interactions,
and technology. Fourthly, animal minds and robots are
explored in BCI settings again with relevance for dualism.
After a brief look at other BCIs, we conclude by outlining a future BCI philosophy of brain and mind, which might appear ominous yet could be possible.
Keywords Brain–computer interface (BCI) · Dualism · Intentions · Interactions · Brain · Mind
1 Introduction
The brain–computer interface (BCI) has advanced appre-
ciably (Lin et al. 2010; Nicolas-Alonso and Gomez-Gil
2012) with the goal to build direct functional interfaces
between the brain and artificial devices such as robotic
limbs and computers to assist severely handicapped
patients (Lebedev and Nicolelis 2006). There is continuing
philosophical interest in BCIs (de Kamps 2012; Kyselo
2013), known also as BMIs or brain–machine interfaces
(Lee et al. 2013).1
At the same time, AI questions have strong affinity with
philosophy of mind (Abramson 2011). One seemingly
intractable issue is the mind–body problem, which colours much of the discourse, particularly due to conscious mental phenomena (Nagel 1974; McGinn 1989) and the task of "finding a place for the mind in a world that is fundamentally and essentially physical" (Kim 1998, p. 5), a discourse increasingly focussed on the brain (Dumit 2004; Weisberg et al. 2008).
This article seeks to apply BCI perspectives to investi-
gate, challenge, and show a path to resolving the persistent
problem of the mind–body relationship while being attentive to the human dimensions. It is both the technical ingenuity and the humanitarian purposes of BCIs that make them surprisingly helpful.

J. Lee (✉)
Flinders University, G.P.O. Box 2100, Adelaide, SA 5032, Australia
e-mail: [email protected]

1 If the neural signals proceed to a machine such as a robot and not to a computer, the term BMI was used (Donoghue 2008). The terms are nowadays interchangeable. Other terms are neural interface systems (Hatsopoulos and Donoghue 2009), and neuroprosthetics, which uses neural interface systems to control robotic limbs to perform three-dimensional movements (Hochberg et al. 2012; Kwok 2013). All BCI systems require some type of training: learned voluntary control or cognitive voluntary modulation (Birbaumer and Cohen 2007). In this article, we use BCI for convenience.

AI & Soc
DOI 10.1007/s00146-014-0545-8

We begin from the neurologically impaired person, the problems encountered,
and some ground-breaking responses from computers and
machines. Secondly, the interface of mind and brain is
explored through two preliminary clarifications: direct and
indirect BCIs, and the nature of thoughts. In philosophy of
mind debates, dualism is cornered by issues of mind–body
interaction. These are addressed in the third section along
with the phenomena of intentions. Fourthly, animal minds
and robots are considered in association with BCIs; these
also pose questions for dualism.
BCI studies also use healthy subjects (Tan et al. 2014), a
feature in commercial applications in nondisabled settings,
e.g. games (Gurkok et al. 2013). Although most of our
discussions focus on BCIs with rehabilitation and commu-
nication purposes, the fifth section is a brief look at other BCI uses (Morris 2004) as they apply to mind–body matters.
2 The impaired person, a problem, and computer–
machine responses
Locked-in syndrome (LIS) is characterised by anarthria, or lack of voluntary speech, and quadriplegia, or the inability
to move limbs against gravity (Haig et al. 1987). Con-
sciousness and vertical eye movement are preserved. Pupil
size responses can be used to communicate with LIS
patients, who typically have severe motor impairment due
to brainstem stroke aetiology (Stoll et al. 2013). Other
examples of paralysis due to neurological disorders include
amyotrophic lateral sclerosis (ALS), where there can be
loss of all communication channels including eye move-
ments (De Massari et al. 2013).
Laureys et al. (2005) point out that novelists knew about
the locked-in condition before the medical community, e.g.
Émile Zola's 1868 novel Thérèse Raquin is about a pa-
ralysed woman who ‘‘was buried alive in a dead body’’ and
‘‘had language only in her eyes’’ (p. 497). LIS patients are
acutely noncommunicative, being in ‘‘the terrifying situa-
tion of an intact awareness in a sensitive being, experi-
encing frustration, stress and anguish, locked in an
immobile body’’ (p. 505). These people face a real life
problem of mind and body. Subjects, even those locked-in,
may live a long life, but with personal, social, and eco-
nomic burdens of their disabilities (Wolpaw et al. 2002).
2.1 Problems of mind and body
While pondering these realities, we can draw a parallel with a classic puzzle in philosophy of mind, the mind–body
problem. It sometimes means the traditional problem of
Cartesian or substance dualism: the divide between body
and mind. Another position is ‘‘radical monism’’ where
mental processes are identical to bodily processes
(Sartenaer 2013). Yet it can also refer to the ‘‘difficulty’’
that any materialist, dualist, or idealist philosophy meets in
explaining the nature of mind and its relationship to the
body (Kim 2008). "Anyone who has philosophical interest in the relationship between mind and body can be said to have the 'mind–body problem'" (p. 439). Presently, the answer as to whether minds are physical is generally understood to favour physicalism; the question is, given that all substances are physical, whether mental properties are reducible to physical properties (Schneider 2013).
It is said, ‘‘All dying people are Cartesian dualists’’
(Hustvedt 2013). Illness exposes almost everyone to a mind/body split, where a seemingly independent internal
narrator becomes, ‘‘a floating commentator on the goings-
on, while the symptoms of disease wreak havoc on the poor
mortal body. Subjective experience often includes a self
that observes illness, even though the very idea of the self
remains a philosophical and scientific conundrum’’ (p.
169). These circumstances especially confront patients
with neurodegenerative diseases (Anonymous 2013).2
The philosophical mind–body problem has resonances
with the actual problems faced by LIS patients. LIS provides a new context, namely disability and assistive technology for human persons, in which to analyse problems in philosophy of mind.
2.2 Computer–machine responses
Despite extreme motor handicaps, patients with LIS can
find life worth living, with the necessary support and
communication devices (Lule et al. 2009). BCIs present challenges to dualism and uncover issues about the status and relationship of mind and brain.
The technology exploiting event-related potentials in the brain has been used to select letters (Farwell and Donchin 1988), to select icons or control cursor movements (Wolpaw et al. 2000), and to select words (Sellers and Donchin 2006). These BCIs
are noninvasive and produce a few letters per minute, a rate which experts regard as slow but satisfactory for present users, while invasive intracranial and intracortical
methods continue their development (Brumberg and
Guenther 2010).
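As a rough illustration (a toy sketch with invented numbers, not any published speller's algorithm), the selection logic behind such event-related potential spellers can be expressed as: flash rows and columns of a letter grid, average the time-locked brain responses, and pick the letter at the intersection of the strongest row and column.

```python
# Toy illustration of ERP-based letter selection (hypothetical data).
# A real speller flashes rows/columns of a letter grid and averages many
# EEG epochs; the row and column whose flashes evoke the largest
# event-related response identify the attended letter.

GRID = [
    "ABCDEF",
    "GHIJKL",
    "MNOPQR",
    "STUVWX",
    "YZ1234",
    "56789_",
]

def select_letter(row_scores, col_scores):
    """Pick the letter at the intersection of the strongest row and column.

    row_scores / col_scores: mean event-related response amplitude for
    each row and column, averaged over repeated flashes.
    """
    r = max(range(len(row_scores)), key=lambda i: row_scores[i])
    c = max(range(len(col_scores)), key=lambda j: col_scores[j])
    return GRID[r][c]

# Hypothetical averaged responses: row 1 and column 2 stand out.
rows = [0.8, 4.2, 1.1, 0.9, 1.0, 0.7]
cols = [1.2, 0.9, 3.9, 1.0, 0.8, 1.1]
print(select_letter(rows, cols))  # -> I
```

The few-letters-per-minute rate noted above comes from the many repeated flashes needed before the averaged responses separate reliably from noise.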
Other BCI applications include technologies for controlling a wheelchair (Chai et al. 2012), commanding a humanoid robot to perform tasks, e.g. fetching an object
(Bell et al. 2008), a BCI mouse-based Internet web browser which emulates a computer mouse (Yu et al. 2012), a BCI for playing Hangman (Hasan and Gan 2012), and "brain painting" via BCI based on user-centred design (Zickler et al. 2013). Freeing persons with locked-in minds to be expressively artistic may even help to answer questions about the nature of computer art (Lopes 2010).

2 The anonymous author anticipates the question: "So why am I writing this piece anonymously? Because I don't want to be known to the scientific community as 'Parkinson's guy' before I am known as a scientist" (p. 30). The article notes that the author is a neuroscience professor at a major university in the USA and that he blogs at parklifensci.blogspot.com and tweets at @Parklifensci.
More practically, control of an electrically driven hand
orthosis to restore hand function is possible via BCI too
(Ortner et al. 2011). The technology combines with neuro-
prostheses which use functional electrical stimulation (FES)
to restore hand, finger, and elbow function for users with
high-level spinal cord injury (Rohm et al. 2013). However, a
review of rehabilitation of gait after stroke found electro-
encephalography (EEG)-based BCIs limited to the rehabil-
itation of upper limbs, particularly hand movements.
Similarly, only a few studies demonstrated a real effect of
BCI usage on motor recovery (Belda-Lois et al. 2011).
Besides rehabilitation applications, there are diagnostic
uses of BCI. In patients with disorders of consciousness,
BCIs can be a means of detecting consciousness, e.g. in a
minimally conscious state (MCS) and vegetative state/
unresponsive wakefulness syndrome by detecting response
to communications (Lule et al. 2013). Before investigating
brain, mind, and body from BCI perspectives, two impor-
tant issues need clarification.
3 Direct and indirect BCIs
It is worth analysing the two types of BCIs: direct, where brain contact is achieved by invasive means, and indirect, using noninvasive measurement. While functionally they both
achieve desired outcomes for the user, the distinction has
implications for thinking about brain, body, and mind.
Most human applications of BCI, such as EEG, are indirect and noninvasive; that is, they do not measure brain activity through physical brain contact.
But there are human applications, e.g. in silent speech
communication, where devices use direct neural signals,
particularly those with intracortical electrodes (Brumberg
et al. 2010). These require invasive neurosurgery: crani-
otomy and placing a recording electrode on the surface of
the cerebral cortex for electrocorticography (ECoG) or an
extracellular microelectrode into the cerebral cortex for
single unit activity and local field potentials.
One way to express the difference is that BCIs are either "in the brain" (direct) or "outside the brain" (indirect).3 Indirect BCIs record signals from the brain, but these are detected through the skull and scalp. Thus, there are extra layers: physically, the tissue and bone above the cerebral cortex; and interpretively, a layer of translation needed to extract meaning. Pribram (1998)
appears to acknowledge direct and indirect methods when
he sees a miracle where meaning can be garnered from
recording of brain electrical activity. ‘‘Imagine what you
might learn from placing electrodes on top of a computer to
determine which program is in operation (or even whether
the program is in hexadecimal, ASCII, or C++). Or, take a
single wire and stick it into the guts of the computer (and
hope you won’t short anything out) to find out in machine
language what is going on’’ (pp. 223–224).
According to Engel et al. (2005), invasive recordings are
indispensable, offering the only access to the human brain
at cellular resolution, even providing single-cell correlates
of subjective experience, thus encroaching on previously
exclusive domains for philosophy and the humanities.
Such direct measurement BCIs are more commonly
used in animal research as we shall see, and indirect
methods such as EEG are widespread in human BCIs. In
humans, when the recordings are direct, e.g. intracortical
electrodes, such data from the brain present compelling
evidence for the centrality of the brain, and its undeniable
biological and physicalist grounds for: the contents of
mind, phenomenal consciousness, motor and other inten-
tions, the thoughts of a person, and goals achievable via
BCI outputs.
Perhaps a comparable though imperfect analogy is the difference between a hearing-impaired person with a cochlear implant who watches a film, hears speech sounds, and is able to understand the meaning, as opposed to that
same person reading visual descriptions and subtitles of the
footage, or seeing someone communicating word pictures
via sign language. All three methods convey meaning, but the cochlear implant is the most direct means to hearing real-
time spoken words, since the electrode array is inserted
directly into the cochlea via invasive surgery. However,
reading text on a screen and interpreting sign language are one or two steps removed from the speaker. Equally, just as
signal processors convert acoustic vibrations such as
speech into electrical stimuli to the auditory nerve (Rubinstein 2004), so too electrical signals taken directly from the brain are converted by a BCI into meaningful action, or indirectly through EEG-BCI methods.
Hence, direct BCIs eliminate any risk that brain signals
may be ‘‘lost in translation’’, which is likely with indirect
methods. Direct contact represents a higher standard for
what can be inferred. It would be interesting to further test conclusions drawn from EEG-BCI data by designing
(where ethically possible) a parallel study involving intra-
cortical brain recordings, most often during neurosurgery
when the cranial vault is open. All things considered, BCIs are quite devastating to claims that mind and body are two different, noninteracting substances, i.e. substance dualism.

3 These interesting comparative titles were suggested by an anonymous reviewer, who likens the situation to "being in the mind of someone" and "trying to infer what is happening in the mind". We return to this in the conclusions.
4 Mind–brain, and brain–computer interfaces
It can be asked, ‘‘What is mind? No matter. What is matter?
Never mind’’ (Edelman 1992, p. 3). Alternatively, there is a
reductive drive in scientific research that recognises the brain
in preference to the mind. Some portray the mind as essen-
tially the physiochemical brain functions which explain
emotions and the great works of humankind, thus "glands secrete, stomachs digest, brains mind" (Kron 2012, p. 219).
This mind–brain identity theory is a continuing position
(Aranyosi 2011; Kaitaro 2004; Rockwell 2007). Smart
(1963) thinks a tenable philosophy of mind should be
compatible with materialism, because how could a non-
physical property or entity suddenly emerge during evo-
lution? ‘‘No enzyme can catalyse the production of a
spook!’’ (p. 660).
While the mind–body problem is generally viewed as one-directional (how the brain produces conscious mental states), scientific medicine nevertheless recognises the role of beliefs in curing illness, e.g. the placebo effect: mental states can affect bodily functioning (Kihlstrom
2008). Likewise, the phenomena of emotional self-regula-
tion, psychotherapy, and subjective intentional content of
mental processes, such as beliefs, feelings, and volition, can all markedly influence brain function and plasticity
(Beauregard 2007).
Rather than mind–body, the mind–brain problem
(Schimmel 2001) presumes a mind–brain correlation and
some form of identity theory. But the problem is that science
probably cannot furnish an acceptable account of conscious-
ness using neuroscience terminology; that is, ‘‘there does not
seem to be a way. Any attempt at explanation inevitably ends
up leaving out the mind’’ (p. 485).
Amidst BCI research, there is the notion of thought-
controlled and thought-based technologies (Pfurtscheller
et al. 2003). Some scholars emphasise that physical
dimensions of the brain ought to complement the mental
dimensions (Fingelkurts et al. 2010). In the pioneering days of BCIs, there was the concept of a thought translation device
to describe a BCI using slow cortical potentials of patients
with ALS and total motor paralysis (Kubler et al. 1999;
Scherer and Pfurtscheller 2013). The BMI/BCI is intended
‘‘to translate ‘thought into action’ with brain activity only’’
(Birbaumer 2006, p. 529).
However, whilst a BCI enables a user to trigger actions
by ‘‘thoughts’’, Scherer et al. (2013) observe that BCIs are
‘‘not thought-reading devices or systems able to literally
translate arbitrary cognitive activities. On the contrary,
only well characterised a priori defined brain activity
patterns can be detected" (p. 317). Their view seems to imply that thoughts are more distant from material reach, perhaps in the sense that a BCI is not a clairvoyant-like device that rapidly deduces people's thoughts. Rather, it entails supervised machine learning and pattern recognition methods.
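A minimal sketch of what such supervised pattern recognition amounts to (synthetic feature vectors and invented class labels, not any real BCI's classifier): the system can only ever output one of the labels it was trained on, which is why arbitrary cognitive activity cannot be literally translated.

```python
# Minimal nearest-centroid classifier: a stand-in for the supervised
# pattern recognition used in BCIs. Only classes seen in training
# (a priori defined patterns) can ever be output; "thoughts" outside
# the trained set are invisible to the system.

def train(labelled_features):
    """labelled_features: list of (label, feature_vector) pairs."""
    sums, counts = {}, {}
    for label, vec in labelled_features:
        acc = sums.setdefault(label, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    # Centroid = mean feature vector per label.
    return {lbl: [v / counts[lbl] for v in acc] for lbl, acc in sums.items()}

def classify(centroids, vec):
    """Return the trained label whose centroid is nearest to vec."""
    def dist(lbl):
        return sum((a - b) ** 2 for a, b in zip(centroids[lbl], vec))
    return min(centroids, key=dist)

# Synthetic "band-power" features for two imagined movements.
training = [
    ("imagine_left_hand",  [0.9, 0.1]),
    ("imagine_left_hand",  [0.8, 0.2]),
    ("imagine_right_hand", [0.1, 0.9]),
    ("imagine_right_hand", [0.2, 0.8]),
]
model = train(training)
print(classify(model, [0.85, 0.15]))  # -> imagine_left_hand
```

Whatever signal the user produces, the classifier's answer is drawn from the predefined label set; this is the inductive, trained character of "thought translation" discussed above.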
Are there universal thoughts which can simply be deduced from all brains? Scherer et al. appear to adopt a
narrow interpretation of the mind–body problem, relying
on translation rules and inductive processes.
Those against BCIs as "thought-readers" could cite the Chinese room thought experiment of Searle (1980), who argues against strong AI: a person understands not a word of Chinese (semantics) but follows instructions to match up various inputs (syntax). Likewise, the programs used to translate thoughts into action need to be conceptualised, coded, and validated; thus, inductive reasoning is needed. Frequently, the user requires training
with the BCI. Therefore, it is not pure logical deduction. It
is like the hearing-impaired person who can lipread but
finds diverse lip patterns, accents, etc. It has a human
subjective component.
For instance, with completely locked-in states (CLIS),
e.g. due to ALS, any remaining observable controllable
muscles like eye muscles also fail (Murguialday et al.
2011). BCI communications appear as the only means to
prevent the ‘‘extinction of thought’’. But auditory and
proprioceptive systems still manifest brain responses which
inform recommended design of BCI platforms for LIS and
CLIS patients (p. 932). Its status is not deductive, or else there would be off-the-shelf algorithms for thought translation.
A mediating position is that in the BCI environment, it is
the human subject, locked in the body, who retains mental
processes, especially volitional ones. It can be further
clarified that the brain in BCI is neuropsychologically
linked with mind, memory, emotion, and reasoning. The
brain is not a thought, yet it is necessary for thoughts and
the operation of BCIs, and for minds to exist. BCIs elucidate the nature of thought in a way perhaps more nuanced than the pioneers had envisaged: thought translation really means
to take the thinking of a person and convert that into action,
via a process of translation which involves complex ana-
lysis of recorded brain signals. Yet these are physical, and
therefore, thoughts can be ‘‘read’’ as it were. Now to bring
BCI insights to the issue of dualism.
5 Technological challenges to dualism: interaction
and intentions
The mind–body problem is accentuated for dualism, in
short, ‘‘how can you move your bodily part by willing that
you move it, given that moving a bodily part is a physical
event while thinking that you move a bodily part is a
mental event?’’ (Nagasawa 2012, p. 357). This section
reinforces the physical realities which undermine dualist ideas.
5.1 Interaction of mind, brain, and body
This ‘‘interaction problem’’ is held as the most troubling
for dualism, since wholly nonspatial mental events could
not possibly cause physical motion like billiard balls cause
physical motion (Lycan 2009). Sometimes the interaction seems more obvious, e.g. in pain (Campbell and Edwards 2009). Yet
physicians tend to view a problem of the mind, with no
physiological correlates, as less real compared with organic
or bodily symptoms (Kendler and Campbell 2009). This is
regarded as a false dichotomy since mental illness, like most illnesses, is not split between the body (material) and mind (immaterial), but rather is essentially bio-psychoso-
cial (Ungar and Knaak 2013).
BCIs clearly exemplify the physical basis of interactions. The hardware of a BCI records brain signals invasively or noninvasively (Brunner et al. 2011). The
software then translates the signals from the brain into
output commands for a device and generates feedback to
the user. Interaction is the basis of BCI functioning; it challenges dualism by showing intentions to be executable and now reachable by BCIs; that is, there is real brain/mind/body interaction and vast technological support for materialism.
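The record, translate, command, and feedback cycle just described can be sketched as a closed loop. This is a hypothetical sketch: the threshold decoder and all names here are invented stand-ins, not the components of any particular BCI system.

```python
# Sketch of the BCI closed loop described above: hardware records
# brain signals, software translates them into device commands, and
# feedback returns to the user. The decoder is a hypothetical
# threshold rule standing in for real signal processing.

def acquire_signal(source):
    """Stand-in for the recording hardware (invasive or noninvasive)."""
    return next(source)

def translate(sample, threshold=0.5):
    """Decode one recorded sample into a device command."""
    return "MOVE" if sample > threshold else "REST"

def run_bci(source, device_log, steps):
    feedback = []
    for _ in range(steps):
        sample = acquire_signal(source)   # record brain activity
        command = translate(sample)       # translate signal to command
        device_log.append(command)        # actuate the output device
        feedback.append(command)          # feedback returned to the user
    return feedback

signals = iter([0.2, 0.7, 0.9, 0.1])
device_log = []
print(run_bci(signals, device_log, 4))  # -> ['REST', 'MOVE', 'MOVE', 'REST']
```

Every arrow in the loop is a physical transaction (electrical recording, computation, actuation), which is the point being made against noninteracting substances.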
5.2 Intentions
Take the human body, without BCIs: to make a voluntary movement, the motor system needs to convert a desired goal, such as drinking coffee, into a plan of action, such as reaching for a coffee cup, and finally into the spinal motoneuron activity
that generates the necessary muscle contractions (Green
and Kalaska 2011). This entails mechanisms, such as feed-
forward, which is a predictive neural process, to produce
control signals that drive the arm to a desired state. The
processes are performed by neurons distributed throughout the supraspinal motor system, converting the goal into a motor command which is then transformed by spinal cord circuits into muscle activity (pp. 61–62).
McFarland (2008) sees intentionality as a property of mind which is directed at, or is about, objects and states of affairs in the world, e.g. beliefs, desires, and intentions. Human goal-directed behaviour is also centred on intention, a mental state associated somehow with phenomena like agency, decision, belief, and desire (Thinnes-Elker et al. 2012).
An enduring dualist tenet is that mental properties are
irreducible and not physical. Intentions are traditionally a property of mind and seemingly different in nature from bodies. Turning to BCIs, Gurkok and Nijholt (2012) state
that ‘‘computers cannot read our minds, but BCIs can infer
our mental/emotional states and intentions by interpreting
our brain signals’’ (p. 292). This view, like that of Scherer
et al. (2013) discussed above, is questionable, if we con-
sider mental states and intentions as properties of mind. It
can be confidently asserted, contrary to Gurkok and Nij-
holt, that BCI inferences involve ‘‘reading’’ the mind
through ‘‘interpreting our brain signals.’’
These authors do not appear to recognise the mind–body problem, or they overlook it entirely. It reads as if by minds Gurkok and Nijholt mean something akin to secret inner thoughts, or perhaps the unconscious mind in the Freudian sense. They then speak about "intentions" and "mental/emotional states", and also acknowledge applications such as BCI spellers and restoration of mobility.
With BCIs, intentions can be inferred deductively insofar as the mind is biologically associated with the brain, what is read comes from that same mind, and it is realised digitally, algorithmically, and mechanically. Moreover,
there is inductive reasoning needed in how the signals are
decoded and applied. Leaving that aside, all should unite in
a move against substance dualism’s separation of mind and
body.
5.3 Anticipating intentions and BCIs
Another factor which questions dualism is the aim to
anticipate intentions. BCIs have predicted movement intentions (Niazi et al. 2012), detected from the brain's cortical potentials generated during motor imagery of ankle flexing, and used them to trigger corresponding interventions in real time with electrical stimulation. This raises the pos-
sibility that peripheral stimulation together with patient
rehabilitative treatment could result in better behavioural
outcomes. As well, in stroke patients, an algorithm has
reliably detected an individual’s intent of generating a
shoulder or elbow motor task and therefore may offer a
reliable control for neural prostheses (Zhou et al. 2009).
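The detection step underlying such work can be sketched as follows (signal values and the threshold here are invented for illustration, and real systems use far more sophisticated processing): intention to move appears as a slow negative cortical deflection, and a sustained threshold crossing triggers the stimulation.

```python
# Toy real-time detector for a movement-related cortical potential:
# intention to move appears as a growing negative deflection, and a
# sustained threshold crossing triggers peripheral electrical
# stimulation. All values here are illustrative only.

def detect_intention(stream, threshold=-4.0, window=3):
    """Trigger when the moving average of the last `window` samples
    falls below `threshold` (a sustained negative deflection).
    Returns the sample index at which stimulation would be triggered,
    or None if no intention is detected."""
    buf = []
    for t, sample in enumerate(stream):
        buf.append(sample)
        if len(buf) > window:
            buf.pop(0)
        if len(buf) == window and sum(buf) / window < threshold:
            return t
    return None

# Baseline noise, then a negative deflection as movement is imagined.
eeg = [0.3, -0.5, 0.1, -2.0, -4.5, -6.0, -7.5]
print(detect_intention(eeg))  # -> 5
```

The moving-average window is what makes the trigger robust to momentary noise: a single negative spike does not fire it, but a sustained deflection does.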
Poel et al. (2012) propose changing the concept of BCI
as an actor (input control) to BCI as an intelligent sensor
(monitor), designed to represent spontaneous changes in
users to bring about intelligent adaptations. They become
sensors which read passive signals from the nervous system
with no intentional altering of brain activity. The inferred
states of the user are adapted to human–computer interac-
tion, human–robot interaction, or human–human interac-
tion (p. 379).
Intentions to move are expressed and executed effort-
lessly for able-bodied persons; for the disabled, brain
information is translatable by BCIs and transformable into
assistive actions. Motor commands from the brain are
extracted and converted into instructions for a mechanical
actuator, e.g. robotic manipulator (Nicolelis and Lebedev
2009). Whereas the normal response to intentions is neu-
romuscular, in BCIs, the computer, neuroprosthesis, or
wheelchair would respond. The originating intention is
normally a property of mind, not of the brain alone. The status of BCIs is evident when the process is broken down into steps.
Inferences can be drawn, which are more intelligent than
those elicited from biometric identity technology (Al-
Hudhud et al. 2014) such as fingerprint recognition and
retina scanning.
Yet others recognise that there are fundamental ques-
tions raised about the nature of human intentions (Baldwin
and Baird 2001); intention is not a simple concept
(Mazzone 2011). The LIS/BCI user can be understood as a
subject, an author of actions, an agent, and a controller,
which extends even to legal and moral responsibility
(Grubler 2011; Haselager 2013). It reaches deep into the
mind, to anticipate intentions to move. Perhaps in future,
there will be BCI applications for other cognitive functions.
6 Animals, robots, intentions
Other animals apart from humans have been successfully
used in BCI development, along with robots. But their
obvious experimental and clinical value has other impli-
cations which merit some discussion. Here are some salient
aspects, again with dualism in view.
6.1 Animals and goal-directed behaviour
The philosophical notion of ‘‘intentionality’’ according to
Wellman et al. (2009) includes goal-directed action, but
also a distinguishing type of subjective orientation of
beings to the world, like intentional experience. Intention
understanding emerges early in human development; yet
overlapping intention understandings, incorporating agents as intentional actors and intentional experiencers, are found in primates in more limited ways.
Monkeys implanted directly with microelectrodes in their brains were trained to use brain signals to control a robotic arm to feed themselves (Velliste 2008). The monkeys
underwent training to operate the robotic arm using a
joystick. Nonhuman primates have also learned to use a
brain-controlled cursor or device (Williams et al. 2013).
For example, in a computer screen task, macaque monkeys
over several days successfully discovered how to control a
cursor by modulating brain signals (Taylor et al. 2002).
Rats were trained by Chapin et al. (1999) to obtain water
by pressing down on a spring-loaded lever to proportionally
move a robot arm to a water dropper. When the lever was released, the robot arm carried the water drop back to the rest position near the rat; thus, water was transferred to the rat's mouth. Next, the rats were
surgically implanted with recording electrodes in their
brains, another direct contact method. These rats’ brain
signals were then used to position the robot arm and obtain
water.
Zhang et al. (2011) used synchronous recording and
analysing systems to extract rat brain activities in primary
motor cortex and translate them into control signals. They
describe this, ‘‘so rat [sic.] can implement its intention to
control external robotic lever directly for water rewards’’
(p. 886). Something motivates animals to initiate goal-directed behaviours; that initiation could be termed an intention.
In such neural control of motor prosthetics in animals, it can be seen that the neural signals in the human motor cortex are akin to those of nonhuman primates (Scherberger 2009).
This is significant because it demonstrates that electro-
physiological principles of motor control and decoding in
nonhuman primates are transferable to humans (p. 629).
Animal goal-directed behaviour can be modelled using
BCIs based on common physical laws and mechanisms.
The natural basis for human intentions is therefore bio-
logical evolution.
Associated concepts include being goal-directed,
achieving desired outcomes, planning, and implementation
(Papies et al. 2009). There are studies which do not involve mental terminology. Neuroscientific research using animals and humans shows that brain mechanisms known as
mirror neurons are the basis for understanding the inten-
tions of other people through their actions, even in context
(Nakahara and Miyashita 2005). Also, some aspects of
action, which usually rely on intentions, e.g. action plan-
ning and control, can be explained by neural processes that
do not depend on the characteristics of intentions (Uithol
et al. 2014). Similarly, Gollwitzer (1993) identifies three
traditional perspectives on intentions: as acts of willing, as
needs, and as best predictors of behaviour. Conceivably,
the notion of needs can be most readily applied to animals.
Thus, even if not called intentions, physical connections
still exist.
It could be asked: Do direct contact BCIs mean clearer intentions, even though they need training? In the case of
animals and direct brain contact, it is shown that simpler
biological goals such as feeding can be achieved. In
humans, the indirect method is most popular and ethical.
While the interpretation is filtered through another layer,
the intentions can be derived from the BCI application.
Where communication is possible with LIS patients, e.g. a
speller, then self-reports confirm the intentions expressed
in actions. BCIs also demystify the difference of mind and
body. Furthermore, the stronger inferences from direct
BCIs in animal research can arguably be transferred to humans because of deep evolutionary links and experimental data revealing how neural activity in the human motor cortex is similar to that of nonhuman primates. All this would be rather impossible if dualism were true.
6.2 BCI robots
The operation of robots and BCIs adds another strand. The
dream of traditional AI is to build a conscious robot who
says in the first-person perspective "I, robot", also the name of a science-fiction movie (Coeckelbergh 2011). There is
cyborg intentionality which extends beyond human inten-
tionality expressed ‘‘through’’ technology (Verbeek 2008).
But it is not menacing AI like the computer system Hal in
the film 2001: A Space Odyssey (Durkin 2003).
Nonetheless, here the idea is not robot intentionality but the robot as a responder to human signals. Robots have been vari-
ously defined and designed, e.g. an independent sensori-
motor system with performance capacities (Harnad and
Scherzer 2008); a ‘‘machine that senses, thinks, and acts’’
(Bekey 2005, p. 2); companions or caregivers (Pearson and
Borenstein 2013). Humanoid robots are mechanical-looking
robots, whereas androids are humanlike robots (MacDorman
and Ishiguro 2006).
Robots used in BCIs can be regarded as means for
action, rather than machine subjects with intentions. Some
examples are noninvasive EEG-BCIs where users can
control a robotic arm or robot end effector (Ianez et al.
2010); an interface enabling the user to select arbitrary
words by thought, send them to a remote robotic arm
through the Internet, and have the robot write the word on a
whiteboard in real time (Perez-Marcos et al. 2011). For
invasive ECoG-based BMI systems, there are implantable
wireless systems that give voluntary control over the opening
and grasping of a robot hand (Hirata et al. 2012).
However, between being a means for action and being a
machine subject with its own intentions, there are examples
where robots used for BCI are more than mere instruments
but not machine subjects.4 One situation is where
there is feedback but more ‘‘agency’’ in the BCI. Current
BCIs have difficulties, e.g. long prior training times, which are
being addressed by new methods that take account of
context and subject specificities, e.g. adaptive detection of
SSVEPs (Fernandez-Vargas 2013). An assisted closed-
loop protocol can increase BCI efficiency by giving both
the system and the subject online information, which helps
them to achieve the BCI goal in their interaction.
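The core SSVEP detection step in such a protocol can be illustrated with a minimal sketch (a synthetic signal and the hypothetical functions `ssvep_score` and `detect_target`, not the adaptive, assisted loop of Fernandez-Vargas's implementation): the EEG channel is correlated against sine and cosine references at each candidate flicker frequency, and the best-scoring frequency is taken as the user's selection.

```python
import numpy as np

def ssvep_score(eeg, freq, fs):
    """Correlate a single EEG channel with sine/cosine references at freq."""
    t = np.arange(len(eeg)) / fs
    refs = [np.sin(2 * np.pi * freq * t), np.cos(2 * np.pi * freq * t)]
    # Combine correlations with both phases so the score is phase-invariant.
    return np.sqrt(sum(np.corrcoef(eeg, r)[0, 1] ** 2 for r in refs))

def detect_target(eeg, candidate_freqs, fs):
    """Return the stimulus frequency whose references best match the EEG."""
    scores = {f: ssvep_score(eeg, f, fs) for f in candidate_freqs}
    return max(scores, key=scores.get)

# Synthetic demo: a 10 Hz flicker response buried in noise.
fs = 250
t = np.arange(fs * 2) / fs
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(len(t))
print(detect_target(eeg, [8, 10, 12], fs))  # 10
```

An adaptive variant would additionally tune the decision threshold or window length online from the closed-loop feedback described above.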
In a bolder direction, Sanchez et al. (2009) speak of a
new relationship between humans and machines, a bidi-
rectional bond between tools and users. The feature lacking
in robotics is a paradigm for co-adaptation with humans.
Using rats, Sanchez et al. designed a co-adaptive brain–
machine interface (CABMI) experimental model. The rat
must manoeuvre a five degree-of-freedom robotic arm
using visual feedback to reach a set target and gain a water
reward. The experimental paradigm is aligned with the task
of a paralysed patient using a prosthetic for reaching motor
control. The paradigm involves the computational agent
and the rat knowing the goals in the environment: the agent
through programming, the rat through training. But each
must co-adapt. There is also BMI architecture where neural
interfaces adapt to new environments and use reinforce-
ment learning (DiGiovanna et al. 2009). This involves
machine learning in which the system discovers, by trial and
error, which actions yield the most reward.
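The trial-and-error reward learning described here can be sketched in miniature (a toy one-dimensional cursor task, not DiGiovanna et al.'s architecture): a tabular Q-learner discovers, purely from reward feedback, that moving right reaches the target.

```python
import random

def train(positions=5, target=4, episodes=300, alpha=0.5, gamma=0.9, eps=0.3):
    """Tabular Q-learning on a toy 1-D cursor task: actions move the cursor
    left (-1) or right (+1); reward is given only on reaching the target."""
    q = {(s, a): 0.0 for s in range(positions) for a in (-1, +1)}
    rng = random.Random(0)
    for _ in range(episodes):
        s = rng.randrange(positions - 1)  # random start short of the target
        for _ in range(20):  # step limit per episode
            if rng.random() < eps:
                a = rng.choice((-1, +1))  # explore
            else:
                a = max((-1, +1), key=lambda act: q[(s, act)])  # exploit
            s2 = min(max(s + a, 0), positions - 1)
            r = 1.0 if s2 == target else 0.0
            q[(s, a)] += alpha * (r + gamma * max(q[(s2, -1)], q[(s2, +1)]) - q[(s, a)])
            s = s2
            if r:
                break  # trial ends once the reward is obtained
    return q

q = train()
# Greedy policy learned by trial and error for the states short of the target.
policy = {s: max((-1, +1), key=lambda a: q[(s, a)]) for s in range(4)}
print(policy)
```

In an actual BMI the state would be decoded from neural activity and the reward would come from task success (e.g. the water reward above), but the update rule is the same.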
In contrast, using monkey research, Fan et al. (2014)
point to another method, biomimetic decoders, which use
algorithms that mimic the neural-to-kinematic biological
mapping very closely, so that the need for behavioural
learning and adaptation is minimised, e.g. in native arm
movements. Moreover, the neural-to-kinematic description
or mapping does not vary greatly from experimental
observations of neural-to-kinematic relationships in the
pertinent workspaces. But Fan et al. contend that by
ongoing use and understanding of biomimetic and adaptive
strategies, BMI performance ought to improve towards
clinical viability.
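The idea of a decoder fitted to an observed neural-to-kinematic mapping can be sketched with a simple linear model on synthetic data (real biomimetic decoders are considerably more elaborate): the mapping is estimated once from "native movement" trials and then reused without behavioural re-learning.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "native arm movement" data: 200 time bins, 20 neurons,
# 2-D hand velocity generated by a fixed linear neural-to-kinematic map.
true_map = rng.standard_normal((20, 2))
rates = rng.poisson(5.0, size=(200, 20)).astype(float)
velocity = rates @ true_map + 0.1 * rng.standard_normal((200, 2))

# Biomimetic-style decoder: estimate the mapping once, by least squares,
# so no behavioural re-learning is needed at decode time.
decoder, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# Apply the fixed decoder to new activity from the same population.
new_rates = rng.poisson(5.0, size=(50, 20)).astype(float)
predicted = new_rates @ decoder
print(np.allclose(predicted, new_rates @ true_map, atol=0.5))  # True
```

The contrast with the co-adaptive approach above is that here the mapping is frozen after fitting; adaptation, if any, happens on the algorithm's side rather than through subject learning.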
All this shows the physical connections between robots
and BCIs; the co-adaptation and biomimetic models
underpin this. If an animal can thus interact and co-adapt
with the machine, then that is one more step towards a
strongly materialist vision of mind and body.
6.3 Robots and anticipating intentions
Recognising intentions is difficult in circumstances where a
robot must learn from or collaborate with a human (Kelley
et al. 2014). While simulation or perspective-taking can
help equip robots to work with people on joint tasks, another
focus is on recognition, whereby the human is not actively
attempting to help the robot learn. The intent-recognition
system depends on the robot’s sensors and actuators to
obtain information about the world and exercise control.
Kelley et al.'s robots use their own sensorimotor capabilities
and make inferences from their previous experience
of their spatiotemporal context.
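The recognition step can be illustrated with a minimal Bayesian sketch (hypothetical goals and observation model, not Kelley et al.'s system): the robot maintains a posterior over candidate goal positions and updates it as each observed movement step brings the human closer to, or further from, each goal.

```python
import math

def update_posterior(posterior, prev_pos, new_pos, goals, beta=2.0):
    """One Bayesian update: a step is more likely under a goal the more it
    reduces the distance to that goal (softmax likelihood over progress)."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    likelihoods = {g: math.exp(beta * (dist(prev_pos, g) - dist(new_pos, g)))
                   for g in goals}
    unnorm = {g: posterior[g] * likelihoods[g] for g in goals}
    z = sum(unnorm.values())
    return {g: v / z for g, v in unnorm.items()}

goals = [(5, 0), (0, 5)]             # two candidate targets
posterior = {g: 0.5 for g in goals}  # uniform prior
trajectory = [(0, 0), (1, 0), (2, 0), (3, 0)]  # human moves towards (5, 0)

for prev, new in zip(trajectory, trajectory[1:]):
    posterior = update_posterior(posterior, prev, new, goals)

print(max(posterior, key=posterior.get))  # (5, 0)
```

After only three observed steps, the posterior already concentrates heavily on the goal the human is approaching, which is what allows intent to be anticipated before the action completes.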
Similarly, BCI robots anticipate intentions in a more
immediate environment. In stroke rehabilitation, a vision system can
track and locate objects in space; an eye tracker can select
targets; and a BCI can control a robotic upper limb
4 The existence of a range of applications, and indeed status, between a
BCI being a ‘‘means’’ and a ‘‘machine subject’’ was highlighted by
an anonymous reviewer, together with a helpful suggestion.
exoskeleton. Reaching and grasping objects are facilitated
by online capturing of intentions of movement (intention-
driven assistance) (Frisoli et al. 2012). BCIs can also
combine with robot-assisted physical therapy to provide
haptic feedback as a promising pathway in rehabilitation of
patients (Gomez-Rodriguez et al. 2011). A robotic arm
supports online decoding of the subject's intention to
move the arm.
Intentions are also anticipated in an application which
drives the functional compensation of upper limb tremors,
tremor being the most common movement disorder (Rocon
et al. 2010). A soft wearable robot applies biomechanical
loads via functional electrical stimulation (FES) of the muscles. The BCI assesses the generation,
transmission, and execution of voluntary and tremorous
movements using EEG, electromyography, and inertial
sensors.
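The assessment of tremorous movement from inertial sensors can be illustrated with a small spectral sketch (synthetic signal and an assumed 3–12 Hz tremor band; the function `tremor_frequency` is hypothetical, not Rocon et al.'s pipeline): the dominant frequency within the band is taken as the tremor estimate, separating it from slower voluntary movement.

```python
import numpy as np

def tremor_frequency(signal, fs, band=(3.0, 12.0)):
    """Return the dominant frequency within the tremor band of an
    inertial-sensor signal, using the magnitude spectrum."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return freqs[mask][np.argmax(spectrum[mask])]

# Synthetic demo: slow voluntary movement (0.5 Hz) plus a 6 Hz tremor.
fs = 100
t = np.arange(fs * 4) / fs
rng = np.random.default_rng(2)
signal = (np.sin(2 * np.pi * 0.5 * t)
          + 0.8 * np.sin(2 * np.pi * 6 * t)
          + 0.2 * rng.standard_normal(len(t)))
print(tremor_frequency(signal, fs))  # 6.0
```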
Overall, intentions are qualities of minds. The fact that
they can be acted on, interactively, demonstrates mind–
brain interactions but the mapping is not purely 1:1. Yet
there is still a subject who cannot be reduced or stripped of
dignity. The use of robots and animals illustrates the reality
of intentions, and of anticipating intentions, as nondualist,
readable, and executable phenomena of minds, and the role
of machines in assisting human beings in disability and
health-care settings.
7 Other BCIs
Nowadays, the use of computers and machines features in
areas which have diversified BCI applications beyond
impaired persons. Transparent (Ducao et al. 2012) is an
office window that adjusts its opacity to assist a user
wearing an EEG headset. BCIs have been extended by
combining them with other intelligent sensors and systems,
and communication devices, whereby users can communicate
intuitively across various circumstances, goals, and
times, e.g. when fatigued (Allison et al. 2012).
In BCI games (Gurkok et al. 2013), although indirect
EEG is the interface, the intentions are clear, e.g. selecting a
direction when aiming a gun in a shooting game. Moreover,
controlling a device using brain activity can facilitate
faster reaction times, where intentions of movements are
recognised as movement preparations before the initiation
of action (Krepki et al. 2007). BCIs can also bypass the
conduction delays from brain to muscles, therefore
affording more speed in initiating actions in competitive
applications with two players (p. 87). This underlines again
the physicalist interconnection between mind, brain, and
body.
To LIS patients, heightened game-playing experiences
may seem foreign, as they might to the pioneers of BCIs who
started out with rehabilitation and communication goals.
But behind these entertainment BCIs is a healthy user
with recognisable intentions to win using his or her brain,
mind, and body.
8 Conclusions
Problems posed in robotics and machine intelligence can
be examined and compared with other philosophical
problems (Molyneux 2012). However, for BCIs, personal
identity remains imperative (Lucivero and Tamburrini
2008). In this article, we turned to BCIs and their patient-
users to build a technological case against dualism while
keeping a person-centred outlook. Nevertheless, there
is a drift from the original purpose of assisting the disabled
(Bonnet et al. 2013).
BCIs confront dualism with proven mechanisms of
interaction between mind, brain, and body. Although there
is apprehension about reading minds and about attempts to ‘‘enter other
minds’’ (Evers and Sigman 2013, p. 891), BCIs have a
genuine supportive purpose. The technology is designed as
a brain–computer interface which does not invade people's
minds. To the extent that BCIs exist due to neural engineering,
AI, and proof-of-concept development, these are robust
affirmations of mind–brain and mind–body connections.
Reciprocally, for eliminative materialists who hold that
commonsense psychological phenomena, e.g. introspec-
tion, are defective and will eventually be displaced by
neuroscience (Churchland 1981), BCIs uphold mental
states, such as intentions, even in animals, from which goals
can subsequently be digitised and enacted externally.
Mass adoption of BCIs in society needs careful moni-
toring (Narayanan 2013), and dehumanisation is always a
risk. A trend towards depersonalisation could begin if
physicalists succeed in arguing that the decoding of goal-
directed thoughts entails a necessary and sufficient reduction
of mind to brain (Bickle 2001). While it is asked
whether mental properties are reducible to physical prop-
erties (Schneider 2013), physicalism has nevertheless been
besieged by a revival of interest in dualism (BonJour
2010). Indeed, materialism can also be summoned as evi-
dence for dualism (Barrett 2006).
We could call these human concerns antireductionism.
Antireductionists criticise approaches which regard the brain as a processor
of information in which meaning has been preassigned to,
rather than constructed by, the organism (Marshall 2009).
2009). There is scepticism that brain science can ‘‘read
off’’ information from descriptions of neuronal activity and
structure (Hatfield 2000). Or there is scepticism that we are being transformed
into the ‘‘cerebral subject’’ (Vidal 2009), where the human
being is understood through the property of ‘‘brainhood’’,
becoming an organic essentialism. Others call it
‘‘neuroreductionism’’ (Glannon 2009), which posits a
monistic concept claiming that mind is a function of the
brain (Tretter 2010).
The antithesis of antireductionism is a possible position,
which challenges dualism and offers a plausible resolution of
mind–brain problems. The proposal is that BCIs provide
grounds for thinking that ‘‘being in the mind of a person’’ is
equivalent to ‘‘being in the brain of a person’’ (see Footnote 3).
Searle (1992) acknowledges the computer model of the mind,
in which the brain is the hardware of a computer system and
the mind is the program. But he argues against the claim that
the mind is a computer program (Searle 1984). In BCIs, the
computer is external. The mind is accessible via a BCI which
creates a bioelectronic opening to the mind through the brain.
The interrelationship mind/brain ↔ body (actuation) is
changed with BCIs to something like mind/brain ↔ BCI → bypass of the unresponsive body → actuator (prosthetic,
computer screen, robot…).
Undoubtedly, a new BCI philosophy of mind and brain will
emerge as knowledge advances. If technology finds fresh
ways to capture, interpret, and harness brain activity, then
other workings of the mind may become reachable, e.g.
memory, besides communication and movement intentions,
deliberate and anticipated. Co-adaptive and mimetic BCI
models may lead researchers to a greater fusion of human and
machine beyond the current bionic ear and eye implant
technologies. There could be a tendency towards transhumanism,
or towards threatening procedures such as covert wireless mind
reading. It is the overarching technical vision and regulatory
framework which society needs to monitor, so that human
freedom and dignity are meticulously upheld.
Flanagan (2005) finds it ‘‘ironic that the ‘locus classicus’
of contemporary philosophy of mind argued in a sense that
there really is no such thing as ‘mind’ traditionally under-
stood’’ (p. 605). The BCI philosophy seems to point in that
direction: that ‘‘being in the mind of a person’’ can be
equated with ‘‘being in the brain of a person’’. The dualist
divide between mind and body is looking dissolvable.
Whether a new BCI philosophy of mind and brain is extreme
will be determined by the future. However, the intentions to
move, to communicate using language, to play games, to
compete, and even to paint are dimensions of the mind,
which depend on the brain. With or without computers, they
belong to a human person who deserves protection.
Acknowledgments I would like to acknowledge the thoughtful
comments, encouragement, and insightful suggestions of the two
anonymous reviewers which assisted in the preparation of this
manuscript.
References
Abramson D (2011) Philosophy of mind is (in part) philosophy of
computer science. Minds Mach 21:203–219
Al-Hudhud G et al (2014) Using brain signals patterns for biomet-
ric identity verification systems. Comput Hum Behav
31:224–229
Allison BZ et al (2012) Toward smarter BCIs: extending BCIs
through hybridization and intelligent control. J Neural Eng 9.
doi:10.1088/1741-2560/9/1/013001
Andersen RA et al (2010) Cognitive neural prosthetics. Annu Rev
Psychol 61:169–190
Anonymous (2013) My life with Parkinson’s. Nature 503:29–30
Aranyosi I (2011) A new argument for mind–brain identity. Br J
Philos Sci 62:489–517
Baldwin DA, Baird JA (2001) Discerning intentions in dynamic
human action. Trends Cogn Sci 5:171–178
Barrett JA (2006) A quantum-mechanical argument for mind–body
dualism. Erkenntnis 65:97–115
Beauregard M (2007) Mind does really matter: evidence from
neuroimaging studies of emotional self-regulation, psychother-
apy, and placebo effect. Prog Neurobiol 81:218–236
Bekey GA (2005) Autonomous robots, from biological inspiration to
implementation and control. MIT Press, Cambridge
Belda-Lois J-M et al (2011) Rehabilitation of gait after stroke: a
review towards a top-down approach. J NeuroEng Rehabil 8.
doi:10.1186/1743-0003-8-66
Bell CJ et al (2008) Control of a humanoid robot by a noninvasive
brain–computer interface in humans. J Neural Eng 5:214–220
Bickle J (2001) Understanding neural complexity: a role for
reduction. Minds Mach 11:467–481
Birbaumer N (2006) Breaking the silence: brain–computer interfaces
(BCI) for communication and motor control. Psychophysiology
43:517–532
Birbaumer N, Cohen LG (2007) Brain–computer interfaces: commu-
nication and restoration of movement in paralysis. J Physiol
579:621–636
BonJour L (2010) Against materialism. In: Koons RC, Bealer G (eds)
The waning of materialism. Oxford University Press, Oxford,
pp 3–23
Bonnet L, Lotte F, Lecuyer A (2013) Two brains, one game: design
and evaluation of a multiuser BCI video game based on motor
imagery. IEEE Trans Comput Intell AI Games 5:185–198
Brack A, Trouble M (2010) Defining life: connecting robotics and
chemistry. Orig Life Evol Biosph 40:131–136
Brumberg JS, Guenther FH (2010) Development of speech prosthe-
ses: current status and recent advances. Expert Rev Med Devices
7:667–679
Brumberg JS et al (2010) Brain–computer interfaces for speech
communication. Speech Commun 52:367–379
Brunner P et al (2011) Current trends in hardware and software for
brain–computer interfaces (BCIs). J Neural Eng 8. doi:10.1088/
1741-2560/8/2/025001
Campbell CM, Edwards RR (2009) Mind–body interactions in pain:
the neurophysiology of anxious and catastrophic pain-related
thoughts. Transl Res 153:97–101
Campbell M, Hoane AJ Jr, Hsu F-h (2002) Deep Blue. AI 134:57–83
Chai R et al (2012) Mental non-motor imagery tasks classifications of
brain computer interface for wheelchair commands using genetic
algorithm-based neural network. Proc 2012 Int Joint Conf Neural
Netw, 10–15 June 2012. doi:10.1109/IJCNN.2012.6252499
Chapin JK et al (1999) Real-time control of a robot arm using
simultaneously recorded neurons in the motor cortex. Nat
Neurosci 2:664–670
Churchland PM (1981) Eliminative materialism and the propositional
attitudes. J Philos 78:67–90
Coeckelbergh M (2011) You, robot: on the linguistic construction of
artificial others. AI Soc 26:61–69
de Kamps M (2012) Towards truly human-level intelligence in
artificial applications. Cogn Syst Res 14:1–9
De Massari D et al (2013) Brain communication in the locked-in
state. Brain 136:1989–2000
DiGiovanna J et al (2009) Brain–machine interface via reinforcement
learning. IEEE Trans Biomed Eng 56:54–64
Donoghue JP (2008) Bridging the brain to the world: a perspective on
neural interface systems. Neuron 60:511–521
Ducao A, Tseng T, von Kapri A (2012) Transparent: brain computer
interface and social architecture. Proc SIGGRAPH’12 ACM
SIGGRAPH 2012 Posters. doi:10.1145/2342896.2342929
Dumit J (2004) Picturing personhood: brain scans and biomedical
identity. Princeton University Press, Princeton
Durkin J (2003) Man and machine: I wonder if we can coexist. AI Soc
17:383–390
Edelman GM (1992) Bright air, brilliant fire: on the matter of the
mind. Basic Books, New York
Ekandem JI et al (2012) Evaluating the ergonomics of BCI devices
for research and experimentation. Ergonomics 55:592–598
Engel AK et al (2005) Invasive recordings from the human brain:
clinical insights and beyond. Nat Rev Neurosci 6:35–47
Evers K, Sigman M (2013) Possibilities and limits of mind-
reading: a neurophilosophical perspective. Conscious Cogn
22:887–897
Fan JM et al (2014) Intention estimation in brain–machine interfaces.
J Neural Eng. doi:10.1088/1741-2560/11/1/016004
Farwell LA, Donchin E (1988) Talking off the top of your head:
toward a mental prosthesis utilizing event-related brain poten-
tials. Electroencephalogr Clin Neurophysiol 70:510–523
Fernandez-Vargas J (2013) Assisted closed-loop optimization of
SSVEP-BCI efficiency. Front Neural Circuits 7:27. doi:10.3389/
fncir.2013.00027
Fingelkurts AA, Fingelkurts AA, Neves CFH (2010) Natural world
physical, brain operational, and mind phenomenal space–time.
Phys Life Rev 7:195–249
Flanagan O (2005) History of the philosophy of mind. In: Honderich
T (ed) The Oxford companion to philosophy, new edn. Oxford
University Press, Oxford, pp 603–607
Frisoli A et al (2012) A new gaze-BCI-driven control of an upper
limb exoskeleton for rehabilitation in real-world tasks. IEEE
Trans Syst Man Cybern C Appl Rev 42:1169–1179
Gergondet P et al (2011) Using brain–computer interface to steer a
humanoid robot. Proc 2011 IEEE Int Conf Robotics Biomim
(ROBIO) 192–197
Glannon W (2009) Our brains are not us. Bioethics 23:321–329
Gollwitzer PM (1993) Goal achievement: the role of intentions. Euro
Rev Soc Psychol 4:141–185
Gomez-Rodriguez M et al (2011) Closing the sensorimotor loop:
haptic feedback facilitates decoding of motor imagery. J Neural
Eng 8. doi:10.1088/1741-2560/8/3/036005
Green AM, Kalaska JF (2011) Learning to move machines with the
mind. Trends Neurosci 34:61–75
Grubler G (2011) Beyond the responsibility gap. Discussion note on
responsibility and liability in the use of brain–computer inter-
faces. AI Soc 26:377–382
Gurkok H, Nijholt A (2012) Brain–computer interfaces for multi-
modal interaction: a survey and principles. Int J Hum Comput
Interact 28:292–307
Gurkok H et al (2013) Evaluating a multi-player brain–computer
interface game: challenge versus co-experience. Entertain Com-
put 4:195–203
Haig AJ, Katz RT, Sahgal V (1987) Mortality and complications of
the locked-in syndrome. Arch Phys Med Rehabil 68:24–27
Hainline B (2011) Neuropathic pain: mind–body considerations.
Neurol Clin 29:19–33
Harnad S, Scherzer P (2008) First, scale up to the robotic Turing test,
then worry about feeling. AI Med 44:83–89
Hasan BAS, Gan JO (2012) Hangman BCI: an unsupervised adaptive
self-paced brain–computer interface for playing games. Comput
Biol Med 42:598–606
Haselager P (2013) Did I do that? Brain–computer interfacing and the
sense of agency. Minds Mach 23:405–418
Hatfield G (2000) The brain’s ‘new’ science: psychology, neuro-
physiology, and constraint. Philos Sci 67:S388–S403
Hatsopoulos HG, Donoghue JP (2009) The science of neural interface
systems. Annu Rev Neurosci 32:249–266
Hirata M et al (2012) Motor restoration based on the brain–machine
interface using brain surface electrodes: real-time robot control
and a fully implantable wireless system. Adv Robot 26:399–408
Hochberg LR et al (2012) Reach and grasp by people with tetraplegia
using a neurally controlled robotic arm. Nature 485:372–375
Hustvedt S (2013) Philosophy matters in brain matters. Seizure
22:169–173
Ianez E et al (2010) Mental tasks-based brain–robot interface. Robot
Auton Syst 58:1238–1245
Kaitaro T (2004) Brain–mind identities in dualism and materialism: a
historical perspective. Stud Hist Philos Biol Biomed Sci
35:627–645
Kelley R et al (2014) Intent recognition for human–robot interaction.
In: Sukthankar G et al (eds) Plan, activity, and intent recognition:
theory and practice. Morgan Kaufmann, Waltham, pp 343–365
Kendler KS, Campbell J (2009) Interventionist causal models in
psychiatry: repositioning the mind–body problem. Psychol Med
39:881–887
Kihlstrom JH (2008) Placebo: feeling better, getting better, and the
problems of mind and body. McGill J Med 11:212–213
Kim J (1998) The mind–body problem after fifty years. In: O’Hear A
(ed) Current issues in the philosophy of mind. Cambridge
University Press, Cambridge, pp 3–21
Kim H-Y (2008) Locke and the mind–body problem: an interpretation
of his agnosticism. Philosophy 83:439–458
Krepki R et al (2007) The Berlin brain–computer interface (BBCI)—
towards a new communication channel for online control in
gaming applications. Multimed Tools Appl 33:73–90
Kron SS (2012) The mind body problem. Anesthesiology
116:219–221
Kubler A et al (1999) The thought translation device: a neurophysi-
ological approach to communication in total motor paralysis.
Exp Brain Res 124:223–232
Kwok R (2013) Neuroprosthetics: once more, with feeling. Nature
497:176–178
Kyselo M (2013) Locked-in syndrome and BCI—towards an enactive
approach to the self. Neuroethics 6:579–591
Laureys S et al (2005) The locked-in syndrome: what is it like to be
conscious but paralyzed and voiceless? Prog Brain Res
150:495–511
Lebedev MA, Nicolelis MAL (2006) Brain–machine interfaces: past,
present and future. Trends Neurosci 29:536–546
Lee B, Liu CY, Apuzzo MLJ (2013) A primer on brain–machine
interfaces, concepts, and technology: a key element in the future
of functional neurorestoration. World Neurosurg 79:457–471
Lin C-T et al (2010) Review of wireless and wearable electroen-
cephalogram systems and brain–computer interfaces—a mini-
review. Gerontology 56:112–119
Lopes DM (2010) A philosophy of computer art. Routledge, Oxford
Lucivero F, Tamburrini G (2008) Ethical monitoring of brain–
machine interfaces. AI Soc 22:449–460
Lule D et al (2009) Life can be worth living in locked-in syndrome.
Prog Brain Res 177:339–351
Lule D et al (2013) Probing command following in patients with
disorders of consciousness using a brain–computer interface.
Clin Neurophysiol 124:101–106
Lycan WG (2009) Giving dualism its due. Australas J Philos
87:551–563
MacDorman KF, Ishiguro H (2006) The uncanny advantage of using
androids in cognitive and social science research. Interact Stud
7:297–337
Marshall PJ (2009) Relating psychology and neuroscience: taking up
the challenges. Perspect Psychol Sci 4:113–125
Mazzone M (2011) Intentions as complex entities. Rev Philos Psychol
2:767–783
McFarland D (2008) Guilty robots, happy dogs: the question of alien
minds. Oxford University Press, Oxford
McGinn C (1989) Can we solve the mind–body problem? Mind
98:349–366
Molyneux B (2012) How the problem of consciousness could emerge
in robots. Minds Mach 22:277–297
Morris K (2004) Mind moves onscreen: brain–computer interface
comes to trial. Lancet Neurol 3:329
Murguialday R et al (2011) Transition from the locked into the
completely locked-in state: a physiological analysis. Clin
Neurophysiol 122:925–933
Nagasawa Y (2012) Infinite decomposability and the mind–body
problem. Am Philos Q 49:357–367
Nagel T (1974) What is it like to be a bat? Philos Rev 83:435–450
Nakahara K, Miyashita Y (2005) Understanding intentions: through
the looking glass. Science 308:644–645
Narayanan A (2013) Society under threat… but not from AI. AI Soc
28:87–94
Niazi IK et al (2012) Peripheral electrical stimulation triggered by
self-paced detection of motor intention enhances motor
evoked potentials. IEEE Trans Neural Syst Rehabil Eng
20:595–604
Nicolas-Alonso LF, Gomez-Gil J (2012) Brain computer interfaces, a
review. Sensors 12:1211–1279
Nicolelis MAL, Lebedev MA (2009) Principles of neural ensemble
physiology underlying the operation of brain–machine inter-
faces. Nature Rev Neurosci 10:530–540
Ortner R et al (2011) An SSVEP BCI to control a hand orthosis for
persons with tetraplegia. IEEE Trans Neural Syst Rehabil Eng
19:1–5
Papies EK et al (2009) Planning is for doing: implementation
intentions go beyond the mere creation of goal-directed associ-
ations. J Exp Soc Psychol 45:1148–1151
Pearson Y, Borenstein J (2013) The intervention of robot caregivers
and the cultivation of children’s capability to play. Sci Eng
Ethics 19:123–137
Perez-Marcos D, Buitrago JA, Velasquez FDG (2011) Writing
through a robot: a proof of concept for a brain–machine
interface. Med Eng Phys 33:1314–1317
Pfurtscheller G et al (2003) ‘Thought’—control of functional
electrical stimulation to restore hand grasp in a patient with
tetraplegia. Neurosci Lett 351:33–36
Poel M et al (2012) Brain computer interfaces as intelligent sensors
for enhancing human–computer interaction. In: Proceedings of
14th ACM international conference multimodal interact, 22–26
Oct 2012, Santa Monica, CA, 379–382
Pribram KH (1998) Thoughts on the meaning of brain electrical
activity. Int J Psychol 33:213–225
Rockwell WT (2007) Neither brain nor ghost, a nondualist alternative
to the mind–brain identity theory. The MIT Press, Cambridge
Rocon E et al (2010) Multimodal BCI-mediated FES suppression of
pathological tremor. 2010 Annu Int Conf IEEE Eng Med Biol
Soc (EMBC), 3337–3340
Rohm M et al (2013) Hybrid brain–computer interfaces and hybrid
neuroprostheses for restoration of upper limb functions in
individuals with high-level spinal cord injury. AI Med
59:133–142
Ropper AJ (2010) Cogito ergo sum by MRI. New Eng J Med
362:648–649
Rubinstein JT (2004) How cochlear implants encode speech. Curr
Opin Otolaryngol Head Neck Surg 12:444–448
Sanchez JC et al (2009) Exploiting co-adaptation for the design of
symbiotic neuroprosthetic assistants. Neural Netw 22:305–315
Sartenaer O (2013) Neither metaphysical dichotomy nor pure
identity: clarifying the emergentist creed. Stud Hist Philos Biol
Biomed Sci 44:365–373
Scherberger H (2009) Neural control of motor prostheses. Curr Opin
Neurobiol 19:629–633
Scherer R, Pfurtscheller G (2013) Thought-based interaction with the
physical world. Trends Cogn Sci 17:490–492
Scherer R et al (2013) Brain–computer interfacing: more than the sum
of its parts. Soft Comput 17:317–331
Schimmel P (2001) Mind over matter? I: philosophical aspects of the
mind–brain problem. Aust NZ J Psychiatry 35:481–487
Schneider S (2013) Non-reductive physicalism and the mind problem.
Nous 47:135–153
Searle JR (1980) Minds, brains, and programs. Behav Brain Sci
3:417–424
Searle JR (1984) Minds, brains and science, the 1984 Reith Lectures.
Harvard University Press, Cambridge
Searle JR (1992) The rediscovery of the mind. The MIT Press,
Cambridge
Sellers EW, Donchin E (2006) A P300-based brain–computer
interface: initial tests by ALS patients. Clin Neurophysiol
117:538–548
Smart JJC (1963) Materialism. J Philos 60:651–662
Solis J et al (2010) Development of the anthropomorphic saxophonist
robot WAS-1: mechanical design of the simulated organs and
implementation of air pressure feedback control. Adv Robot
24:629–650
Stoll J et al (2013) Pupil responses allow communication in locked-in
syndrome patients. Curr Biol 23:R647–R648
Tan L-F et al (2014) Effect of mindfulness meditation on brain–
computer interface performance. Conscious Cogn 23:12–21
Taylor DM, Tillery SIH, Schwartz AB (2002) Direct cortical control
of 3D neuroprosthetic devices. Science 296:1829–1832
Taylor DM, Tillery SI, Schwartz AB (2003) Information conveyed
through brain-control: cursor versus robot. IEEE Trans Neural
Syst Rehabil Eng 11:195–199
Thinnes-Elker F et al (2012) Intention concepts and brain–machine
interfacing. Front Psychol 3. doi:10.3389/fpsyg.2012.00455
Tretter F (2010) Philosophical aspects of neuropsychiatry. In: Tretter
F et al (eds) Systems biology in psychiatric research: from high-
throughput data to mathematical modelling. Wiley-Blackwell,
Weinheim, pp 3–25
Uithol S et al (2014) Why we may not find intentions in the brain.
Neuropsychologia 56:129–139
Ungar T, Knaak S (2013) The hidden medical logic of mental health
stigma. Aust NZ J Psychiatry 47:611–612
Velliste M et al (2008) Cortical control of a prosthetic arm for self-feeding.
Nature 453:1098–1101
Verbeek P-P (2008) Cyborg intentionality: rethinking the phenome-
nology of human–technology relations. Phenomenol Cognit Sci
7:387–395
Vidal F (2009) Brainhood, anthropological figure of modernity. Hist
Hum Sci 22:5–36
Wasserman EA (1993) Comparative cognition: beginning the second
century of the study of animal intelligence. Psychol Bull
113:211–228
Weisberg DS et al (2008) The seductive allure of neuroscience
explanations. J Cogn Neurosci 20:470–477
Wellman HM et al (2009) Early intention understandings that are
common to primates predict children’s later theory of mind. Curr
Opin Neurobiol 19:57–62
Williams JJ et al (2013) Differentiating closed-loop cortical intention
from rest: building an asynchronous electrocorticographic BCI.
J Neural Eng 10. doi:10.1088/1741-2560/10/4/046001
Wolpaw JR et al (2000) Brain–computer interface technology: a
review of the first international meeting. IEEE Trans Rehabil
Eng 8:164–173
Wolpaw JR et al (2002) Brain–computer interfaces for communica-
tion and control. Clin Neurophysiol 113:767–791
Yu T et al (2012) Surfing the internet with a BCI mouse. J Neural Eng
9. doi:10.1088/1741-2560/9/3/036012
Zhang Q et al (2011) Building brain machine interfaces: from rat to
monkey. In: Proceedings of 2011 8th Asian Control Conference
(ASCC) Kaohsiung, Taiwan, May 15–18, 2011, pp 886–891
Zhou J et al (2009) EEG-based classification for elbow versus
shoulder torque intentions involving stroke subjects. Comput
Biol Med 39:443–452
Zickler CA et al (2013) Brain painting: usability testing according to
the user-centered design in end users with severe motor
paralysis. AI Med 59:99–110