
Emotional Machines

Presented by Chittha Ranjani Kalluri

Why Can't…

- We have a thinking computer?
- A machine that performs about a million floating-point operations per second understand the meaning of shapes?
- We build a machine that learns from experience rather than simply repeating everything that has been programmed into it?
- A computer be similar to a person?

These are some of the questions facing computer designers and others who are constantly striving to build more and more 'intelligent' machines.

So, what’s intelligence?

According to en.wikipedia.org:

“Intelligence is a general mental capability that involves the ability to reason, plan, solve problems, think abstractly, comprehend ideas and language, and learn.”

What does this mean for current machines?

Definitely not that they're not intelligent! Some amount of intelligence has to be built in. How can that be done? Designers looked closely at how humans:

- Behave
- Express themselves
- Process information
- Solve problems

Expressing ourselves

- Body language
- Facial expressions
- Tone of voice
- Words we choose

All of these vary with the situation. What we implicitly convey: emotion.

What is emotion?

In psychology and common use, emotion is the language of a person's internal state of being, normally based in or tied to their internal (physical) and external (social) sensory feeling. Love, hate, courage, fear, joy, and sadness can all be described in both psychological and physiological terms.

Do machines need emotion?

Machines of today don't need emotion. Machines of the future would need it to:

- Survive
- Interact with other machines and humans
- Learn
- Adapt to circumstances

Emotions are a basis for humans to do all of the above.

What is an emotional machine?

An intelligent machine that can recognize emotions and respond using emotions

Concept proposed by Marvin Minsky in his 2006 book 'The Emotion Machine'

Example: the WE-4RII (Waseda Eye No. 4 Refined II), being developed at Waseda University, Japan

The WE-4RII:

- Simulates six basic emotions: happiness, fear, surprise, sadness, anger, and disgust
- Recognizes certain smells
- Detects certain types of touch
- Uses 3 personal computers for communication
- Still not as close to an emotional machine as we would want

[Photos: the WE-4RII expressing happiness, fear, surprise, sadness, anger, and disgust]

Do we want…

Maybe…

We're not there… yet! So how do we get from where we are to where we want to be?

Characteristics of multi-modal ELIZA

- Based on message passing on a blackboard (see the sketch below)
- Input: the user's text string
- Output: sentences and facial displays
- Processing module consists of an NLP layer and an emotion recognition layer
- Constructs facial displays
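As a rough illustration of the blackboard style of message passing described above, here is a minimal Python sketch. The class, topic, and handler names are invented for illustration and are not taken from the actual system.

```python
from collections import defaultdict

class Blackboard:
    """Shared workspace: layers post messages; subscribers react to them."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def post(self, topic, payload):
        for handler in self.subscribers[topic]:
            handler(payload)

board = Blackboard()

# NLP layer: turns the user's text string into a candidate reply.
def nlp_layer(text):
    reply = f"You said: {text}"                # placeholder for pattern matching
    board.post("reply", reply)

# Emotion recognition layer: attaches an affective label to the reply.
def emotion_layer(reply):
    board.post("display", (reply, "neutral"))  # placeholder emotion

# Output module: a sentence plus a facial display.
def output(msg):
    sentence, emotion = msg
    print(f"{sentence}  [face: {emotion}]")

board.subscribe("input", nlp_layer)
board.subscribe("reply", emotion_layer)
board.subscribe("display", output)

board.post("input", "Hello there!")
```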

NLP Layer

- String converted to a list of words by the parser
- Spelling checked
- Abbreviations replaced
- Slang words and codes replaced with correct ones
- Some words replaced with synonyms by a thesaurus
- Input matched against predefined patterns by a syntactic-semantic analyzer
- Longest matching string used to generate the reply
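A minimal sketch of that pipeline in Python, assuming the steps above; the abbreviation table, synonym table, patterns, and replies are invented for illustration and are not the system's actual data.

```python
import re

# Hypothetical normalization tables; the real system's data is not shown here.
ABBREVIATIONS = {"u": "you", "r": "are", "im": "i am"}
SYNONYMS = {"glad": "happy", "unhappy": "sad"}

# Hypothetical predefined patterns mapped to replies.
PATTERNS = {
    "i am happy": "That is wonderful to hear!",
    "i am": "Why are you",
    "hello": "Hi there. How are you?",
}

def normalize(text):
    words = re.findall(r"[a-z']+", text.lower())      # parse into words
    words = [ABBREVIATIONS.get(w, w) for w in words]  # expand abbreviations
    words = [SYNONYMS.get(w, w) for w in words]       # map slang to canonical words
    return " ".join(words)

def reply(text):
    clean = normalize(text)
    # The longest matching pattern wins, as the slides describe.
    matches = [p for p in PATTERNS if p in clean]
    if not matches:
        return "Tell me more."
    return PATTERNS[max(matches, key=len)]

print(reply("hello! im glad"))  # matches "i am happy" after normalization
```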

NLP Layer

- Repetition recognition ensures the dialog does not enter a loop
- Rules written in AIML (Artificial Intelligence Markup Language)
- Pragmatic analysis module checks the reply against user preferences collected during the conversation, and against the goals and states of the system
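For concreteness, here is a minimal AIML category loaded from Python; the pattern and template are invented examples, not rules from the actual system.

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical AIML rule: a pattern to match, a template to reply with.
AIML = """
<aiml>
  <category>
    <pattern>I AM SAD</pattern>
    <template>I am sorry to hear that. What happened?</template>
  </category>
</aiml>
"""

root = ET.fromstring(AIML)
for category in root.iter("category"):
    pattern = category.find("pattern").text
    template = category.find("template").text
    print(f"{pattern} -> {template}")
```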

Emotion recognition layer

- Emotive Lexicon Look-up Parser used to extract emotion-eliciting factors
- Bases this on a lexicon of words having emotional content: 247 words, each with a natural-number intensity
- Overall emotional content of a string is obtained from seven 'thermometers', which are updated whenever an emotionally rich word is found
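A minimal sketch of the thermometer idea, assuming one thermometer per basic emotion plus a neutral one; the lexicon entries and emotion set here are invented and stand in for the system's actual 247-word lexicon.

```python
# Hypothetical emotive lexicon: word -> (emotion, natural-number intensity).
LEXICON = {
    "love": ("happiness", 3),
    "great": ("happiness", 2),
    "terrified": ("fear", 3),
    "angry": ("anger", 2),
    "awful": ("disgust", 2),
}

EMOTIONS = ["happiness", "sadness", "anger", "fear", "surprise", "disgust", "neutral"]

def read_thermometers(text):
    """Accumulate intensity into one 'thermometer' per emotion."""
    thermometers = {e: 0 for e in EMOTIONS}
    for word in text.lower().split():
        if word in LEXICON:
            emotion, intensity = LEXICON[word]
            thermometers[emotion] += intensity
    return thermometers

print(read_thermometers("I love this great day"))
# -> {'happiness': 5, 'sadness': 0, ...}
```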

Emotion recognition layer

- Emotive Labeled Memory Structure Extraction labels each pattern and its corresponding rules
- Two additional AIML tags are used, 'affect' and 'concern', with the values positive, negative, joking, and normal
- Goal-Based Emotion Reasoning stores the user's personal data
- Two knowledge bases determine the affective state: the stimulus response to the user's input, and the result of the cognitive process of the conversation used to convey the reply
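A hypothetical illustration of what an AIML category annotated with the 'affect' and 'concern' tags described above might look like; the exact tag placement and rule content are assumptions, not taken from the paper.

```python
import xml.etree.ElementTree as ET

# Hypothetical annotated rule: 'affect' and 'concern' carry the emotional labels.
AIML = """
<category>
  <pattern>MY DOG DIED</pattern>
  <template>I am so sorry for your loss.</template>
  <affect>negative</affect>
  <concern>normal</concern>
</category>
"""

category = ET.fromstring(AIML)
print("affect:", category.find("affect").text)    # -> negative
print("concern:", category.find("concern").text)  # -> normal
```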

Preference rules - examples

IF (user is happy) AND (user asks a question) AND (system's reply is sad) AND (situation type of user is not negative) AND (highest thermometer is happy) THEN reaction is joy.

IF (user is sad) AND (system's reply is sad) AND (situation type of user is joking) AND (situation type of the system is negative) AND (maximum affective thermometer is sad) THEN reply is resentment.
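Rules of this form translate naturally into guard functions over the conversation state. Below is a minimal sketch encoding the first example rule; the state fields and their values are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class State:
    user_emotion: str          # e.g. "happy", "sad"
    user_asks_question: bool
    system_reply_emotion: str
    user_situation_type: str   # e.g. "negative", "joking", "normal"
    highest_thermometer: str

def preference_rule(s: State):
    """First example rule from the slides, encoded as a guard."""
    if (s.user_emotion == "happy"
            and s.user_asks_question
            and s.system_reply_emotion == "sad"
            and s.user_situation_type != "negative"
            and s.highest_thermometer == "happy"):
        return "joy"
    return None

state = State("happy", True, "sad", "normal", "happy")
print(preference_rule(state))  # -> "joy"
```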

Facial display selection

- The intensity of an emotion must exceed a threshold level before it can be expressed externally
- If an emotion is active, the system calculates the values of all thermometers
- The thermometer with the highest value is chosen as the emotion
- The intensity of the emotion determines the facial display
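Continuing the thermometer sketch above, facial display selection might look like this; the threshold value, intensity bands, and display names are assumptions.

```python
THRESHOLD = 2  # hypothetical minimum intensity before anything is expressed

def select_display(thermometers):
    """Pick the highest thermometer; map its intensity to a facial display."""
    emotion = max(thermometers, key=thermometers.get)  # highest thermometer wins
    intensity = thermometers[emotion]
    if intensity < THRESHOLD:
        return "neutral face"
    strength = "slight" if intensity < 4 else "strong"
    return f"{strength} {emotion} face"

print(select_display({"happiness": 5, "sadness": 0, "anger": 1}))
# -> "strong happiness face"
```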

Other work in this area

Emotionally Oriented Programming (EOP):

- Allows programmers to explicitly represent and reason about emotions
- Can build Emotional Machines (EMs): intelligent software agents with explicit programming constructs for concepts like mood, feelings, and temperament
- Inspiration: thoughts and feelings are intertwined; analysis of thought inspires feelings, and feelings inspire the creation of thoughts
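A speculative sketch of what such explicit constructs might look like in Python; EOP is a paradigm, and these class and field names are invented, not its actual syntax.

```python
from dataclasses import dataclass, field

@dataclass
class EmotionalMachine:
    """Hypothetical agent with explicit mood/feeling/temperament constructs."""
    temperament: str = "calm"                     # long-term disposition
    mood: str = "neutral"                         # medium-term state
    feelings: dict = field(default_factory=dict)  # short-term reactions

    def feel(self, stimulus, emotion, intensity):
        # Feelings inspire thoughts: record the reaction for later reasoning.
        self.feelings[stimulus] = (emotion, intensity)
        if intensity > 5:
            self.mood = emotion                   # strong feelings shift the mood

agent = EmotionalMachine()
agent.feel("compliment", "joy", 7)
print(agent.mood)  # -> "joy"
```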

Other work in this area

Emotional Model for Intelligent Response (EMIR):

- Developed by Mindsystems, an Australian company
- Includes simulations for feelings such as boredom!
- Methodology: looks at the factors influencing a character, such as success at achieving goals and the level of the character's control over the situation
- Compares this "state of mind" to a database of human responses mapped over time
- Was in the demo stage in 2002
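A very loose sketch of the comparison step as described, with an invented state representation and response database; nothing here reflects Mindsystems' actual implementation.

```python
# Hypothetical "state of mind": goal success and control, each in [0, 1].
def state_of_mind(goal_success, control):
    return (goal_success, control)

# Invented database of human responses keyed by rough state regions.
RESPONSES = {
    (1, 1): "contentment",
    (1, 0): "anxiety",
    (0, 1): "boredom",
    (0, 0): "frustration",
}

def nearest_response(state):
    """Pick the database entry closest to the current state of mind."""
    def dist(key):
        return sum((a - b) ** 2 for a, b in zip(key, state))
    return RESPONSES[min(RESPONSES, key=dist)]

print(nearest_response(state_of_mind(0.2, 0.9)))  # -> "boredom"
```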

Other work in this area

Emotionally Rich Man-machine Intelligent System (ERMIS):

- Aims to develop a prototype system for human-computer interaction that can interpret its user's attitude or emotional state (e.g., activation/interest, boredom, and anger) in terms of their speech and/or their facial gestures and expressions
- Adopted techniques include linguistic speech analysis, robust speech recognition, and facial expression analysis

Other work in this area

Net Environment for Embodied, Emotional Conversational Agents (NECA):

- Promotes the concept of multi-modal communication with animated synthetic personalities
- Key challenge: the fruitful combination of different research strands, including situation-based generation of natural language and speech, and the modeling of emotions and personality

Conclusion

The question is not whether intelligent machines can have emotions, but whether machines can be intelligent without any emotions.

Marvin Minsky, The Society of Mind

Bibliography

Emotional Machines - http://www.emotionalmachines.com
Emotional machines - do we want them? - http://www.zdnet.com.au/news/communications/0,2000061791,20266134,00.htm
Marvin Minsky Home Page - http://web.media.mit.edu/~minsky/
Multi-Modal ELIZA - http://mmi.tudelft.nl/pub/siska/_TSD%20my_eliza.pdf
The WE-4RII - http://www.takanishi.mech.waseda.ac.jp/research/eyes/
Small Wonder - http://www.smallwonder.tv/
The HUMAINE Portal - http://emotion-research.net
ERMIS - http://manolito.image.ece.ntua.gr/ermis
NECA - http://www.oefai.at/NECA