Seminar Report


CONTENTS

Introduction
Emotion mouse
Emotion and computing
Theory
Result
Manual and gaze input cascaded (MAGIC) pointing
Eye tracker
Implementing MAGIC pointing
Artificial intelligent speech recognition
Application
The simple user interface tracker
Conclusion

Introduction:-
Imagine yourself in a world where humans interact with computers.


You are sitting in front of your personal computer, which can listen, talk, or even scream aloud. It has the ability to gather information about you and interact with you through special techniques like facial recognition and speech recognition. It can even understand your emotions at the touch of the mouse. It verifies your identity, senses your presence, and starts interacting with you. You ask the computer to dial your friend at his office. It realizes the urgency of the situation through the mouse, dials your friend at his office, and establishes the connection.

The BLUE EYES technology aims at creating computational machines that have perceptual and sensory abilities like those of human beings. It employs modern video cameras and microphones to identify the user's actions through these imparted sensory abilities. The machine can understand what a user wants, where he is looking, and even his physical or emotional state.


Emotion mouse:-
One goal of human-computer interaction (HCI) is to make an adaptive, smart computer system. Such a project could include gesture recognition, facial recognition, eye tracking, speech recognition, and so on. Another non-invasive way to obtain information about a person is through touch. People use their computers to obtain, store, and manipulate data. In order to start creating smart computers, the computer must start gaining information about the user. Our proposed method for gaining user information through touch is via a computer input device, the mouse. From the physiological data obtained from the user, an emotional state may be determined, which would then be related to the task the user is currently doing on the computer. Over a period of time, a user model will be built in order to gain a sense of the user's personality.


The scope of the project is to have the computer adapt to the user in order to create a better working environment where the user is more productive. The first steps towards realizing this goal are described here.

Emotion and computing:-
Rosalind Picard (1997) describes why emotions are important to the computing community. There are two aspects of affective computing: giving the computer the ability to detect emotions and giving the computer the ability to express emotions. Not only are emotions crucial for rational decision making, but emotion detection is an important step towards an adaptive computer system. The desire for an adaptive, smart computer system has been driving our efforts to detect a person's emotional state. By matching a person's emotional state with the context of the expressed emotion, over a period of time the person's personality is exhibited. Therefore, by giving the computer a longitudinal understanding of the emotional state of its user, the computer could adopt a working style that fits its user's personality.


The result of this collaboration could increase productivity for the user. One way of gaining information from a user non-intrusively is by video: cameras have been used to detect a person's emotional state. We have explored gaining information through touch, and one obvious place to put sensors is on the mouse.

Theory:-
Based on Paul Ekman's facial expression work, we see a correlation between a person's emotional state and a person's physiological measurements. Selected works from Ekman and others on measuring facial behaviors describe Ekman's Facial Action Coding System (Ekman and Rosenberg, 1997). One of his experiments involved participants attached to devices that record certain measurements, including pulse, galvanic skin response (GSR), temperature, somatic movement, and blood pressure.


He then recorded the measurements as the participants were instructed to mimic facial expressions corresponding to the six basic emotions, which he defined as anger, fear, sadness, disgust, joy, and surprise. From this work, Dryer (1993) determined how physiological measures could be used to distinguish various emotional states. The measures taken were GSR, heart rate, skin temperature, and general somatic activity (GSA). These data were then subjected to two analyses. For the first analysis, a multidimensional scaling (MDS) procedure was used to determine the dimensionality of the data.
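As an illustration of the idea (this is our sketch, not Dryer's actual procedure), the code below classifies a normalized four-feature sample (GSR, heart rate, skin temperature, GSA) by finding the nearest per-emotion centroid. The centroid values are invented for the example; in practice they would come from calibration data.

import math

# Hypothetical per-emotion centroids of normalized physiological features
# (GSR, heart rate, skin temperature, general somatic activity).
# Real values would come from a calibration study; these are invented.
CENTROIDS = {
    "anger":    (0.9, 0.8, 0.2, 0.7),
    "fear":     (0.8, 0.9, 0.1, 0.6),
    "sadness":  (0.3, 0.4, 0.3, 0.2),
    "disgust":  (0.6, 0.5, 0.4, 0.5),
    "joy":      (0.5, 0.6, 0.7, 0.6),
    "surprise": (0.7, 0.7, 0.5, 0.8),
}

def classify(sample):
    """Return the emotion whose centroid is closest to the sample."""
    return min(CENTROIDS, key=lambda e: math.dist(sample, CENTROIDS[e]))

print(classify((0.82, 0.88, 0.12, 0.62)))  # -> "fear" for this made-up sample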


Result:-
The data for each subject consisted of scores for four physiological assessments (GSA, GSR, pulse, and skin temperature), for each of the six emotions (anger, disgust, fear, happiness, sadness, and surprise), across the five-minute baseline and test sessions. GSA data were sampled 80 times per second, GSR and temperature were reported approximately 3 to 4 times per second, and pulse was recorded as each beat was detected, approximately once per second. To account for individual variance in physiology, we calculated the difference between the baseline and test scores. Scores that differed by more than one and a half standard deviations from the mean were treated as missing. By this criterion, twelve scores were removed from the analysis.

The results show that the theory behind the Emotion mouse work is fundamentally sound. The physiological measurements were correlated to emotions using a correlation model. The correlation model is derived from a calibration process in which a baseline attribute-to-emotion correlation is rendered based on statistical analysis of calibration signals generated by users having emotions that are measured or otherwise known at calibration time.
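A minimal sketch of this normalization step, assuming simple per-session score lists (the numbers are made up): each test score is differenced against its baseline, and any difference more than 1.5 standard deviations from the mean is treated as missing.

import statistics

def baseline_differences(baseline, test):
    """Difference test scores against baseline scores to remove
    individual variance in physiology."""
    return [t - b for b, t in zip(baseline, test)]

def drop_outliers(scores, k=1.5):
    """Treat scores more than k standard deviations from the mean
    as missing (None), per the criterion described above."""
    mean = statistics.mean(scores)
    sd = statistics.stdev(scores)
    return [s if abs(s - mean) <= k * sd else None for s in scores]

# Made-up GSR scores for one subject:
diffs = baseline_differences([2.0, 2.1, 1.9, 2.0, 2.1, 2.0],
                             [2.3, 2.2, 2.1, 6.0, 2.2, 2.2])
print(drop_outliers(diffs))  # the 4.0 jump is treated as missing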


Manual and gaze input cascaded (MAGIC) pointing:-
This work explores a new direction in utilizing eye gaze for computer input. Gaze tracking has long been considered an alternative or potentially superior pointing method for computer input. We believe that many fundamental limitations exist with traditional gaze pointing. In particular, it is unnatural to overload a perceptual channel such as vision with a motor control task. We therefore propose an alternative approach, dubbed MAGIC (Manual And Gaze Input Cascaded) pointing. With such an approach, pointing appears to the user to be a manual task, used for fine manipulation and selection.


However, a large portion of the cursor movement is eliminated by warping the cursor to the eye-gaze area, which encompasses the target. Two specific MAGIC pointing techniques, one conservative and one liberal, were designed, analyzed, and implemented with an eye tracker we developed, and then tested in a pilot study. This early-stage exploration showed that the MAGIC pointing techniques might offer many advantages, including reduced physical effort and fatigue compared to traditional manual pointing, greater accuracy and naturalness than traditional gaze pointing, and possibly faster speed than manual pointing.

In our view, there are two fundamental shortcomings to the existing gaze pointing techniques, regardless of the maturity of eye tracking technology. First, given the one-degree size of the fovea and the subconscious jittery motions that the eyes constantly produce, eye gaze is not precise enough to operate UI widgets such as scrollbars, hyperlinks, and slider handles.


Second, and perhaps more importantly, the eye, as one of our primary perceptual devices, has not evolved to be a control organ. Sometimes its movements are voluntarily controlled, while at other times they are driven by external events. With the target selection by dwell time method, considered more natural than selection by blinking [7], one has to be conscious of where one looks and how long one looks at an object. If one does not look at a target continuously for a set threshold (e.g., 200 ms), the target will not be successfully selected. Once the cursor position has been redefined, the user needs only to make a small movement to, and click on, the target with a regular manual input device. We have designed two MAGIC pointing techniques, one liberal and the other conservative in terms of target identification and cursor placement.
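The sketch below illustrates the two warping policies under stated assumptions (the gaze-area radius and the Point type are ours, not from the original study): the liberal technique warps the cursor to every new fixation area, while the conservative technique warps only once the user actuates the manual device.

from dataclasses import dataclass

@dataclass
class Point:
    x: float
    y: float

GAZE_AREA_RADIUS = 120  # pixels; assumed size of the gaze area around the target

def dist(a, b):
    return ((a.x - b.x) ** 2 + (a.y - b.y) ** 2) ** 0.5

def liberal_warp(cursor, fixation):
    """Liberal MAGIC: warp the cursor to every new fixation area,
    leaving only fine positioning to the manual device."""
    if dist(cursor, fixation) > GAZE_AREA_RADIUS:
        return Point(fixation.x, fixation.y)
    return cursor

def conservative_warp(cursor, fixation, manual_moved):
    """Conservative MAGIC: warp only after the user starts moving the
    manual device, so the cursor does not chase every glance."""
    if manual_moved:
        return liberal_warp(cursor, fixation)
    return cursor

cursor = Point(100, 100)
print(liberal_warp(cursor, Point(900, 500)))                           # warped near target
print(conservative_warp(cursor, Point(900, 500), manual_moved=False))  # stays put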


Eye tracker:-
Since the goal of this work is to explore MAGIC pointing as a user interface technique, we started out by purchasing a commercial eye tracker (ASL Model 5000) after a market survey. In comparison to the systems reported in early studies, this system is much more compact and reliable. However, we felt that it was still not robust enough for a variety of people with different eye characteristics, such as pupil brightness and corrective glasses. We hence chose to develop and use our own eye tracking system. Available commercial systems, such as those made by ISCAN Incorporated, LC Technologies, and Applied Science Laboratories (ASL), rely on a single light source that is positioned either off the camera axis, in the case of the ISCAN ETL-400 systems, or on-axis, in the case of the LCT and the ASL E504 systems.

Eye tracking data can be acquired simultaneously with MRI scanning using a system that illuminates the left eye of a subject with an infrared (IR) source, acquires a video image of that eye, locates the corneal reflection (CR) of the IR source, and in real time calculates, displays, and records the gaze direction and pupil diameter.


Once the pupil has been detected, the corneal reflection is determined from the dark pupil image. The reflection is then used to estimate the user's point of gaze in terms of the screen coordinates where the user is looking. An initial calibration procedure, similar to that required by commercial eye trackers, is performed first.
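As a sketch of the calibration idea (the data and the linear model are our assumptions, not the actual system's), the pupil-to-corneal-reflection vector can be mapped to screen coordinates with a least-squares fit over a nine-point calibration grid:

import numpy as np

def fit_calibration(pupil_cr_vectors, screen_points):
    """Fit screen = [px, py, 1] @ C by least squares, one column per screen axis."""
    X = np.hstack([pupil_cr_vectors, np.ones((len(pupil_cr_vectors), 1))])
    coeffs, *_ = np.linalg.lstsq(X, screen_points, rcond=None)
    return coeffs  # shape (3, 2)

def gaze_point(coeffs, vector):
    """Map one pupil-to-CR vector to an estimated screen coordinate."""
    px, py = vector
    return np.array([px, py, 1.0]) @ coeffs

# Nine-point calibration grid with made-up pupil-to-CR vectors:
vectors = np.array([[-1, -1], [0, -1], [1, -1],
                    [-1,  0], [0,  0], [1,  0],
                    [-1,  1], [0,  1], [1,  1]], dtype=float)
screen = np.array([[0, 0], [640, 0], [1280, 0],
                   [0, 400], [640, 400], [1280, 400],
                   [0, 800], [640, 800], [1280, 800]], dtype=float)
C = fit_calibration(vectors, screen)
print(gaze_point(C, (0.5, -0.5)))  # approximately (960, 200)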


Implementing MAGIC pointing:-
We programmed the two MAGIC pointing techniques on a Windows NT system. The techniques work independently of the applications. The MAGIC pointing program takes data from both the manual input device (of any type, such as a mouse) and the eye tracking system, running either on the same machine or on another machine connected via serial port. Raw data from an eye tracker cannot be directly used for gaze-based interaction, due to noise from image processing, eye movement jitter, and samples taken during saccade (ballistic eye movement) periods.

The goal of filter design in general is to make the best compromise between preserving signal bandwidth and eliminating unwanted noise. In the case of eye tracking, as Jacob argued, the eye information relevant to interaction lies in the fixations. Our filtering algorithm was designed to pick a fixation with minimum delay by means of selecting two adjacent points over two samples.
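A minimal sketch of such a filter, assuming gaze samples arrive as (x, y) pixel pairs (the agreement radius is an assumed value): a fixation is reported as soon as two adjacent samples agree, while samples taken during saccades never find a close neighbour and are dropped.

FIXATION_RADIUS = 30  # pixels; assumed agreement radius between adjacent samples

def fixations(samples):
    """Yield a fixation estimate as soon as two adjacent samples agree,
    implicitly discarding samples taken during saccades."""
    prev = None
    for x, y in samples:
        if prev is not None:
            px, py = prev
            if abs(x - px) <= FIXATION_RADIUS and abs(y - py) <= FIXATION_RADIUS:
                yield ((x + px) / 2, (y + py) / 2)  # average the two samples
        prev = (x, y)

stream = [(100, 100), (104, 98), (400, 300), (402, 301)]
print(list(fixations(stream)))  # two fixations; the saccadic jump is skipped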


Artificial intelligent speech recognition:-
It is important to consider the environment in which the speech recognition system has to work. The grammar used by the speaker and accepted by the system, the noise level, the noise type, the position of the microphone, and the speed and manner of the user's speech are some factors that may affect the quality of speech recognition. When you dial the telephone number of a big company, you are likely to hear the sonorous voice of a cultured lady who responds to your call with great courtesy, saying "Welcome to company X. Please give me the extension number you want." You pronounce the extension number, your name, and the name of the person you want to contact. If the called person accepts the call, the connection is established quickly. This is artificial intelligence, where an automatic call-handling system is used without employing any telephone operator.
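A toy sketch of the call-handling logic, assuming the recognizer has already converted the utterance to text (the directory and the accepted grammar are invented for illustration): only utterances that fit the small accepted grammar, here a four-digit extension number, are acted on.

import re

DIRECTORY = {"2345": "R&D lab", "1100": "front desk"}  # hypothetical extensions

def handle_utterance(text):
    """Route a call from recognized speech, accepting only utterances
    that contain a four-digit extension number."""
    match = re.search(r"\b(\d{4})\b", text)
    if not match:
        return "Sorry, please repeat the extension number."
    ext = match.group(1)
    if ext in DIRECTORY:
        return "Connecting you to extension %s (%s)." % (ext, DIRECTORY[ext])
    return "Extension %s is not listed." % ext

print(handle_utterance("extension 2345 please"))  # -> Connecting you to extension 2345 ...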

Application:-
One of the main benefits of a speech recognition system is that it lets the user do other work simultaneously. The user can concentrate on observation and manual operations, and still control the machinery by voice input commands.


Another major application of speech processing is in military operations; voice control of weapons is an example. With reliable speech recognition equipment, pilots can give commands and information to the computers by simply speaking into their microphones; they do not have to use their hands for this purpose. Another good example is a radiologist scanning hundreds of X-rays, ultrasonograms, and CT scans while simultaneously dictating conclusions to a speech recognition system connected to word processors. The radiologist can focus his attention on the images rather than on writing the text. Voice recognition could also be used on computers for making airline and hotel reservations: a user simply needs to state his needs, whether to make a reservation, cancel a reservation, or make enquiries about schedules.


The simple user interface tracker:-
Computers would have been much more powerful had they gained the perceptual and sensory abilities of living beings. What needs to be developed is an intimate relationship between the computer and the human, and the Simple User Interest Tracker (SUITOR) is a revolutionary approach in this direction.

By observing the Web page a netizen is browsing, SUITOR can help by fetching more information to his desktop. By simply noticing where the user's eyes focus on the computer screen, SUITOR can be more precise in determining his topic of interest. According to the Almaden cognitive scientist who invented SUITOR, "the system presents the latest stock price or business news stories that could affect IBM. If I read the headline off the ticker, it pops up the story in a browser window."


"If I start to read the story, it adds related stories to the ticker. That's the whole idea of an attentive system: one that attends to what you are doing, typing, reading, so that it can attend to your information needs."
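A toy sketch of the attentive-ticker behaviour described in the quote (the story index, dwell threshold, and data are all invented for illustration, not IBM's actual implementation): once the user's gaze dwells on a headline long enough to suggest reading, related stories are appended to the ticker.

DWELL_THRESHOLD_MS = 1500  # assumed dwell time suggesting the user is reading

RELATED = {  # hypothetical index of related stories
    "IBM shares rise": ["Tech stocks rally", "IBM announces quarterly results"],
}

def attend(ticker, headline, dwell_ms):
    """If the user's gaze dwells on a headline, push related stories
    onto the ticker, mimicking an attentive system."""
    if dwell_ms >= DWELL_THRESHOLD_MS:
        for story in RELATED.get(headline, []):
            if story not in ticker:
                ticker.append(story)
    return ticker

print(attend(["IBM shares rise"], "IBM shares rise", dwell_ms=2000))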


Conclusion:-
The nineties witnessed quantum leaps in interface design for improved man-machine interaction. The BLUE EYES technology ensures a convenient way of simplifying life by providing more delicate and user-friendly facilities in computing devices. Now that we have proven the method, the next step is to improve the hardware. Instead of using cumbersome modules to gather information about the user, it will be better to use smaller and less intrusive units. The day is not far when this technology will push its way into your household, making you lazier. It may even reach your handheld mobile device. In any case, this is only a technological forecast.

THANK YOU

Seminar report on BLUE EYES technology
Submitted to:- Prof. S.K. Prabhash
Submitted by:- Ranjeet Pratap Singh, E.C. 3rd year (0904EC061088)