SEMINAR REPORT - 2011 EYE GAZE TRACKING
ACKNOWLEDGEMENT
I would like to express my sincere gratitude and reverence to God Almighty, for
guiding me through this seminar, making my endeavor an undiluted success.
I am deeply indebted to our respected principal Dr.N.N.VIJAYA RAAGHAVAN for
his timely advice and constant encouragement. I extend my sincere thanks to
MRS.ANOOPA JOSE CHITTILAPPILLY, Head of Department of Applied
Electronics & Instrumentation Engineering, for encouraging and aiding me
throughout the seminar.
I would like to express my heartfelt thanks to my seminar coordinator Ms.RESHMA
RAMACHANDRAN, lecturer, AEI, for her selfless support, understanding and
involvement.
I extend sincere and genuine appreciation to seminar guide Mrs.NEETHU
SATHYAN, lecturer, AEI whose help throughout the seminar cannot be substituted
by anything.
In the course of completing the seminar, I was fortunate to receive the assistance of
many faculty members, friends and relatives who were extremely generous with their
valuable suggestions, time and energy. I would like to thank all of them and recognize
the fact that without them this seminar would have been inconceivable.
DEPT OF AEI IESCE 1
ABSTRACT
The problem of eye gaze tracking has been researched and developed for a
long time. Most existing systems use intrusive techniques to estimate the gaze of a
person. This paper presents a non-intrusive approach to real-time eye gaze tracking
with a simple camera. Tracking the eye gaze involves three principal problems:
detecting the eye, tracking the eye, and detecting where on the screen the user's gaze
falls. This paper introduces straightforward methods for solving these problems while
achieving a high detection rate. The Eye-gaze System is a direct-select, vision-controlled
communication and control system. It was developed in Fairfax, Virginia, by LC
Technologies, Inc. The system is mainly intended for those who lack the use of their
hands or voice. The only requirements for operating the Eye-gaze System are control
of at least one eye with good vision and the ability to keep the head fairly still.
Eye-gaze Systems are in use around the world. Their users are adults and children with
cerebral palsy, spinal cord injuries, brain injuries, ALS, multiple sclerosis, brainstem
strokes, muscular dystrophy, and Werdnig-Hoffmann syndrome. Eye-gaze Systems are
being used in homes, offices, schools, hospitals, and long-term care facilities. By
looking at control keys displayed on a screen, a user can synthesize speech, control the
environment (lights, appliances, etc.), type, operate a telephone, run computer
software, operate a computer mouse, and access the Internet and e-mail. Eye-gaze
Systems are being used to write books, attend school and enhance the quality of life
of people with disabilities all over the world.
TABLE OF CONTENTS
CHAPTER
ACKNOWLEDGEMENT
ABSTRACT
TABLE OF CONTENTS
LIST OF FIGURES
LIST OF TABLES
1. INTRODUCTION
2. WORKING
2.1 WORKING OF EYE-GAZE SYSTEM
2.2 TRACKING OF EYE MOVEMENT
2.3 LONGEST LINE SCANNING
2.4 OCEM
2.5 ESTIMATION OF GAZING
2.6 EYE DETECTING
2.7 ESTIMATION ALGORITHMS
3. OPERATIONAL REQUIREMENTS
3.1 LOW AMBIENT INFRARED LIGHT
3.2 EYE VISIBILITY
3.3 GLASSES & CONTACT LENSES
4. EYE GAZE PERFORMANCE SPECIFICATIONS
5. MENU OF EYE GAZE SYSTEM
6. SKILLS NEEDED BY USERS
7. EYE-GAZE TRACKER OUTPUT
8. DEVELOPMENTS IN EYE-GAZE SYSTEM
9. ADVANTAGES OF EYE GAZE SYSTEM
10. PERSONAL CONTRIBUTIONS AND VIEWS
11. CONCLUSION
REFERENCES
LIST OF FIGURES
No. NAME
1. Combined Eye-Gaze System
2. Working Diagram of an Eye-Gaze System
3. Detecting of Eye Pupil
4. Principles of LLS
5. Matching Process in OCEM
6. Reference Model: 2D Simple Mark
7. Two examples of positive image
8. Geometry around Eye-Gaze
9. Practical Eye Gaze System
10. Main Menu
11. Telephone Menu
12. Typewriter Menu
13. Run PC Menu
14. Light & Appliances Menu
15. Eye-Gaze System in Wheel Chair
16. Computer Interaction
17. Eye-Gaze Video Streaming
LIST OF TABLES
No. NAME
1. Accuracy
1. INTRODUCTION
Most of us are blessed to operate a computer with ease using our hands. But
some people cannot use their hands, and for them voice-guided systems have been in
use for quite some time. But what about paralysed patients with no mobility and no
speech, even when their brains are functional and they can see and hear what is going
on around them? Shouldn't they be able to use their intelligence effectively and stay
employed? Now they can, with the Eye-gaze communication system.
Eye gaze detection is used in many human-computer interaction applications.
Most of them use intrusive techniques to estimate the gaze of a person. For example,
the user has to wear a headgear-mounted camera that fixes the position of the eyes
relative to the view of the screen, or an infrared light on the camera is used to detect
the eye. This paper introduces a non-intrusive, inexpensive approach that detects eye
gaze with a simple camera; the user does not have to wear headgear or use any
expensive equipment.
2. WORKING
2.1 Working of Eye Gaze System
As a user sits in front of the Eye-gaze monitor, a specialized video camera
mounted below the monitor observes one of the user's eyes. Sophisticated image-
processing software in the Eye-gaze System's computer continually analyzes the
video image of the eye and determines where the user is looking on the screen.
Nothing is attached to the user's head or body.
Fig.2.1 Combined Eye-Gaze System
In detail the procedure can be described as follows: The Eye-gaze System uses
the pupil-center/corneal-reflection method to determine where the user is looking on
the screen. An infrared-sensitive video camera, mounted beneath the System's
monitor, takes 60 pictures per second of the user's eye. A low power, infrared light
emitting diode (LED), mounted in the center of the camera's lens, illuminates the eye.
The LED reflects a small amount of light off the surface of the eye's cornea. The light also
shines through the pupil and reflects off of the retina, the back surface of the eye, and
causes the pupil to appear white. The bright-pupil effect enhances the camera's image
of the pupil and makes it easier for the image processing functions to locate the center
of the pupil.
The computer calculates the person's gaze point, i.e., the coordinates of where
he is looking on the screen, based on the relative positions of the pupil center and
corneal reflection within the video image of the eye. Typically the Eye-gaze System
predicts the gaze point with an average accuracy of a quarter inch or better.
Fig.2.2 Working Diagram of an Eye-Gaze System
Prior to operating the eye tracking applications, the Eye-gaze System must
learn several physiological properties of a user's eye in order to be able to project his
gaze point accurately. The system learns these properties by performing a calibration
procedure. The user calibrates the system by fixing his gaze on a small yellow circle
displayed on the screen, and following it as it moves around the screen. The
calibration procedure usually takes about 15 seconds, and the user does not need to
recalibrate if he moves away from the Eye-gaze System and returns later.
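The calibration step above can be sketched numerically. The sketch below assumes a simple affine model (our assumption; the commercial system's pupil-center/corneal-reflection mapping is proprietary and more elaborate): the vectors from pupil center to corneal reflection, recorded while the user fixates known targets, determine an image-to-screen mapping by least squares. All coordinates are hypothetical.

```python
import numpy as np

def fit_calibration(glint_vectors, screen_points):
    """Least-squares fit of an affine map: screen = [dx, dy, 1] @ A."""
    g = np.asarray(glint_vectors, dtype=float)
    s = np.asarray(screen_points, dtype=float)
    X = np.hstack([g, np.ones((len(g), 1))])   # affine design matrix
    A, *_ = np.linalg.lstsq(X, s, rcond=None)  # A has shape (3, 2)
    return A

def predict_gaze(A, dx, dy):
    """Map one pupil-to-glint vector to a screen coordinate."""
    return np.array([dx, dy, 1.0]) @ A

# Hypothetical calibration targets (screen pixels) and the pupil-glint
# vectors (image units) measured while the user fixated each target.
targets = [(100, 100), (500, 100), (100, 400), (500, 400), (300, 250)]
vectors = [(-2.0, -1.5), (2.0, -1.5), (-2.0, 1.5), (2.0, 1.5), (0.0, 0.0)]
A = fit_calibration(vectors, targets)
print(predict_gaze(A, 0.0, 0.0))   # gaze estimate for the central target
```

With five or more targets the least-squares fit also averages out measurement noise, which is one reason calibration routines use several points rather than the minimum of three.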
2.2 Tracking of Eye Movement
The location of face and eye should be known for tracking eye movements.
We assume this location information has already been obtained through extant
techniques. Exact eye movements can be measured by special techniques. This
investigation concentrates on tracking eye movement itself. Two algorithms have
been proposed for iris center detection: the Longest Line Scanning and Occluded
Circular Edge Matching algorithms. The emphasis is on eye movement in this paper,
not on face and eye location. Rough eye position is not sufficient for tracking eye
gaze accurately. Measuring the direction of the eyes' visual attention requires more
precise data from the eye image; a distinctive feature of the eye image must be
measurable in any arrangement. The pupil of people with dark or dark-brown eyes
can hardly be differentiated from the iris in the captured images. If the image is
captured from close range, then it can be used to detect the pupil even under ordinary
lighting conditions.
Fig.2.3 Detecting of Eye Pupil
It was decided to track the iris for this reason. Because the sclera is light and the iris
is dark, this boundary can easily be detected and tracked optically. This is quite
appropriate for people with a darker iris color (for instance, Asians). Young has
addressed the iris tracking problem using a head-mounted camera.
2.3 Longest Line Scanning (LLS)
Human eyes have three degrees of freedom of rotation in 3D space. The eye
image is actually a projection of the real eye. The iris is nearly a circular plate attached
to the approximately spherical eyeball, so its projection is elliptical in shape. A
well-known geometric property, that the longest chord of such a shape passes through
its center, can be applied to the problem of detecting the iris center. The algorithm is
outlined below:
Fig 2.4: Principles of LLS
Searching and deciding after edge detection enhances computational
efficiency. Except when preprocessing fails, the method computes the center of the
iris quite accurately, but it is sensitive to the distribution of edge pixels.
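A minimal sketch of the LLS idea, under the assumption of an already-binarised iris region: scan each image row, take the span between the leftmost and rightmost region pixels, and return the midpoint of the longest such span as the iris center.

```python
import numpy as np

def lls_center(mask):
    """Longest Line Scanning: midpoint of the longest horizontal chord."""
    best_len, center = -1, None
    for y, row in enumerate(mask):
        xs = np.flatnonzero(row)            # left/right edge columns
        if xs.size == 0:
            continue
        length = xs[-1] - xs[0]             # chord length on this scanline
        if length > best_len:
            best_len = length
            center = (float(xs[0] + xs[-1]) / 2.0, float(y))
    return center                           # (x, y) of the estimated center

# Synthetic circular "iris" of radius 10 centred at (20, 15)
yy, xx = np.mgrid[0:32, 0:48]
mask = (xx - 20) ** 2 + (yy - 15) ** 2 <= 10 ** 2
print(lls_center(mask))   # -> (20.0, 15.0)
```

The sensitivity to the edge-pixel distribution mentioned above is visible here: a few spurious pixels at the edge of a row would stretch its chord and pull the estimate away.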
2.4 Occluded Circular Edge Matching (OCEM)
Although the LLS method detects the center of the iris, it is not sufficient for
measuring eye gaze precisely. A closer look at the LLS technique reveals the
following problem: the only clues for finding the center of the iris are the left and
right edge pixels of the iris boundary, the so-called limbus. In order to estimate the
original position and shape of the iris boundary, the circular edge matching (CEM)
method can be adapted. When the subject sits and operates the computer in front of
the screen, neither the angle of rotation of the eyeball nor the eccentricity of the
ellipse is large. This observation justifies a circular approximation to the ellipse, and
experimental results support the simplification. The algorithm is outlined below:
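A minimal sketch of the OCEM idea, under the assumption that only the left and right arcs of the limbus are reliably visible (the eyelids occlude the top and bottom): a circle of known radius is slid over candidate centers and scored only on the unoccluded arcs.

```python
import numpy as np

def ocem(edge, radius, candidates):
    """Score candidate centers by edge support on the unoccluded arcs."""
    # Sample only the left and right arcs of the circle; arcs near the
    # top and bottom are treated as occluded by the eyelids.
    angles = np.concatenate([np.linspace(-0.6, 0.6, 15),
                             np.linspace(np.pi - 0.6, np.pi + 0.6, 15)])
    dx, dy = radius * np.cos(angles), radius * np.sin(angles)
    h, w = edge.shape
    best_score, best_center = -1.0, None
    for cx, cy in candidates:
        xs = np.clip(np.round(cx + dx).astype(int), 0, w - 1)
        ys = np.clip(np.round(cy + dy).astype(int), 0, h - 1)
        score = edge[ys, xs].mean()        # fraction of arc on edge pixels
        if score > best_score:
            best_score, best_center = score, (cx, cy)
    return best_center

# Synthetic limbus: an edge ring of radius 10 centred at (20, 15) with
# the top and bottom arcs erased, as if occluded by the eyelids.
yy, xx = np.mgrid[0:32, 0:48]
r = np.sqrt((xx - 20) ** 2 + (yy - 15) ** 2)
edge = ((np.abs(r - 10) < 0.8) & (np.abs(yy - 15) < 8)).astype(float)
print(ocem(edge, 10, [(18, 15), (20, 15), (22, 15), (20, 13)]))   # -> (20, 15)
```

Scoring only the visible arcs is what distinguishes OCEM from plain circular edge matching: a droopy eyelid no longer penalises the correct center.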
Fig 2.5: Matching Process in OCEM
2.5 Estimation of Gazing Point
As previous work has reported, gaze estimation under free head movement is
very difficult to deal with. The focus here is on estimating the orientation of the eyes
under slight head movement, and it is important to estimate it from the image features
and values measured at the eye-movement-tracking stage. The direction of eye gaze,
including the head orientation, is considered in this investigation. A geometric model
incorporating a reference has been devised, and the geometry formed by the subject's
face, the camera, and the computer screen has been explored so as to understand eye
gaze in this environment. Finally, a couple of estimation methods have been proposed.
A small mark attached to the glasses, stuck between the two lenses, has been adopted
as a special rigid origin (Fig 2.6). It provides the following geometric information:
The position of the subject's face
The origin of the iris center movement
Fig 2.6: Reference Model: 2D Simple Mark
Because the mark is like a small spot, it cannot offer any orientation information at
all. Nevertheless, it can compensate for slight head movement.
2.6 Eye Detecting
We use the rapid object detection scheme based on a boosted cascade of
simple Haar-like features to detect the eye. This method was initially proposed by
Paul Viola and improved by Rainer Lienhart. First, classifiers are trained with
thousands of positive and negative images (images that contain the object and images
that do not), using simple features called Haar-like features. There is a large number
of such features in a 24x24-pixel sub-window.
The training algorithm is AdaBoost: it trains the classifiers and at the same
time selects the small set of features that best separates the positive and negative
examples. After training, only this small set of features is retained, and the features
can be combined to form an effective classifier. A cascade of classifiers is then
constructed to increase detection performance while radically reducing computation
time: it rejects many of the negative sub-windows while detecting almost all positive
instances. Simple classifiers reject the majority of sub-windows before more complex
classifiers are called, which reduces computation time and achieves low false-positive
rates.
Fig 2.7: Two examples of positive images
We gathered images from many sources and selected the eyes manually. We
then used the createsamples utility in OpenCV to create training samples and the
haartraining utility to train the classifier. Training took about five days on a
Pentium(R) 4 3.00 GHz machine. Using this classifier to track the eye in real time
would be very heavy, because it would have to search for the eye in every camera
frame. We therefore use the classifier to detect the eye in the first frame only, and
then use another method to track the eye in all subsequent frames.
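The Haar-like features at the heart of this detector are cheap because they are evaluated from an integral image in constant time, regardless of rectangle size. The sketch below is illustrative of that mechanism, not the OpenCV implementation itself.

```python
import numpy as np

def integral_image(img):
    """Cumulative sums over rows then columns: ii[y, x] is the sum of
    all pixels with row <= y and column <= x."""
    return img.cumsum(0).cumsum(1)

def box_sum(ii, x, y, w, h):
    """Sum of pixels in the w x h box with top-left corner (x, y),
    using at most four integral-image lookups."""
    total = ii[y + h - 1, x + w - 1]
    if x > 0:
        total -= ii[y + h - 1, x - 1]
    if y > 0:
        total -= ii[y - 1, x + w - 1]
    if x > 0 and y > 0:
        total += ii[y - 1, x - 1]
    return total

def two_rect_feature(ii, x, y, w, h):
    """Left-minus-right two-rectangle Haar-like feature (edge detector)."""
    left = box_sum(ii, x, y, w // 2, h)
    right = box_sum(ii, x + w // 2, y, w // 2, h)
    return left - right

# A dark-left / bright-right 24x24 patch produces a strong response
patch = np.hstack([np.zeros((24, 12)), np.ones((24, 12))])
ii = integral_image(patch)
print(two_rect_feature(ii, 0, 0, 24, 24))   # -> -288.0
```

Because each feature costs only a handful of lookups, the cascade can evaluate its early, simple stages on every sub-window and still reject most of them almost for free.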
2.7 Estimation Algorithms
In this section, the techniques to determine gazing points on the computer
screen are discussed. The Geometry-Based Estimation is, indeed, based on the
geometry of the eye gaze discussed in the previous section. Adaptive Estimation
determines the eye-gaze with the help of displacements in the image. Regardless of
which of these techniques is actually employed, image-to-screen mapping requires
that the system be initialized first. It should be calibrated while in use. During
initialization, the subject intentionally gazes at predefined screen points. From the
resulting eye movement data, other gazing points can be estimated. During the
calibration, because the subject moves continuously, changes in the parameters (such
as the iris radius, the distance, or the head position arising from subject movement)
are incorporated into the estimation process, thereby reconstructing the parameter set.
2.7.1 Geometry-Based Estimation
The subject first gazes at the center of the screen, and then slightly moves and
gazes at the right end of the screen. The figure below shows the geometry: S is the
distance between the two screen points, and Δref is the displacement of the reference
model. The equation is:
S = k [ (d + r)/r · (Δ2 − Δ1) + Δref ]
Fig 2.8: Geometry around Eye-Gaze
During initialization, the value of k is expected to differ depending on the
direction towards each predefined screen point. The different values of k can be
computed at this initialization stage. The value S then refers to the gazing point, and
the situation is the same as in the initialization step.
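The geometry-based estimate can be worked through numerically. All values below are hypothetical: d stands for the subject-to-reference distance, r for the eyeball radius, Δ1 and Δ2 for the iris-center displacements in the image, Δref for the reference-mark displacement, and k for the scale factor fixed at initialization.

```python
def gaze_shift(k, d, r, delta1, delta2, delta_ref):
    """S = k * ((d + r) / r * (delta2 - delta1) + delta_ref)."""
    return k * ((d + r) / r * (delta2 - delta1) + delta_ref)

# Initialization: the subject looks at two screen points a known
# S = 200 mm apart; the measured displacements then fix k.
d, r = 600.0, 6.0                          # hypothetical, in mm
delta1, delta2, delta_ref = 0.0, 1.8, 0.4  # hypothetical, in pixels
k = 200.0 / ((d + r) / r * (delta2 - delta1) + delta_ref)

# The same k then converts later measurements into screen distances.
print(round(gaze_shift(k, d, r, delta1, delta2, delta_ref), 1))   # -> 200.0
```

In practice a separate k is computed for each direction of gaze, as the text notes, since the geometry is not symmetric about the camera axis.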
2.7.2 Adaptive Estimation
This technique adaptively uses only the displacement of the iris center and the
displacement of the reference model. Based only on initialization data, it determines
the gazing point by linear approximation. It involves the following algorithm:
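A minimal sketch of the idea, with hypothetical numbers (the original algorithm figure is not reproduced here): the gazing point is linearly interpolated from the iris-center displacement, corrected by the reference-mark displacement to compensate for slight head movement.

```python
def adaptive_gaze(iris_dx, ref_dx, calibration):
    """Linearly interpolate the gazing point from the corrected
    iris-center displacement, using two initialization samples."""
    (d0, s0), (d1, s1) = calibration       # (displacement, screen position)
    corrected = iris_dx - ref_dx           # compensate slight head movement
    return s0 + (corrected - d0) * (s1 - s0) / (d1 - d0)

# Initialization (hypothetical): corrected displacement 0 px maps to
# screen position 0 mm, and 2 px maps to 200 mm.
calibration = [(0.0, 0.0), (2.0, 200.0)]

# The iris moved 1.3 px, but 0.3 px of that was head movement.
print(adaptive_gaze(1.3, 0.3, calibration))   # -> 100.0
```

Unlike the geometry-based method, this variant needs no explicit distances or radii; it relies entirely on the displacements observed during initialization.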
3. OPERATIONAL REQUIREMENTS
3.1 Low Ambient Infrared Light
There must be low levels of ambient infrared light falling on the subject's eye.
Stray IR sources obscure the lighting from the Edge Analysis System's light emitting
diode and degrade the image of the eye. The sun and incandescent lamps contain high
levels of infrared light. The environment may be brightly illuminated, provided the
lights, such as fluorescent or mercury-vapor lamps, do not emit in the infrared region
of the spectrum. The Edge Analysis System also works well in the dark.
3.2 Eye Visibility
The camera must have a clear view of the subject's eye. If either his pupil or
the corneal reflections are occluded, there may be insufficient image information to
make an accurate gaze measurement. The camera's view of the eye can be obstructed
by an object between the camera and the eye, by the person's nose or cheek if his head
is rotated too far with respect to the camera, or by excessive squinting. Alternative
Edge Analysis software is included to accommodate for an obstructed image of the
top of the pupil, usually caused by a droopy eyelid or an unusually large pupil. The
software returns a "false" condition for the Eye Found flag whenever an adequate
image of the eye is not present.
3.3 Glasses and Contact Lenses
In most cases, eye tracking works with glasses and contact lenses. The
calibration procedure accommodates for the refractive properties of the lenses. When
wearing glasses, the glasses must not be tilted significantly downward; otherwise the
reflection of the LED off the surface of the glass is directed back into the camera and
obscures the image of the eye. The lens boundary in hard-line bifocal or trifocal
glasses often splits the camera's image of the eye, and the discontinuity in the image
invalidates the image measurements.
Fig 3.1: Practical Eye Gaze System
Soft contact lenses that cover all or most of the cornea generally work well
with the Edge Analysis System. The corneal reflection is obtained from the contact
lens surface rather than the cornea itself. Small, hard contacts can cause problems,
however, if the lenses move around considerably on the cornea, and the corneal
reflection moves across the discontinuity between the contact lens and the cornea.
4. EYE GAZE PERFORMANCE SPECIFICATIONS
4.1 Accuracy
Table 4.1: Accuracy

| Eye-Gaze Measurement | Angular Gaze Orientation | Spatial Gaze Point |
| --- | --- | --- |
| Typical average bias error (over the monitor screen range) | 0.45 degree | 0.15 inch |
| Maximum average bias error (over the monitor screen range) | 0.70 degree | 0.25 inch |
| Frame-to-frame variation | 0.18 degree | 0.06 inch |
Bias errors result from inaccuracies in the measurement of head range,
asymmetries of the pupil opening about the eye's optic axis, and astigmatism. They
are constant from frame to frame and cannot be reduced by averaging or smoothing.
Frame-to-frame variations result from image brightness noise and pixel
position quantization in the camera image and may be reduced by averaging or
smoothing.
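The distinction between the two error types can be illustrated numerically (synthetic data, using the spec's 0.45-degree bias and 0.18-degree variation figures): averaging many frames shrinks the random frame-to-frame variation toward zero, but the constant bias survives untouched.

```python
import random

random.seed(0)
bias = 0.45                                 # constant bias error (degrees)
noise_sd = 0.18                             # frame-to-frame variation (degrees)
samples = [bias + random.gauss(0.0, noise_sd) for _ in range(2000)]

mean = sum(samples) / len(samples)          # averaged gaze: bias remains
var = sum((s - mean) ** 2 for s in samples) / len(samples)
spread = var ** 0.5                         # per-frame random variation
stderr = spread / len(samples) ** 0.5       # residual noise after averaging

print(round(mean, 2), round(spread, 2), round(stderr, 3))
```

The averaged estimate still sits near 0.45 degrees of bias, while the residual random error has shrunk by a factor of sqrt(N).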
4.2 Speed
Sampling rate: 60 Hz (one gaze measurement per camera field)
4.3 Angular Gaze track Range
As the eye's gaze axis rotates away from the camera, the corneal reflection
moves away from the center of the cornea. Accurate gaze angle calculation ceases
when the corneal reflection "falls off" the edge of the cornea. The eye's gaze axis may
range up to 40 degrees away from the camera, depending on the arc of the person’s
cornea.
Gaze Cone diameter: 80 Degrees
The lower 15 degrees of the gaze cone, however, are generally clipped due to
the upper eyelid blocking the corneal reflection when the eye is looking down below
the camera.
4.4 Tolerance to Head Motion
Lateral Range: 1.5 inch (3.8 cm)
Vertical Range: 1.2 inch (3.0 cm)
Longitudinal Range: 1.5 inch (3.8 cm)
In fixed-camera Edge Analysis Systems, the eye must remain within the field
of view of the camera. However, if the subject moves away from the camera's field of
view, eye tracking will resume once he returns to a position where his eye is again
visible to the camera.
4.5 Computer Usage
Memory Consumption: 6 MB
CPU Time Consumption: 30-50%
4.6 Light Emitting Diode
Wave Length: 880 nanometers (near infrared)
Beam Width: 20 degrees, between half power points
Radiated Power: 20 milliwatts, radiated over the 20-degree beam width
Safety Factor: 5 -- at a range of 15 inches, the LED illumination
on the eye is 20% of the HEW maximum permissible exposure.
5. MENU OF EYE GAZE SYSTEM
The Main Menu appears on the screen as soon as the user completes a 15-
second calibration procedure. The Main Menu presents a list of available Eye-gaze
programs. The user calls up a desired program by looking at the Eye-gaze key next to
his program choice.
Fig 5.1: Main Menu
5.1 The telephone program
The telephone program allows the user to place and receive calls. Frequently
used numbers are stored in a telephone "book".
Fig 5.2 Telephone Menu
5.2 The Phrase Program
The Phrases program, along with the speech synthesizer, provides quick
communications for non-verbal users. Looking at a key causes a preprogrammed
message to be spoken. The Phrases program stores up to 126 messages, which can be
composed and easily changed to suit the user.
5.3 Typewriter Program
Simple word processing can be done using the Typewriter Program. The user
types by looking at keys on visual keyboards. Four keyboard configurations, simple to
complex, are available. Typed text appears on the screen above the keyboard display.
The user may "speak" or print what he has typed. He may also store typed text in a
file to be retrieved at a later time. The retrieved text may be verbalized, edited or
printed.
Fig 5.3: Typewriter Menu
5.4 Run Second PC
The Run Second PC program permits the Eye-gaze Communication System to
act as a peripheral keyboard and Mouse interface to a Windows computer. The user
can run any off-the-shelf software he chooses on the second computer. He can access
the Internet, and send e-mail by looking at keyboard and mouse control screens on the
Eye-gaze monitor. The programs being run are displayed on the second computer's
monitor. Typed text appears simultaneously on the Eye-gaze and second pc's screens
For children, two new Eye-gaze programs have been added to the Eye-gaze
System. Both run with the Second PC option. Eye Switch is a big, basic on-screen
switch to run "cause & effect" software programs on a Second PC. Simple Mouse is
an easy mouse control program to provide simplified access to educational software
on a Second PC.
Fig 5.4: Run PC Menu
5.5 Television
Television programs can be displayed directly on the desktop Eye-gaze System
screen. On-screen volume and channel controls provide independent operation (not
available on the Portable Eye-gaze System). A web browsing system using eye-gaze
input has also been developed: the eye-gaze input system has been reported as a novel
human-machine interface, using a personal computer and a home video camera to
detect eye gaze under natural light, and a new web browsing interface has been
proposed for this conventional eye-gaze input system.
5.6 Paddle games & Score Four
These are visually controlled games.
5.7 Read Text Program
The Read Text Program allows the user to select text for display and to "turn pages"
with his eyes. Any ASCII format text can be loaded for the user to access. Books on
floppy disk are available from Services for the Blind.
5.8 The Lights & appliances Program
The Lights & Appliances Program, which includes computer-controlled
switching equipment, provides Eye-gaze control of lights and appliances anywhere in
the home or office. No special house wiring is necessary. The user turns appliances on
and off by looking at a bank of switches displayed on the screen.
Fig 5.5: Light & Appliances Menu
6. SKILLS NEEDED BY USERS
6.1 Good control of one eye
The user must be able to look up, down, left and right. He must be able to fix
his gaze on all areas of a 15-inch screen that is about 24 inches in front of his face. He
must be able to focus on one spot for at least 1/2 second.
6.2 Adequate vision
The user should be able to view the screen correctly.
6.3 Ability to maintain a position in front of the Eye-gaze monitor
It is generally easiest to run the System from an upright, seated position, with
the head centered in front of the Eye-gaze monitor. However the Eye-gaze System can
be operated from a semi-reclined position if necessary. Continuous, uncontrolled
head movement can make Eye-gaze operation difficult, since the Eye-gaze System
must relocate the eye each time the user moves away from the camera’s field of view
and then returns. Even though the System's eye search is completed in just a second
or two, operating the System will be more tiring for a user with constant head
movement.
6.4 Mental Abilities:
Ability to read
Memory
Cognition
7. EYE-GAZE TRACKER OUTPUTS
Eye-Found Flag: Eye Found - a true/false flag indicating whether or not the eye
image was found in this camera field; the flag goes false, for example, when the
person blinks, squints excessively, looks outside the gaze cone, or exits the
head-position envelope.

Gaze Point: X Gaze, Y Gaze - the intercept of the gaze line on the monitor screen
plane (or another user-defined plane, such as a control panel), in inches,
millimeters, or computer monitor pixels, measured with respect to the center of
the screen.

Pupil Diameter: Pupil Diameter Mm - the pupil diameter, measured in millimeters.

Synchronization Counter: Camera Field Count - a time counter indicating the
number of camera fields that have occurred since a user-specified reference.
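The per-field outputs listed above map naturally onto a small record type. The sketch below is illustrative; the field names are ours, not the vendor's API.

```python
from dataclasses import dataclass

@dataclass
class EyegazeSample:
    """One per-camera-field output record (illustrative names)."""
    eye_found: bool           # False on blinks, squints, off-cone gaze
    x_gaze: float             # gaze-line intercept on the screen plane
    y_gaze: float             # (inches, millimeters, or pixels)
    pupil_diameter_mm: float  # pupil diameter in millimeters
    field_count: int          # camera fields since a user-set reference

sample = EyegazeSample(True, 1.2, -0.4, 4.5, 1800)
print(sample.eye_found, sample.field_count)   # -> True 1800
```

A consumer of the stream would check `eye_found` before trusting `x_gaze` and `y_gaze`, and use `field_count` to detect dropped fields.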
8. DEVELOPMENTS IN EYE-GAZE TECHNOLOGY
LC Technologies has recently developed a Portable Eye-gaze System, which can be
mounted on a wheelchair and run from a 12-volt battery or a wall outlet. It weighs
only 6 lbs (2.7 kg), and its dimensions are 2.5" x 8" x 9" (6.5 cm x 20 cm x 23 cm).
The Portable Eye-gaze System comes with a flat-screen monitor and a table mount
for the monitor; the monitor can be lifted off the table mount and slipped into a
wheelchair mount.
Fig 8.1: Eye-Gaze System in a Wheelchair
8.1 Computer interaction using Eye-Gaze system
Vision-based user-computer interfaces include both eye-gaze pointing and gestures,
and user-interface interaction based on eye-gaze control has been reviewed in the
literature. Eye gaze, when tracked and used to control image displays, may suffer
from computational requirements that lead to latency problems, and from the
obtrusiveness of the camera and positioning apparatus. With the eye-tracking
environment used here, the latency problems are completely overcome, and a
relatively successful solution to obtrusiveness is achieved. Two further issues had to
be addressed during implementation. Eye-gaze coordinates are at all times subject to
small, seemingly random displacements, which result in "flickering" of the eye-gaze
coordinates. Even though the coordinates given by the eye-gaze tracking system are
averaged over a number of values, the output coordinate "flickering" was quite
DEPT OF AEI IESCE 25
appreciable. Rather than applying a moving-average smoothing filter, we dealt with
this issue by finding a good compromise between the tolerance and index reference
parameters. Larger values of the tolerance parameter reduce the flickering effects,
but also reduce the resolution of the Visual Mouse; smaller values of the index
reference parameter generate the mouse click more quickly, which decreases the
effect of flickering. We found our trade-off values of these two system parameters
to be robust for the applications described below. However, we do not exclude the
possibility that filtering the incoming eye-gaze coordinate data stream could lead
to a more automated approach. We are currently studying the statistical properties
of empirical eye-gaze coordinate data streams with such issues in mind.
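The tolerance/index-reference trade-off described above can be sketched as a simple dwell-click rule: a counter advances while successive gaze samples stay within `tolerance` pixels of the current fixation point, and a click fires once the counter reaches the index reference value. This is an illustrative reconstruction, not the system's actual code:

```python
import math

def dwell_clicks(samples, tolerance=30.0, index_reference=15):
    """Return the sample indices at which a dwell 'click' fires.

    samples: iterable of (x, y) gaze coordinates in pixels.
    tolerance: max distance from the current fixation point; larger
               values suppress flicker but lower pointing resolution.
    index_reference: consecutive in-tolerance samples required before
                     a click is generated; smaller values click sooner.
    """
    clicks = []
    fix_x = fix_y = None
    count = 0
    for i, (x, y) in enumerate(samples):
        if fix_x is None or math.hypot(x - fix_x, y - fix_y) > tolerance:
            fix_x, fix_y, count = x, y, 0   # gaze moved: restart fixation
        else:
            count += 1
            if count == index_reference:    # dwelled long enough: click
                clicks.append(i)
                count = 0                   # re-arm for the next click
    return clicks
```

A steady fixation of sixteen identical samples produces exactly one click with the defaults, while gaze that jumps farther than the tolerance on every sample never clicks at all.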
Fig 8.2: Computer Interaction
A direct implication of reducing the resolution of the Visual Mouse is as
follows. The Visual Mouse works very well when hot links with big icons are
involved. Dealing with smaller clickable icon links, however, is troublesome. An
example of the latter was our attempt to use the Back button on a regular web browser
window, in the context of web surfing. The Back button proved to be too small for the
Visual Mouse to operate effectively.
A second issue addressed was the accuracy of the calibration procedure.
The procedure for calibrating the PC screen for each subject and session used nine
points to calculate the mapping that relates the subject's angle of gaze to positional
coordinates on the approximately planar PC monitor screen. Good calibration of these
nine points is crucial for the positional accuracy of subsequent subject eye-gaze
locations. Notwithstanding the accuracy with which this is
done, there is some decrease of accuracy whenever the subject looks at the PC screen
at locations away from the calibration points. One way to avoid the decrease in
accuracy in defining eye-gaze positions on a planar screen is to use a greater number
of calibration points. A seventeen-point calibration is possible, and will give better
results, but it requires appreciably more time to carry out. In summary, we can gain in
accuracy of pinpointing eye-gaze locations at the expense of time and effort (and
hence subject fatigue) taken in calibration.
All results reported below used nine-point calibration, which provided an
adequate trade-off between the subject's work on calibration and the positional
precision of the data consequently obtained.
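The calibration step can be sketched as a least-squares fit of a low-order polynomial mapping raw gaze measurements to screen coordinates. The quadratic feature set below is an assumption for illustration; the report does not state the actual model used:

```python
import numpy as np

def fit_calibration(raw_pts, screen_pts):
    """Fit screen = features(raw) @ coef from calibration point pairs.

    raw_pts, screen_pts: (n, 2) arrays. With the 6-term quadratic model
    below, n >= 6 is needed; the nine calibration points over-determine it.
    """
    raw = np.asarray(raw_pts, float)
    scr = np.asarray(screen_pts, float)
    x, y = raw[:, 0], raw[:, 1]
    # Feature matrix: [1, x, y, x*y, x^2, y^2] for each calibration point.
    F = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
    coef, *_ = np.linalg.lstsq(F, scr, rcond=None)  # (6, 2) coefficients
    return coef

def map_gaze(coef, raw_xy):
    """Map one raw gaze measurement to predicted screen coordinates."""
    x, y = raw_xy
    f = np.array([1.0, x, y, x * y, x**2, y**2])
    return f @ coef
```

Adding calibration points (e.g. seventeen instead of nine) simply adds rows to the feature matrix, which is why accuracy improves at the cost of a longer calibration session.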
8.2 Large Image Display in Astronomy and Medicine
Support of the transfer of very large images in a networked (client-server)
setting requires compression, prior noise separation, and, preferably, progressive
transmission. The latter consists of quickly visualizing a low-resolution image and
then, over time, increasing its quality. A simple form of this, using block-based
regions of interest, was used in our work. The figure illustrates the design of
a system allowing for decompression by resolution scale and by region block. It is the
design used in grayscale and color compression algorithms implemented in MR
(2001). Systems have been prototyped which allow for decompression at full
resolution in a particular block, or at given resolutions in regions around where the
user points to with a cursor. Wavelet transform based methods are very attractive for
support of compression and full resolution extraction of regions of interest, because
they integrate a multi-resolution concept in a natural way. Figures 3a and 3b
exemplify a standalone system on a portable PC using cultural heritage images. This
same technique can be used to view digitized land-use maps.
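The resolution-scale-plus-block design described above can be sketched with a simple image pyramid: the client first displays a coarse level, then requests full-resolution tiles around the point of interest. The mean-downsampling pyramid here is a stand-in for the wavelet transform actually used:

```python
import numpy as np

def build_pyramid(img, levels=3):
    """Coarse-to-fine pyramid by 2x2 mean downsampling (wavelet stand-in)."""
    pyr = [np.asarray(img, float)]
    for _ in range(levels - 1):
        a = pyr[-1]
        h, w = a.shape[0] // 2 * 2, a.shape[1] // 2 * 2
        a = a[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        pyr.append(a)
    return pyr  # pyr[0] is full resolution, pyr[-1] is coarsest

def fetch_block(pyr, row, col, size=64):
    """Return the full-resolution tile containing (row, col) -- what the
    server would decompress on demand around the user's pointing position."""
    full = pyr[0]
    r0 = (row // size) * size
    c0 = (col // size) * size
    return full[r0:r0 + size, c0:c0 + size]
```

In the eye-gaze setting, `(row, col)` would come from the gaze coordinates instead of a cursor, so a dwell on a region triggers full-resolution decompression of just that block.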
Two demonstrators had been set up on the web prior to this Visual Mouse
work. The cultural heritage image example shown in Figures 3a and 3b is further
discussed at http://strule.cs.qub.ac.uk/zoom.html. The image is originally a 13 MB
JPEG image; compressed, it is 1 MB in size. Decompression of a block is carried out
in real time. The compression method used, which supports color, is lossy and is
based on the widely used biorthogonal 9/7 Daubechies-Antonini wavelet transform.
A medical example is accessible at http://strule.cs.qub.ac.uk/imed.html. This
image has been compressed from 8.4 MB to 568 kB, and again decompression (and
format conversion) of limited-area blocks is carried out effectively in real time. The
compression method used in this case is rigorously lossless, and supports grayscale
images. The multi-resolution transform used is a pyramidal median transform. Further
details on the compression algorithms used can be found in Starck et al. (1996, 1998),
Louys et al. (1999a, 1999b), Murtagh et al. (1998, 2001a, 2001b).
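A pyramidal median transform can be sketched as: at each scale, median-filter and decimate, and store the difference between the current level and the upsampled coarser level; adding the differences back in reverse order reconstructs the image exactly, which is what makes a lossless scheme possible. This sketch uses SciPy's `median_filter` and `zoom`; the algorithmic details in Starck et al. differ:

```python
import numpy as np
from scipy.ndimage import median_filter, zoom

def pmt_decompose(img, levels=3):
    """Pyramidal median transform sketch: detail layers + coarse residual."""
    c = np.asarray(img, float)
    details = []
    for _ in range(levels - 1):
        smooth = median_filter(c, size=3)
        coarse = smooth[::2, ::2]                       # decimate
        up = zoom(coarse, 2, order=1)[:c.shape[0], :c.shape[1]]
        details.append(c - up)                          # detail at this scale
        c = coarse
    return details, c

def pmt_reconstruct(details, coarse):
    """Invert pmt_decompose; exact because the same upsampling is reused."""
    c = coarse
    for d in reversed(details):
        up = zoom(c, 2, order=1)[:d.shape[0], :d.shape[1]]
        c = up + d
    return c
```

The median filter, unlike a linear smoother, is robust to the outlier pixels common in astronomical and medical imagery, which is one motivation for this transform family.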
The eye-gaze control system can be used to operate these web-based
demonstrations. The block sizes are sufficiently large that no precision problems are
encountered with these types of applications.
General context for such scientific and medical applications is as follows. New
display and interaction environments for large scientific and medical images are
needed. With pixel dimensions up to 16,000 x 16,000 in astronomy, which is the case
of detectors at the CFHT (Canada-France-Hawaii Telescope, Hawaii) and the UK’s
Vista telescope to be built at the European Southern Observatory’s facility at Cerro
Paranal, Chile, it is clear that viewing “navigation” support is needed. Even a
digitized mammogram in telemedicine, of typical pixel dimensions 4500 x 4500,
requires a display environment. Our work concerns therefore both image
compression, and also a range of other allied topics – progressive transmission, views
based on resolution scale, and quick access to full-resolution regions of interest.
Ongoing work by us now includes the following goals: (i) better design of
web-based display, through enhancing these demonstrators (including more
comprehensive navigation support for large image viewing, and a prototyping of eye-
gaze controlled movement around three-dimensional scenes); and (ii) support for
further Visual Mouse interaction modes. Chief among interaction modes is a “back”
or “return” action based on lack of user interest, expressed as lack of gaze
concentration in a small region.
8.3 Eye-Gaze Control of Multiple Video Streams
New approaches to interacting with multiple streams of multimedia data are
demonstrated with a number of presentations from a recent workshop, each with a
streaming video record of what was discussed and presented. If the observer's eye dwells
sufficiently long on one of the panels, the video presentation is displayed in that panel.
If the observer's interest, as measured by eye-gaze, remains on the panel, the video
continues to play. At any time the observer's interest may wander; if his or her eye
dwells sufficiently long on another panel, the previously playing video is replaced
with a name-plate and a video stream begins playing in the new panel. An HTML
OBJECT element is used to insert an ActiveX component into the HTML document,
along with all of the information needed to implement and run the object.
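The dwell-based panel switching just described is a small state machine: track which panel the gaze falls in, and switch the active video only after the gaze has stayed on a different panel for a dwell threshold. A minimal sketch, with an assumed sample rate and panel identifiers:

```python
class PanelSwitcher:
    """Switch the playing video to whichever panel the eye dwells on."""

    def __init__(self, dwell_samples=30):  # e.g. ~0.5 s at 60 Hz (assumed)
        self.dwell_samples = dwell_samples
        self.active = None      # panel whose video is currently playing
        self.candidate = None   # panel the gaze is currently resting on
        self.count = 0

    def update(self, panel):
        """Feed the panel id under the current gaze sample.

        Returns the newly activated panel id, or None if no switch occurs."""
        if panel == self.active:
            self.candidate, self.count = None, 0   # interest stayed put
            return None
        if panel != self.candidate:
            self.candidate, self.count = panel, 1  # gaze moved to a new panel
        else:
            self.count += 1
        if self.count >= self.dwell_samples:
            self.active = panel  # old panel reverts to its name-plate
            self.candidate, self.count = None, 0
            return panel
        return None
```

Brief glances shorter than the dwell threshold never interrupt the playing video, which is what distinguishes deliberate interest from wandering gaze.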
Fig 8.3: Eye-Gaze Video Streaming
9. ADVANTAGES OF EYE GAZE SYSTEM
A wide variety of disciplines use eye tracking techniques, including cognitive
science, psychology (notably psycholinguistics and the visual world paradigm), human-
computer interaction (HCI), marketing research and medical research (neurological
diagnosis). Specific applications include the tracking of eye movement in language
reading, music reading, human activity recognition, the perception of advertising, and
the playing of sport. Eye gaze systems are being used to write books, attend school
and enhance the quality of life of people with disabilities all over the world.
Uses include:
Type a letter
Operate a telephone
Run computer software
Operate a computer mouse, and access the internet and e-mail
Medical Research
Laser refractive surgery
Human Factors
Computer Usability
Translation Process Research
Vehicle Simulators
In-vehicle Research
Training Simulators
Adult Research
Sports Training
MRI / MEG / EEG
Finding good clues
Communication systems for disabled
Improved image and video communications
10. PERSONAL CONTRIBUTIONS AND VIEWS
Through this seminar a reliable and accurate eye gaze system was presented.
It argues that it is possible to use the eye-gaze of a computer user in the interface
to aid the control of applications. I found that eye gaze systems can be improved and
extended well beyond their current position. At this early stage, eye gaze systems are
linked only with software, but I understood that they can also be linked with hardware
and thereby become friendlier to users. It is argued that eye-gaze tracking data is
best used in multimodal interfaces where the user interacts with the data instead of
the interface, in so-called non-command user interfaces.
11. CONCLUSION
Today, the human eye-gaze can be recorded by relatively unremarkable
techniques. This report argues that it is possible to use the eye-gaze of a computer
user in the interface to aid the control of the application. Care must be taken, though,
that eye-gaze tracking data is used in a sensible way, since human eye movements
combine several voluntary and involuntary cognitive processes. The main reason
eye-gaze based user interfaces are attractive is that the direction of the eye-gaze can
express the interests of the user: it is a potential porthole into the current cognitive
processes, and communication through the direction of the eyes is faster than any
other mode of human communication. It is argued that eye-gaze tracking data is best
used in multimodal interfaces where the user interacts with the data instead of the
interface, in so-called non-command user interfaces.
REFERENCES
(1) Laurence R. Young and David Sheena, "Survey of Eye Movement Recording Methods," Behavior Research Methods and Instrumentation, Vol. 7, No. 5, pp. 397-429, 1975.
(2) Arie E. Kaufman, Amit Bandopadhay, and Bernard D. Shaviv, "An Eye Tracking Computer User Interface," Research Frontiers in Virtual Reality Workshop Proceedings, IEEE Computer Society Press, pp. 78-84, October 1993.
(3) Glenn A. Myers, Keith R. Sherman, and Lawrence Stark, "Eye Monitor," IEEE Computer, pp. 14-21, March 1991.
(4) Yoshinobu Ebisawa, "Improved Video-Based Eye-Gaze Detection Method," IEEE IMTC '94, Hamamatsu, May 1994.
(5) Thomas E. Hutchinson, K. Preston White, Jr., Worthy N. Martin, Kelly C. Reichert, and Lisa A. Frey, "Human-Computer Interaction Using Eye-Gaze Input," IEEE Transactions on Systems, Man, and Cybernetics, Vol. 19, No. 6, pp. 1527-1534, 1989.
(6) C. Colombo, S. Andronico, and P. Dario, "Prototype of a Vision-Based Gaze-Driven Man-Machine Interface," Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, August 1995.
(7) Christophe Collet, Alain Finkel, and Rachid Gherbi, "CapRe: A Gaze Tracking System in Man-Machine Interaction," Proceedings of the IEEE International Conference on Intelligent Engineering Systems, September 1997.
(8) Baoshen Hu and Minghua Qiu, "A New Method for Human-Computer Interaction by Using Eye Gaze," Proceedings of the IEEE International Conference on Systems, Man and Cybernetics, October 1994.
(9) Rainer Stiefelhagen, "Gaze Tracking for Multimodal Human-Computer Interaction," Diplomarbeit, Universität Karlsruhe, September 1996.
(10) Shumeet Baluja and Dean Pomerleau, "Non-Intrusive Gaze Tracking Using Artificial Neural Networks," CMU Technical Report CMU-CS-94-102, School of Computer Science, Carnegie Mellon University, January 1994.
(11) Philippe Ballard and George C. Stockman, "Computer Operation via Face Orientation," Proceedings of the 11th IAPR International Conference on Pattern Recognition, Conference A: Computer Vision and Applications, Vol. 1, 1992.
(12) A. H. Gee and R. Cipolla, "Determining the Gaze of Faces in Images," Technical Report CUED/F-INFENG/TR 174, Department of Engineering, University of Cambridge, March 1994.
(13) P. Viola and M. Jones, "Rapid Object Detection Using a Boosted Cascade of Simple Features," Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 2001.
(14) R. Lienhart and J. Maydt, "An Extended Set of Haar-like Features for Rapid Object Detection," Proceedings of the IEEE International Conference on Image Processing, Vol. 1, pp. 900-903, 2002.
(15) J.-Y. Bouguet, "Pyramidal Implementation of the Lucas-Kanade Feature Tracker: Description of the Algorithm," Intel Corporation, Microprocessor Research Labs, OpenCV Document, 1999.
(16) C. E. Rasmussen and C. K. I. Williams, "Gaussian Processes for Machine Learning," MIT Press, 2006.