
Research Article
A System of Driving Fatigue Detection Based on Machine Vision and Its Application on Smart Device

Wanzeng Kong,1 Lingxiao Zhou,1 Yizhi Wang,1 Jianhai Zhang,1 Jianhui Liu,1 and Shenyong Gao2

1 College of Computer Science, Hangzhou Dianzi University, Hangzhou 310018, China
2 School of Information Engineering and Art Design, Zhejiang University of Water Resources and Electric Power, Hangzhou 310018, China

Correspondence should be addressed to Wanzeng Kong; kongwanzeng@hdu.edu.cn

Received 2 September 2014; Accepted 16 December 2014

Academic Editor: Mehmet Karakose

Copyright © 2015 Wanzeng Kong et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Driving fatigue is one of the most important factors in traffic accidents. In this paper, we propose an improved strategy and a practical system to detect driving fatigue based on machine vision and the Adaboost algorithm. Several face and eye classifiers are trained in advance with the Adaboost algorithm. The proposed strategy first detects the face efficiently with classifiers for the front face and the deflected face. Then, the candidate region of the eyes is determined according to the geometric distribution of facial organs. Finally, trained classifiers for open and closed eyes are used to detect the eyes in the candidate region quickly and accurately. The indexes, which consist of PERCLOS and the duration of the closed-eye state, are extracted from video frames in real time. Moreover, the system is transplanted onto smart devices, that is, smartphones or tablets, owing to their built-in cameras and powerful computing performance. Practical tests demonstrated that the proposed system can detect driver fatigue in real time with high accuracy. As the system has been ported to portable smart devices, it could be widely used for driving fatigue detection in daily life.

1. Introduction

Driving fatigue is a common phenomenon caused by long periods of driving or lack of sleep, and it is a significant potential hazard to traffic safety. As many as 100,000 traffic accidents caused by driving fatigue, leading to 400,000 injuries and 1,550 deaths, occur in the United States each year [1]. Research on driving fatigue detection is therefore becoming a popular issue all over the world.

Currently, detection methods for driving fatigue can be divided into four main categories [2]. (1) The first category comprises methods based on drivers' physiological signals [3, 4], such as the electroencephalograph (EEG), electrocardiograph (ECG), and electrooculogram (EOG). These methods usually achieve good fatigue detection performance; however, conveniently obtaining clean signals remains a problem in practical applications [5–7]. (2) The second category comprises methods based on drivers' operation behavior. The literature reports that driving fatigue can be detected through drivers' operations such as steering wheel handling [8, 9]. When the driver is falling into fatigue, he will reduce the grip strength on the steering wheel or lose part of the ability to control it [10]. (3) The third category comprises methods based on vehicle state. The trail of the vehicle and lane departure information are also useful for detecting fatigue [11]. Both trail and lane information are correlated with steering wheel control; hence, they also reflect the driver's operation but are noncontact. (4) The fourth category comprises methods based on drivers' physiological reactions. Fatigue can be detected from physiological behaviors such as blinking and yawning, among which the most effective method is based on detection of eye states [12–15]. Generally speaking, the frequency and duration of the closed-eye state increase and those of the open-eye state decrease when drivers become fatigued. Eriksson and Papanikolopoulos [16] proposed a method in which eye states are recognized by a camera fixed on the dashboard, and drivers' fatigue is detected by recognizing a closed state lasting 2 or 2.5 continuous seconds. Furthermore, when fatigue occurs, people often yawn and their mouths open obviously, so the detection of the mouth state by a camera is also an effective method for detecting driver fatigue.



Shi et al. [17] used a BP neural network and computer vision to detect mouth states and estimate the driver's mental state.

However, all of the above methods require special extra equipment. For example, methods based on drivers' operation behavior [18] require pressure sensors [19] and angular transducers to detect the driver's behavior on the steering wheel. Methods based on vehicle state and on drivers' physiological reactions require cameras to record conditions inside and outside the car. In particular, the methods based on physiological signals require even more expensive and complicated EEG acquisition equipment such as an EEG cap, electrodes, and a signal amplifier. Moreover, all of these methods usually need an extra computer or embedded computing board for signal processing and decision making.

Nowadays, tablets and smartphones are so popular that almost every driver owns one, and most of them are equipped with an excellent camera and a powerful computing/processing unit. The main contribution of this paper is an improved strategy and practical system to detect driving fatigue using a smart device. First, a practical machine vision system based on an improved Adaboost algorithm is developed to detect driving fatigue by checking the eye state in real time. Then, this system is easily transplanted onto a smart device, such as a tablet, to perform the fatigue detection task. Practical tests demonstrate that detecting driving fatigue with a smart device is convenient and low cost, yet as effective as other methods.

The rest of this paper is organized as follows. Section 2 provides a detailed description of the system, including the Adaboost algorithm, face detection, eye localization, state recognition, and the whole detection strategy. Experimental results and data analysis are presented in Section 3. The system transplantation is described in Section 4. Finally, the conclusion is presented in Section 5.

2. Adaboost Algorithm and Improved System

The system is composed of four parts: image preprocessing, face detection, eye state recognition, and fatigue evaluation. In this system, images are obtained by an external high-resolution camera placed at the front left of the driver. The first step is to denoise the images. Then, after the face region is detected, the eye location and state can be obtained easily and quickly within the detected face region. Finally, the driver's fatigue can be detected by analyzing the square-wave diagram of eye states in real time.

2.1. Image Preprocessing. Video images are inevitably contaminated by noise to different degrees, rooted in several factors such as driving conditions, underexposure, overexposure, or device nonlinearity. Original images are first converted to gray-scale images so that they can be used directly by the classifiers. The images may also suffer from low contrast and blurring; thus, histogram equalization is the next important step. The goal of histogram equalization is to highlight facial features by enhancing the contrast of the gray-scale images and reducing interference caused by uneven illumination.
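As a concrete illustration of this preprocessing step, the following is a minimal sketch using OpenCV's Python bindings; the median filter stands in for the unspecified denoising step, so that part is an assumption rather than the authors' exact pipeline.

```python
import cv2

def preprocess(frame):
    """Denoise, convert to gray scale, and equalize the histogram.

    The median filter is only a stand-in for the paper's unspecified
    denoising step; the gray conversion and histogram equalization
    follow the description in Section 2.1.
    """
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)  # classifiers expect gray images
    gray = cv2.medianBlur(gray, 3)                  # assumed denoising step
    return cv2.equalizeHist(gray)                   # enhance contrast, reduce uneven lighting
```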

2.2. Adaboost Algorithm and Classifier Training. The Adaboost algorithm is a boosting algorithm proposed by Freund and Schapire [20]; it selects weak classifiers and automatically integrates them into a strong classifier.

Viola and Jones [21] proposed an Adaboost algorithm based on Haar features and trained a frontal face classifier with it. The algorithm achieves high detection accuracy and is faster than almost all other real-time algorithms; the excellent performance of the frontal face classifier owes much to the Adaboost algorithm.

Steps of the Adaboost Algorithm. Select training samples $(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)$, where $x_i$ is the $i$th training sample and $y_i \in \{0, 1\}$ labels it as a negative or positive sample, respectively. The positive samples are used to train the classifier to identify similar characteristics; the negative samples are used to rule out differing characteristics. The number of positive samples is $s$, the number of negative samples is $m$, and $n = m + s$.

(1) Initialize the weight of each sample: $w_{1,i} = 1/n$, $i = 1, 2, \ldots, n$.

(2) Repeat the following four steps for $t = 1, 2, \ldots, T$ ($T$ is the optimal number of weak classifiers):

(a) Normalize the weights into a probability distribution:

$$q_{t,i} = \frac{w_{t,i}}{\sum_{j=1}^{n} w_{t,j}}. \quad (1)$$

(b) For each feature $f$, train a weak classifier $h_f$ and compute the weighted error rate $\varepsilon_f$ of the weak classifier corresponding to that feature:

$$\varepsilon_f = \sum_{i} q_{t,i} \left| h_f(x_i) - y_i \right|. \quad (2)$$

(c) Choose the best weak classifier $h_t(x)$ from step (b), that is, the one with the minimum error rate $\varepsilon_t$.

(d) According to the best weak classifier, adjust the weights:

$$w_{t+1,i} = w_{t,i}\, \beta_t^{1 - e_i}, \quad (3)$$

where $e_i = 0$ if sample $x_i$ is classified correctly, $e_i = 1$ if it is misclassified, and $\beta_t = \varepsilon_t / (1 - \varepsilon_t)$.

(3) Finally, form the strong classifier:

$$h(x) = \begin{cases} 1, & \text{if } \sum_{t=1}^{T} \alpha_t h_t(x) \ge \dfrac{1}{2} \sum_{t=1}^{T} \alpha_t, \\ 0, & \text{otherwise,} \end{cases} \qquad \text{where } \alpha_t = \log\frac{1 - \varepsilon_t}{\varepsilon_t}. \quad (4)$$
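The boosting loop above can be sketched as follows. This is a minimal illustration in Python with decision stumps as weak classifiers and a crude (mean-value) threshold per feature; it assumes the Haar feature responses have already been computed for every sample and is not the authors' training code.

```python
import numpy as np

def train_adaboost(feature_values, labels, T):
    """Minimal discrete AdaBoost loop in the Viola-Jones style.

    feature_values: (n_samples, n_features) matrix of precomputed Haar
                    feature responses (assumed to be available already).
    labels:         array of 0/1 sample labels (1 = target, 0 = non-target).
    T:              number of boosting rounds (weak classifiers).
    Returns a list of (feature_index, threshold, polarity, alpha) tuples.
    """
    n, n_features = feature_values.shape
    w = np.full(n, 1.0 / n)                      # step (1): uniform initial weights
    strong = []

    for t in range(T):
        q = w / w.sum()                          # step (2a): normalize weights

        best = None
        for f in range(n_features):              # step (2b): one stump per feature
            v = feature_values[:, f]
            thr = v.mean()                       # crude threshold choice for the sketch
            for polarity in (1, -1):
                pred = (polarity * v < polarity * thr).astype(int)
                err = np.sum(q * np.abs(pred - labels))   # equation (2)
                if best is None or err < best[0]:
                    best = (err, f, thr, polarity, pred)

        eps, f, thr, polarity, pred = best       # step (2c): best weak classifier
        eps = min(max(eps, 1e-10), 1 - 1e-10)
        beta = eps / (1.0 - eps)
        e = (pred != labels).astype(int)         # e_i = 0 if correct, 1 otherwise
        w = w * beta ** (1 - e)                  # equation (3): down-weight correct samples
        alpha = np.log(1.0 / beta)               # alpha_t = log((1 - eps) / eps)
        strong.append((f, thr, polarity, alpha))

    return strong

def strong_classify(strong, feature_row):
    """Equation (4): weighted vote of the selected weak classifiers."""
    total = sum(a for (_, _, _, a) in strong)
    score = sum(a for (f, thr, p, a) in strong
                if p * feature_row[f] < p * thr)
    return int(score >= 0.5 * total)
```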


Figure 1: Haar characteristics.

In this paper, Haar features are used to train the Adaboost classifiers. Extraction of Haar features and calculation of feature values are two important aspects of the Adaboost algorithm. Figure 1 shows the five simplest kinds of gray rectangle features. There are about 160,000 rectangle features in a window of 24 × 24 pixels, which shows their complexity and diversity. The feature value is computed by subtracting the sum of the pixel values in the black rectangles from that in the white rectangles. Viola and Jones [21] showed that the speed of training and detection is greatly enhanced by applying the integral image to feature computation; in Viola's method, a rectangle feature can be obtained by reading a few points of the integral image. The integral image is defined as follows: for a point (x, y) in an image, its integral image value ii(x, y) is

$$ii(x, y) = \sum_{x' \le x,\; y' \le y} i(x', y'), \quad (5)$$

where $i(x', y')$ is the gray value of the point $(x', y')$.

Large collections of target and nontarget images are used to build the positive and negative sample libraries when training a specific target classifier.
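A minimal sketch of equation (5), and of reading one two-rectangle Haar feature from the integral image, is given below; the function names are illustrative only and are not taken from the paper.

```python
import numpy as np

def integral_image(gray):
    """Equation (5): ii(x, y) is the sum of all gray values i(x', y') with
    x' <= x and y' <= y.  A zero row and column are prepended so that
    rectangle sums can be read with four lookups and no edge checks."""
    ii = gray.astype(np.int64).cumsum(axis=0).cumsum(axis=1)
    return np.pad(ii, ((1, 0), (1, 0)))

def rect_sum(ii, x, y, w, h):
    """Sum of the pixels in the rectangle with top-left corner (x, y),
    width w, and height h, using four integral-image lookups."""
    return ii[y + h, x + w] - ii[y, x + w] - ii[y + h, x] + ii[y, x]

def two_rect_haar_feature(ii, x, y, w, h):
    """One of the simplest templates in Figure 1: the pixel sum of the
    white left half minus the pixel sum of the black right half."""
    half = w // 2
    return rect_sum(ii, x, y, half, h) - rect_sum(ii, x + half, y, half, h)
```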

In this paper, we use face images from the MIT, Yale, and ORL face databases to train the face classifiers. In total, 3000 samples of front faces and 1500 samples of deflected faces are used, normalized to a resolution of 20 × 20 pixels.

For eye detection, this system also trains open-eye and closed-eye classifiers with the Adaboost algorithm. However, there is no public human eye library providing both positive and negative eye samples, so we established an eye library ourselves. As shown in Figure 2, a large number of single-eye images can be obtained from processed images in the face library; they are classified into an open-eye library and a closed-eye library. Eventually, the numbers of open-eye and closed-eye samples are 1190 and 700, respectively, with a normalized resolution of 15 × 7 pixels.

After the eye library is established, the Adaboost algorithm spends most of its time training the target classifiers, which then show good precision and real-time performance in detection.

Figure 2: Face library and the constructed eye libraries (open-eye library and closed-eye library).

2.3. Improved Detection Strategy. In this paper, the Adaboost algorithm proposed by Viola is employed to train face classifiers and detect faces. However, traditional methods focus only on front face samples. If the deflection angle of the driver's face is small, the algorithm can still be applied to face detection, but the classifier fails to capture the target when the driver's face is deflected at a large angle.

In view of this shortcoming, we improved the face detection strategy based on Viola's method. Left and right deflected-face classifiers can be obtained by training on left and right deflected face samples. If front face detection fails, the two deflected-face classifiers are involved in the detection task. In the worst case, a deflected face requires three detection passes; hence, if the face classifiers constantly deal with this kind of situation, the speed of face detection is obviously reduced.

To deal with this problem, our system optimizes the scheduling of the face classifiers. Specifically, when the system misses the target using the front face classifier, the right deflected-face classifier is called first to re-detect. If it succeeds, the right deflected-face classifier is used by default in the next frame; otherwise, the left deflected-face classifier is called. If that detection succeeds, the left deflected-face classifier is likewise used by default in the next frame. The front face classifier is called again only when both the left and right deflected-face classifiers lose the target. This strategy consumes a little more time on frames where the target is lost but improves the real-time performance of the whole system.

In the Adaboost algorithm, the face detection region is obtained by searching the whole picture; the larger the picture, the longer the search takes. In order to improve detection speed, each frame is downsampled to a quarter of the original picture. In practical applications, reasonable scaling of the pictures has little effect on detection accuracy or localization precision, and compared with the traditional approach the speed can be increased by half or more, meeting the real-time requirement.
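The scheduling of the three face classifiers and the downsampling step can be sketched as follows. The cascade file names are placeholders for the self-trained classifiers, the exact fallback order when a deflected classifier is the current default is an approximation of the description above, and "a quarter of the original picture" is interpreted here as half the width and half the height.

```python
import cv2

# Hypothetical cascade files; the paper trains its own front/left/right classifiers.
CASCADES = {
    "front": cv2.CascadeClassifier("front_face.xml"),
    "right": cv2.CascadeClassifier("right_deflected_face.xml"),
    "left":  cv2.CascadeClassifier("left_deflected_face.xml"),
}

class FaceScheduler:
    """Keep using the classifier that succeeded on the previous frame and
    fall back to the others only when the current one misses the target."""

    def __init__(self):
        self.current = "front"

    def detect(self, gray):
        # Downsample (half width, half height) to speed up the whole-image search.
        small = cv2.resize(gray, None, fx=0.5, fy=0.5)
        order = [self.current] + [k for k in ("front", "right", "left")
                                  if k != self.current]
        for name in order:
            faces = CASCADES[name].detectMultiScale(small, 1.1, 3)
            if len(faces) > 0:
                self.current = name              # stay with this classifier next frame
                x, y, w, h = faces[0]
                return name, (2 * x, 2 * y, 2 * w, 2 * h)  # map back to full size
        return None, None                        # face detection exception for this frame
```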

2.4. Eye Localization and State Recognition. Common methods of eye localization include integral projection, the Hough transform, template matching, and principal component analysis. In this paper, the Adaboost algorithm is exploited to train open-eye and closed-eye classifiers for locating the eyes.

Eye localization is carried out within the detected face region. The face has a regular geometry in which each organ follows a regular distribution; to improve the precision and efficiency of detection, the face region can therefore be segmented before locating the eyes.

Figure 3(a) shows face regions detected by the face classifiers. The face can be divided according to the following analysis.


Figure 3: Estimation of the eye candidate region.

(1) The height of the eye region can be constrained to less than 1/3 and more than 1/10 of the face height.

(2) The width of a single eye region is less than 1/2 and more than 1/4 of the face width.

To guarantee the eye classifiers' performance without being affected by the face segmentation, we impose the following constraints:

(1) y = 5H/16,
(2) h = 5H/24,

where y and h represent the top pixel coordinate and the height of the estimated eye region, respectively, and H represents the height of the face region.

Figure 3(b) shows the result after partition; it illustrates that the facial region below the eyebrows and above the nostrils can be used as the eye candidate region. In this region, the eyes are detected by the open-eye and closed-eye classifiers. This method avoids the nostril, mouth, and forehead regions that could disturb eye detection, and in practical tests it obviously reduces the probability of detection errors.

The open-eye classifier is used first to detect the target area; the result is marked with a red rectangle if the open-eye state is detected. If it loses the target, the closed-eye classifier is switched in, and its detection result is marked with a yellow rectangle for the closed-eye state. If the closed-eye classifier also loses the target, the frame is regarded as an eye-detection exception.
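Putting the geometric constraints and the open-eye/closed-eye classifier order together, a minimal sketch might look like the following; the cascade file names are placeholders for the self-trained eye classifiers.

```python
import cv2

# Hypothetical cascade files for the self-trained open-eye / closed-eye classifiers.
OPEN_EYE = cv2.CascadeClassifier("open_eye.xml")
CLOSED_EYE = cv2.CascadeClassifier("closed_eye.xml")

def eye_candidate_region(face_gray):
    """Constraints above: the candidate strip starts at y = 5H/16 and is
    h = 5H/24 high, where H is the height of the detected face region."""
    H = face_gray.shape[0]
    y = 5 * H // 16
    h = 5 * H // 24
    return face_gray[y:y + h, :]

def eye_state(face_gray):
    """Open-eye classifier first; fall back to the closed-eye classifier;
    otherwise report an eye-detection exception for this frame."""
    roi = eye_candidate_region(face_gray)
    if len(OPEN_EYE.detectMultiScale(roi, 1.1, 3)) > 0:
        return "open"
    if len(CLOSED_EYE.detectMultiScale(roi, 1.1, 3)) > 0:
        return "closed"
    return "exception"
```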

2.5. Fatigue Judgment with Multiple Indexes. PERCLOS is the percentage of time the eyes are in the closed state within a specific time interval (1 min or 30 s) [22]. PERCLOS is a well-recognized and effective measure of neurophysiological fatigue level. For each frame, a result is output at the end of detection; the result can be the open-eye state, the closed-eye state, a face exception, or an eye exception. The PERCLOS curve is drawn in real time by counting the detection results over a fixed period of frames, as shown in Figure 4. Fatigue can then be detected by analyzing PERCLOS and the duration of the closed-eye state.

Figure 4: PERCLOS of human eye states (square-wave plot of the per-frame results—open-eye detection, closed-eye detection, face detection exception, and eye detection exception—over time).

Only two eye states (open and closed) can be detected in this system; the degree of eye openness is not analyzed. PERCLOS is therefore simplified to the percentage of time the eyes are completely closed within 30 seconds. The formula of the simplified PERCLOS value is as follows:

$$\text{PERCLOS} = \frac{t}{30} \times 100\%, \quad (6)$$

where t is the duration (in seconds) for which the eyes are completely closed.

The duration of the closed-eye state during the driver's blinks increases when fatigue occurs. In this paper, a duration of 1.2 seconds is used as the threshold for judging the driver's fatigue state. Hence, the driver's fatigue can be detected with two indexes: the PERCLOS index and the 1.2 s duration threshold.

A driving fatigue detection system is developed using PERCLOS and eye states, as shown in Figure 5. The system detects fatigue and rings an alarm when either of the two indexes satisfies its criterion.
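A sketch of the per-frame fatigue judgment follows. The 30 s window, the 25 fps frame rate, and the 1.2 s closure threshold come from the text, while the fixed PERCLOS alarm level is purely illustrative and is not given in the paper (Section 3.3 later replaces it with an escalating-rate criterion).

```python
from collections import deque

FPS = 25                       # frame rate of the recorded videos
WINDOW = 30 * FPS              # 30 s sliding window used by the simplified PERCLOS
CLOSED_DURATION_THR = 1.2      # seconds of continuous closure (threshold from the text)
PERCLOS_THR = 15.0             # illustrative alarm level in percent, not from the paper

class FatigueJudge:
    def __init__(self):
        self.states = deque(maxlen=WINDOW)   # per-frame results: open / closed / exception
        self.closed_run = 0                  # length of the current run of closed frames

    def update(self, state):
        """Feed the eye state of one frame; return (PERCLOS, fatigue_flag)."""
        self.states.append(state)
        self.closed_run = self.closed_run + 1 if state == "closed" else 0

        # Equation (6): percentage of the 30 s window spent with eyes fully closed.
        perclos = 100.0 * self.states.count("closed") / max(len(self.states), 1)
        long_closure = self.closed_run / FPS >= CLOSED_DURATION_THR
        return perclos, (perclos >= PERCLOS_THR) or long_closure
```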

3. Experiment and Data Analysis

Our system runs on a PC with an Intel Core(TM)2 Duo 2.10 GHz CPU and 2 GB RAM, using the third-party library OpenCV, and system tests were performed in Visual Studio 2008. A racing game was used to provide the driving conditions. Nine subjects took part in the experiment in this simulated driving environment. The resolution of the recorded videos is 352 × 288 pixels, the frame rate is 25 fps, and each frame contains a human face.

We recorded ten groups of video streams in this experiment, divided into two categories. The first category (i.e., the 1st–4th groups) shows various facial expressions that occurred in three subjects' simulated driving experiments. There are 1630 frames in each group of videos. These videos are used to test the performance of face detection and eye detection.

The second category consists of the remaining six groups (i.e., the 5th–10th). Six subjects were asked to carry out driving tasks long enough to eventually become fatigued; these videos are used to extract the fatigue index. In order to collect valid videos for the assessment of driver fatigue, the subjects took part in training and experimental sessions, and the whole experiment for one subject may last two or more days depending on his training performance. Before the experiment, subjects were asked not to eat chocolate or drink coffee or alcohol. The length of each video is about 70 minutes.


Figure 5: System flow chart (frame input, preprocessing, first and second face detection with the default face classifier set accordingly, eye detection, fatigue judgment, and result output, repeated until the task is over).

Table 1: Experimental results.

Detected target | Target (frames) | Correct detection (frames) | Missed detection (frames) | Detection rate (%)
Human face      | 6520            | 6214                       | 293                        | 95
Open state      | 5392            | 4912                       | 480                        | 91
Close state     | 1128            | 1015                       | 113                        | 90

There are 105,000 frames in each group of videos. In the first 55 minutes of the experiment, subjects were asked to perform intensive simulated driving: the driving conditions contained many curves and steep slopes, and extra tasks of alert and vigilance (TAV) [23] were imposed to ensure that the subjects concentrated highly on driving. In the last 15 minutes, we relieved the subjects' stress by reducing the missions and using a flat road with fewer curves; the monotony of driving induced the drivers to become fatigued. The obvious features of fatigue for subjects in this simulated driving can be summarized as an increased number of blinks, blink frequency, and duration of the closed-eye state.

3.1. Detection Performance of Our Method. Figure 6 shows the results of human face detection. To distinguish the detection results, the output of the front face classifier is marked with a white rectangle and the output of the deflected-face classifier with a green rectangle. The first row shows the results detected by the front face classifier (white rectangles); the face was lost in the 3rd frame because the rightward deflection angle of the face was too large. The second row shows the results detected by the deflected-face classifiers (green rectangles); here the face was lost in the first and fourth frames because the faces were looking straight ahead. The third row shows the results detected by the two classifiers combined; all faces, in both front and deflected conditions, were detected successfully. This method can thus carry out real-time detection with high accuracy.

Figure 7 shows the results of eye localization and state recognition under different expressions. The eyes are marked with rectangles: the closed-eye state is marked with a yellow rectangle and the open-eye state with a red rectangle.

In this section, we also evaluated the general performance of the proposed method in terms of face detection, eye localization, and state recognition. We ran our method on the first category of videos (i.e., the 1st–4th groups), which contains 6520 frames consisting of open-eye and closed-eye states. The correctly detected frames were counted manually. Detailed results are presented in Table 1, from which we can conclude that the proposed system works quite well, with an accuracy of at least 90%.

3.2. Comparison Experiment. Compared with the traditional Adaboost method, which trains eye classifiers to detect the eye state directly, we proposed an improved strategy to detect the eye state. The proposed strategy mainly includes three steps and two categories of classifiers, corresponding to the face and the eyes. Specifically, we first detect the face efficiently with classifiers for both the front face and the deflected face; then the eye candidate region is determined according to the geometric distribution of facial organs; finally, trained classifiers for open and closed eyes are used to detect the eyes in the candidate region quickly and accurately. Table 2 shows the comparison of elapsed time for the two methods on the PC system, including the shortest and longest time for processing a single frame. Although our method needs to detect both the face and the eyes, the average processing speed of the system reaches 20 fps; hence, the system performs well in real time. The longest processing time for a single frame is more than two times the shortest one, because a frame with a deflected face requires at least two detection passes for the face and eye state.


Figure 6: Results of face detection.

Figure 7: Results of eye localization and state recognition.


Table 2: Comparison of time consumption on the PC system (ms). Columns 2–6 give the mean time consumption of the individual detection steps.

Method             | Front face | Deflected face | Open eye | Close eye | Other | Single frame (longest) | Single frame (shortest)
Our method         | 16         | 18             | 5        | 4         | 12    | 73                     | 33
Traditional method | —          | —              | 96       | 97        | —     | 245                    | 93

Figure 8: Comparison of our method with the traditional method. The frames in the top row are detected by the directly trained Adaboost eye classifier, while the frames in the bottom row are detected by our improved method.

Concerning the traditional method, although it detects the eyes directly in a single step, it costs more time, because it must search the whole frame with the eye classifier instead of a small candidate region as in our method.

Figure 8 compares the detection performance of our method and the traditional method. It illustrates that our method detects the eyes precisely in the restricted face region, whereas the traditional method applies the eye classifier to the whole picture and consequently produces many false detections. Based on the comparison experiment, we can conclude that our method outperforms the traditional method in both detection speed and accuracy.

3.3. Determining the Fatigue Index. In order to judge the fatigue state automatically, we must fix a threshold parameter that distinguishes alertness from fatigue. There is an obvious change in the PERCLOS values between the beginning and the end of the driving experiment: the subjects' blink frequency and closed-eye-state duration increase after they become fatigued. Figure 9 shows the PERCLOS values of the six subjects in the periods of 10–17 minutes (the beginning) and 63–70 minutes (the end). From Figure 9 we can conclude that the PERCLOS values rise significantly at the end of the experiment, and the escalating rate of the mean values is more than 200%. Additionally, the six subjects' PERCLOS values show individual differences. The mean values of subjects such as SUB1, SUB2, SUB4, and SUB5 are relatively low (< 0.025) both at the beginning and at the end of the experiment, whereas for subjects such as SUB3 and SUB6 the PERCLOS mean values are relatively higher. For example, the PERCLOS values of SUB3 at the beginning and the end are 0.016 and 0.059, and those of SUB6 are 0.042 and 0.146, respectively.

The variation of the PERCLOS mean values of each subject is presented in Table 3. In this table, the PERCLOS values show individual differences. Although the PERCLOS values of SUB3 and SUB6 are comparatively large, their escalating rates remain as stable as those of the other subjects. From Table 3 we find that the escalating rate of all subjects except SUB1 lies between 200% and 300%. Thus, the escalating rate of the PERCLOS value is a good choice as an index for evaluating driver fatigue. In this paper, we judge the driver as fatigued when his PERCLOS escalating rate is more than 2 (i.e., 200%).
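The escalating-rate criterion reduces to a one-line comparison against the baseline PERCLOS; the worked value below is checked against Table 3 and is only an illustration.

```python
def perclos_escalating_rate(baseline, current):
    """Relative increase of PERCLOS over the baseline period."""
    return (current - baseline) / baseline

# Worked check against Table 3, Sub 3: (0.059 - 0.016) / 0.016 ≈ 2.69 (about 269%),
# which exceeds the 200% fatigue criterion used in this paper.
print(perclos_escalating_rate(0.016, 0.059))
```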

4. Fatigue Detection on Smart Device

Nowadays, smartphones and tablet devices have quite powerful processing capacity, and most of them are also equipped with a high-resolution camera (front or rear). Such smart devices run operating systems such as Android and iOS. The Android operating system in particular is free and open, so it is very convenient for users to do secondary development, and OpenCV is compatible with Android-based devices. Therefore, it is feasible to transplant our machine-vision-based fatigue detection system onto smart devices. Accordingly, we designed a Driving Fatigue Detection Warning System (DFDWS) on an Android smart device using machine vision. The composition of the detection system is shown in Figure 10.

This system is developed based on the Android OS and the OpenCV library for Android, called OpenCV4Android. OpenCV4Android is available as a Java library on the Internet, and it is easy to install and configure the OpenCV4Android SDK on a smart handheld device according to the tutorial. OpenCV4Android provides an interface named CvCameraViewListener to catch camera frames and a class CascadeClassifier to detect objects.


Figure 9: PERCLOS values of the six subjects (Sub 1–Sub 6) at the beginning (10–17 min) and the end (63–70 min) of the driving experiment, together with the corresponding mean values for each period.

Table 3: Mean values of PERCLOS at different stages for the 6 subjects.

PERCLOS                        | Sub 1   | Sub 2  | Sub 3  | Sub 4  | Sub 5  | Sub 6  | Mean
10–17 min (at the beginning)   | 0.001   | 0.007  | 0.016  | 0.001  | 0.002  | 0.042  | 0.012
63–70 min (at the end)         | 0.011   | 0.025  | 0.059  | 0.003  | 0.006  | 0.146  | 0.042
Difference                     | 0.010   | 0.018  | 0.043  | 0.002  | 0.004  | 0.104  | 0.030
Escalating rate                | 1000.0% | 257.1% | 268.8% | 200.0% | 200.0% | 247.6% | 362.3%


Figure 10: Detection system composition. On the smartphone or tablet, the Driver Fatigue Detection Warning System invokes the camera to collect facial images, runs image processing and fatigue detection with the OpenCV library to obtain the fatigue character (PERCLOS index and closed-eye duration status), and then invokes the speaker to give a warning or the dialer to call the emergency center.

Figure 11: Detection results on the tablet.

In our application, we create several CascadeClassifier instances, implemented as different classifiers corresponding to the face, the open eye, and the closed eye. When a frame is captured, the method onCameraFrame is invoked automatically. In this method, the frame is converted into the class Mat, which represents images in OpenCV, and then the method detectMultiScale is invoked on the CascadeClassifier to detect the face features.
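The Android classes named above follow the same cascade-detection pattern as OpenCV's desktop bindings. The desktop sketch below mirrors the per-frame flow (camera frame → Mat-like image → detectMultiScale); cv2.VideoCapture plays the role of CvCameraViewListener here, and the cascade file name is a placeholder for the self-trained face classifier rather than part of the paper.

```python
import cv2

face_cascade = cv2.CascadeClassifier("front_face.xml")  # placeholder cascade file

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()               # one camera frame (the "Mat" in OpenCV4Android)
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, 1.1, 3)
    for (x, y, w, h) in faces:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (255, 255, 255), 2)
    cv2.imshow("DFDWS (desktop sketch)", frame)
    if cv2.waitKey(1) == 27:             # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```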

Thus, any Android phone or tablet with a camera, dialer, and speaker can host this system. DFDWS uses the OpenCV library to call the camera to collect facial images; the image frames in memory constitute an image stream, which is then processed by the proposed algorithm. After that, the system judges the driver's fatigue state according to PERCLOS or the duration of the closed-eye state. If the driver falls into a fatigue state, the system turns on the speaker to remind him/her or dials the emergency center.

The detection result is also shown on the screen in real time. The interface of the transplanted system and its detection results are shown in Figure 11.

Our system runs well on a Nexus 7 tablet with a 1 GHz CPU and 1 GB RAM. At the beginning of the experiment, the system collects data for a period of 10 min to calculate the PERCLOS index as a baseline for judging the fatigue state. During this period, drivers should adjust the device and their body posture to make sure that the system can accurately identify faces and eyes. The PERCLOS escalating rate with respect to this baseline is used to detect fatigue on the smart handheld system: when the PERCLOS escalating rate increases by more than 200%, the system on the Nexus 7 tablet judges the driver as fatigued. Table 4 shows the detection performance on the Nexus 7. It lists the average time consumption of each kind of classifier and also presents the shortest and longest processing time for a single frame. Compared with the detection performance on the PC system (see Table 2), the consumption time for each step is longer on the Nexus 7 tablet, because the clock frequency of the tablet is much lower than that of the PC (2.1 GHz); accordingly, the longest and shortest processing times for a single frame are also longer. For the normal situation, that is, the driver facing almost straight ahead, the performance of DFDWS on the Nexus 7 tablet reaches a speed of 14 frames/sec, which means the mobile device can smoothly run this application.


Table 4: Average time consumption of each step on the tablet system (ms).

Running item                                | Time consumed
Front face detection                        | 39
Left and right deflected face detection     | 49
Open state detection                        | 8
Close state detection                       | 4
Other time consumption                      | 36
Longest processing time for a single frame  | 98
Shortest processing time for a single frame | 70

For the worst case, such as a side face, the system performance decreases to 10 frames/sec.

5. Conclusions

This paper presents a practical driving fatigue detection system based on the Adaboost algorithm. We proposed a new strategy that detects the eye state instead of detecting the eyes directly. In our detection strategy, we first detect the face efficiently using classifiers for both the front face and the deflected face; then the eye candidate region is determined according to the geometric distribution of facial organs; finally, trained classifiers for open and closed eyes are used to detect the eyes in the candidate region quickly and accurately. As a result, the PERCLOS escalating rate can be calculated and used as the fatigue index: when the PERCLOS escalating rate increases by more than 200%, the driver is considered to be in a fatigue state. Moreover, this paper implemented a Driving Fatigue Detection Warning System and transplanted it onto a Nexus 7 tablet. The system decides whether the driver is fatigued according to PERCLOS and the duration of the closed-eye state. Experiments demonstrated that the proposed system has high accuracy; meanwhile, the processing speed can reach 30 fps on the PC and 14 fps on the tablet, which meets the real-time requirement.

Of course, this system could be further improved in detection accuracy and speed by using discrete cosine coefficients [24] and covariance features [25], respectively. In addition, this paper has not addressed conditions of poor illumination, which should be handled in future research.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Authors' Contribution

Wanzeng Kong and Lingxiao Zhou contributed equally to this work.

Acknowledgment

This work was supported by the Major International Cooperation Project of Zhejiang Province (Grant no. 2011C14017), the National Natural Science Foundation of China (Grant no. 61102028), the International Science & Technology Cooperation Program of China (Grant no. 2014DFG12570), and the Zhejiang Provincial Natural Science Foundation of China (Grant no. LY13F020033). The authors also thank all subjects who were involved in the driving experiment.

References

[1] S. R. Paul, NHTSA's Drowsy Driver Technology Program, 2005.
[2] L. Wang, X. Wu, and M. Yu, "Review of driver fatigue/drowsiness detection methods," Journal of Biomedical Engineering, vol. 24, no. 1, pp. 245–248, 2007.
[3] G. Borghini, L. Astolfi, G. Vecchiato, D. Mattia, and F. Babiloni, "Measuring neurophysiological signals in aircraft pilots and car drivers for the assessment of mental workload, fatigue and drowsiness," Neuroscience and Biobehavioral Reviews, vol. 44, pp. 58–75, 2014.
[4] T. Myllyla, V. Korhonen, E. Vihriala et al., "Human heart pulse wave responses measured simultaneously at several sensor placements by two MR-compatible fibre optic methods," Journal of Sensors, vol. 2012, Article ID 769613, 8 pages, 2012.
[5] M. Simon, E. A. Schmidt, W. E. Kincses et al., "EEG alpha spindle measures as indicators of driver fatigue under real traffic conditions," Clinical Neurophysiology, vol. 122, no. 6, pp. 1168–1178, 2011.
[6] S. K. L. Lal and A. Craig, "Reproducibility of the spectral components of the electroencephalogram during driver fatigue," International Journal of Psychophysiology, vol. 55, no. 2, pp. 137–143, 2005.
[7] B. T. Jap, S. Lal, P. Fischer, and E. Bekiaris, "Using EEG spectral components to assess algorithms for detecting fatigue," Expert Systems with Applications, vol. 36, no. 2, pp. 2352–2359, 2009.
[8] J. Krajewski, D. Sommer, U. Trutschel et al., "Steering wheel behavior based estimation of fatigue," in Proceedings of the 5th International Driving Symposium on Human Factors in Driver Assessment, Training and Vehicle Design, pp. 118–124, 2009.
[9] Y. Wu, J. Li, D. Yu, and H. Cheng, "Research on quantitative method about driver reliability," Journal of Software, vol. 6, no. 6, pp. 1110–1116, 2011.
[10] A. Eskandarian and A. Mortazavi, "Evaluation of a smart algorithm for commercial vehicle driver drowsiness detection," in Proceedings of the IEEE Intelligent Vehicles Symposium, pp. 553–559, June 2007.
[11] C. X. Huang, W. C. Zhang, C. G. Huang, and Y. J. Zhong, "Identification of driver state based on ARX in the automobile simulator," Technology & Economy in Areas of Communications, vol. 10, no. 2, pp. 60–63, 2008.
[12] Z. Zhang and J.-S. Zhang, "Driver fatigue detection based intelligent vehicle control," in Proceedings of the 18th International Conference on Pattern Recognition (ICPR '06), vol. 2, pp. 1262–1265, IEEE, Hong Kong, August 2006.
[13] Q. Wang, J. Yang, M. Ren, and Y. Zheng, "Driver fatigue detection: a survey," in Proceedings of the 6th World Congress on Intelligent Control and Automation (WCICA '06), vol. 2, pp. 8587–8591, Dalian, China, 2006.
[14] Z. Zhang and J. Zhang, "A new real-time eye tracking for driver fatigue detection," in Proceedings of the IEEE 6th International Conference on ITS Telecommunications, pp. 8–11, June 2006.
[15] R. Parsai and P. Bajaj, "Intelligent monitoring system for driver's alertness (a vision based approach)," in Knowledge-Based Intelligent Information and Engineering Systems, vol. 4692 of Lecture Notes in Computer Science, pp. 471–477, Springer, Berlin, Germany, 2007.
[16] M. Eriksson and N. P. Papanikolopoulos, "Eye-tracking for detection of driver fatigue," in Proceedings of the IEEE Conference on Intelligent Transportation Systems (ITSC '97), pp. 314–319, 1997.
[17] S. M. Shi, L. S. Jin, R. B. Wang, and B. L. Tong, "Driver mouth monitoring method based on machine vision," Journal of JiLin University, vol. 34, no. 2, pp. 232–236, 2004.
[18] W. Wang, J. Xi, and H. Chen, "Modeling and recognizing driver behavior based on driving data: a survey," Mathematical Problems in Engineering, vol. 2014, Article ID 245641, 20 pages, 2014.
[19] Q. Tan, M. Yang, T. Luo et al., "A novel interdigital capacitor pressure sensor based on LTCC technology," Journal of Sensors, vol. 2014, Article ID 431503, 6 pages, 2014.
[20] Y. Freund and R. E. Schapire, "A decision-theoretic generalization of on-line learning and an application to boosting," in Computational Learning Theory, vol. 904 of Lecture Notes in Computer Science, pp. 23–37, Springer, Berlin, Germany, 1995.
[21] P. Viola and M. J. Jones, "Robust real-time object detection," International Journal of Computer Vision, vol. 57, no. 2, pp. 137–154, 2004.
[22] D. F. Dinges and R. Grace, PERCLOS: A Valid Psychophysiological Measure of Alertness as Assessed by Psychomotor Vigilance, Federal Highway Administration, Office of Motor Carriers, 1998.
[23] W. Kong, Z. Zhou, L. Zhou, Y. Dai, G. Borghini, and F. Babiloni, "Estimation for driver fatigue with phase locking value," International Journal of Bioelectromagnetism, vol. 14, no. 3, pp. 115–120, 2012.
[24] J. P. Dukuzumuremyi, B. Zou, C. P. Mukamakuza, D. Hanyurwimfura, and E. Masabo, "Discrete cosine coefficients as images features for fire detection based on computer vision," Journal of Computers, vol. 9, no. 2, pp. 295–300, 2014.
[25] R. Li and C. F. Li, "Adaboost face detection based on improved covariance feature," Journal of Computers, vol. 9, no. 5, pp. 1077–1082, 2014.


Page 2: Research Article A System of Driving Fatigue Detection ...Research Article A System of Driving Fatigue Detection Based on Machine Vision and Its Application on Smart Device WanzengKong,

2 Journal of Sensors

obviously so the detection of mouth state by a camera is alsoan effective method for detecting driver fatigue Shi et al [17]used BP neural network and computer vision to detectmouthstates to estimate driverrsquos mental state

However all above methods must use some special andextra equipment For example methods based on driversrsquooperation behavior [18] require pressure sensor [19] andangular transducer to detect driverrsquos behavior on steeringwheel Methods based on both vehicle state and driversrsquophysiological reaction require cameras to record inside andoutside conditions of the car In particular the methodsbased on physiological signal require more expensive andcomprehensive EEG acquisition equipment such as EEG capelectrodes and signal amplifier Moreover all these methodsusually need an extra computer or embedded computingboard for signal processing and making decisions

Nowadays tablet and smartphone are so popular thatalmost every driver holds one And most of them areequipped with excellent camera and powerful comput-ingprocessing unit The main contribution of this paper isthat we proposed an improved strategy and practical systemto detect driving fatigue using smart device Firstly a practicalmachine vision system based on improved Adaboost algo-rithm is developed to detect driving fatigue by checking theeye state in real time Then this system is easily transplantedinto a smart device such as a tablet to perform the taskof fatigue detection Practical tests demonstrate that it isconvenient and of low cost to detect driving fatigue by smartdevice but it is as effective as other methods

The rest of this paper is organized as follows Section 2provides a detailed description of system including theAdaboost algorithm face detection eye localization staterecognition and also the whole detection strategy Exper-iment results and data analysis are presented in Section 3The system transplantation is given in Section 4 Finally theconclusion is presented in Section 5

2 Adaboost Algorithm and Improved System

The system is composed of four parts that is image pre-processing face detection eye state recognition and fatigueevaluation In this system images are obtained by externalcamera with high resolution which is placed on front leftof drivers The first step is to denoise the images Thenafter detecting the face region eyes location and states canbe obtained easily and quickly based on the detected faceregion Finally driversrsquo fatigue can be detected by analyzingthe square wave diagram of eye states in real time

21 Image Preprocessing Naturally video images are alwayscontaminated by noise in different degree which roots inseveral factors such as driving condition underexposureoverexposure or nonlinearity of devices Original images areusually converted to gray scale images to be easily used inclassifiers Images are easilymalfunctioned including lackingcontrast and blurring Thus histogram equalization is thenext important step The goal of histogram equalization isto highlight the features by enhancing contrast of gray scale

images and reducing interference caused by the asymmetricillumination

22 Adaboost Algorithm and Classifier Training Adaboostalgorithm is a kind of boosting algorithmproposed by Freundand Schapire [20] which selects some weak classifiers andintegrates them into strong classifiers automatically

Viola and Jones [21] proposed an Adaboost algorithmbased on Haar characteristics and trained frontal face classi-fier with this algorithmThe algorithm is with high detectionaccuracy and faster than almost all the other real-time algo-rithms The excellent performance of frontal face classifiercontributes a lot to the Adaboost algorithm

Steps of Adaboost Algorithm Select training samples as fol-lows (119909

1 1199101) (1199092 1199102) (119909

119899 119910119899) and 119909

119894is the 119894th training

sample while 119910119894isin 0 1 represents positive or negative

samples respectively The positive samples could be used totrain classifier to identify similar characteristics the negativesamples used to eliminate differences characteristics Thenumber of positive samples is 119904 and the number of negativesamples is119898 119899 = 119898 + 119904

(1) Initialize the weight of each sample 1199081119894

= 1119899 119894 =

1 2 119899(2) Repeat the following four steps for 119905 = 1 2 119879 (119879

is the optimal number of weak classifiers)

(a) Normalize the weight into a probability distri-bution

119902119905119894=

119908119905119894

sum119899

119895=1119908119905119894

(1)

(b) For each feature 119891 train a weak classifier ℎ119891

and compute weighted error rates 120576119891of weak

classifiers corresponding to all features

120576119891= sum119902

119905119894

1003816100381610038161003816ℎ119905 (119909119894) minus 1199101198941003816100381610038161003816 (2)

(c) Choose the best weak classifier ℎ119905(119909) from step

(b) which has the minimum error rate 120576119905

(d) According to the best weak classifier adjustweight

119908119905+1119894

= 1199081199051198941205731minus119890119894

119905 (3)

119890119894= 0 represents correct classification 119890

119894= 1

represents misclassification

(3) Form the strong classifier eventually

ℎ (119909) =

1

119879

sum119905=1

120572119905ℎ119905(119909) ge

1

2

119879

sum119905=1

120572119905

0 others

where 120572119905= log

1 minus 120576119905

120576119905

(4)

Journal of Sensors 3

Figure 1 Haar characteristics

In this paper Haar features are used to train Adaboostclassifier Extraction of Haar characteristics and calculationof features are two important aspects of Adaboost algo-rithm Figure 1 shows five simplest kinds of gray rectanglefeatures There are 160000 rectangle features in a size of24 times 24 pixels which shows its complexity and diversityThe specific computing method is to subtract the numberof pixels in black rectangles from that in white rectanglesViola and Jones [21] illustrated that the speed of training anddetection will be greatly enhanced by applying integral imageto feature computation In Violarsquos method rectangle featurescan be obtained by computing several points of integralimage Integral image is defined as follows for a point (119909 119910)in an image its integral image is 119894119894(119909 119910)

119894119894 (119909 119910) = sum

1199091015840le1199091199101015840le119910

119894 (1199091015840 1199101015840) (5)

where 119894(1199091015840 1199101015840) is the gray value of the point (119909 119910)A large number of collections of target and nontarget

images are used to build positive and negative samples librarywhen training a specific target classifier

In this paper we use face images from MIT Yale andORL face databases to train the face classifiers In total 3000samples of front faces and 1500 samples of deflected faces areused and normalized into a resolution of 20 times 20

For eye detection this system also trained the eyes-open and eyes-closed classifiers with Adaboost algorithmHowever there is no public human eyes library availablefor both positive and negative samples of eyes Hence weestablished an eye library by ourselves In Figure 2 a largenumber of single eyes can be obtained through processedimages in face library They are classified into an opened-eyes library and a closed-eyes library Eventually the numbersof open-eyes and closed-eyes samples are 1190 and 700respectively with normalized resolution of 15 times 7 pixels

After establishing the eye library Adaboost algorithmconsumes most of its time on training the target classifierswhich shows a good precision and real-time performance indetection

Open-eye library

Closed-eye library

Figure 2 Face library and constructed eyes library

23 Improved Detection Strategy In this paper Adaboostalgorithm proposed by Viola is employed to train faceclassifiers and detect face However traditional methods areonly focusing on front face samples If the deflected angle ofdriverrsquos face is small the algorithm can be well applied to facedetection in this condition But classifier would be failed tocapture the target when driverrsquos face deflected in a large angle

In view of the above shortcomings we improved the facedetection stratagem based on Violarsquos method Left and rightdeflected face classifiers can be obtained by training left andright deflected face samples If the front face detection isfailed the other two deflected-face classifiers will be involvedin the detection task In the worst case there will be 3 timesof detections for deflected faces Hence if face classifiers areconstantly dealing with such kind of situation the speed offace detection will be obviously reduced

To deal with this problem our system optimizes schedul-ing algorithm of face classifiers Specifically when the systemmisses the target using the front face classifier right deflectedface classifier is called firstly to re-detect If it succeeds theright deflected-face classifier will be used in next frame bydefault Otherwise the left deflected face classifier is calledto work If detection succeeds similarly left deflected-faceclassifier will be used default in next frame The front faceclassifier is called to retest when both left and right faceclassifiers lose the target This strategy consumes a littlemore time on the detection of frames which lose targets butimproves the real-time performance of whole system

In Adaboost algorithm face detection region is obtainedby searching the whole picture The larger the picture thelonger the time of searching which will be consumed Inorder to improve the detection speed each frame of pictureis downsampled into a quarter of the original picture Inpractical application reasonable scaling of pictures has littleeffect on accuracy of detection or precision of localizationCompared to the traditional style the speed can be increasedby half or more to meet the requirement of real time

24 Eye Localization and State Recognition The commonmethods of eyes localization include integral projectionHough transition template matching and principal compo-nent analysis In this paper Adaboost algorithm is exploitedto train open-eyes and closed-eyes classifiers for locating eyes

Eye localization is carried out in the detected face regionFace is a regular geometry where each organ presents regulardistribution In order to improve precision and efficiency ofdetection face region can be segmented before locating eyes

Figure 3(a) shows face regions detected by face classifiersFace can be divided by the following analysis

4 Journal of Sensors

(a)

(b)

Figure 3 Estimation of eyes candidate region

(1) The height of eyes region can be constrained to lessthan 13 height of the face and more than 110 heightof the face

(2) The width of single eye region is less than 12 width ofthe face and more than 14 width of the face

To guarantee the eyes classifiersrsquo performance withoutthe influence of face segmentation we make the followingconstraints

(1) 119910 = 511986716(2) ℎ = 511986724

where 119910 and ℎ represent the highest pixel and the height ofestimated eyes region respectively and 119867 represents heightof face region

Figure 3(b) shows the result after partition and it illus-trates that the region in face which is below eyebrows andabove nostril can be used as eyes candidate region Inthis region eyes are detected by open-eyes and closed-eyesclassifiers This method avoids regions of nostril mouth andabove forehead that could affect human eyes detection Inpractical test it obviously reduces the error probability ofdetection

The open-eyes classifier is used first to detect the target area. The result is marked with a red rectangle if the open-eyes state is detected. If it loses the target, the closed-eyes classifier is switched in, and its detection result is marked with a yellow rectangle for the closed-eyes state. If the closed-eyes classifier also loses the target, the frame is regarded as an eye-detection exception.
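
The per-frame eye-state decision can then be written as a short cascade over the two eye classifiers. This Python/OpenCV sketch assumes the classifiers are OpenCV cascade objects; it is an illustration, not the authors' implementation.

import cv2

def classify_eye_state(candidate_bgr, open_cascade, closed_cascade):
    # Returns "open", "closed", or "eye_exception" for one frame.
    gray = cv2.cvtColor(candidate_bgr, cv2.COLOR_BGR2GRAY)
    if len(open_cascade.detectMultiScale(gray, 1.1, 3)) > 0:
        return "open"           # marked with a red rectangle in the paper
    if len(closed_cascade.detectMultiScale(gray, 1.1, 3)) > 0:
        return "closed"         # marked with a yellow rectangle in the paper
    return "eye_exception"      # neither classifier found a target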

2.5. Fatigue Judgment with Multiple Indexes. PERCLOS is the percentage of time spent in the closed-eye state within a specific time interval (1 min or 30 s) [22]. PERCLOS is a well-recognized and effective measure of neurophysiological fatigue level. For each frame, a result is output at the end of detection; the result can be the open-eyes state, the closed-eyes state, a face exception, or an eye exception. The PERCLOS curve is drawn in real time by counting the detection results over a fixed number of frames in a fixed period, as shown in Figure 4. Then fatigue can be detected by analyzing PERCLOS and the duration of the closed-eyes state.

Figure 4: PERCLOS of human eye states (the per-frame state, namely open-eye detection, closed-eye detection, face detection exception, or eye detection exception, plotted against time).

Only two eye states (open and closed) can be detected in this system; the degree of eye opening is not analyzed. PERCLOS is therefore simplified to the percentage of time during which the eyes are completely closed within a 30-second window. The formula of the simplified PERCLOS value is as follows:

PERCLOS = t / 30 × 100%,  (6)

where t is the duration (in seconds) for which the eyes are completely closed. The duration of the closed-eye state during the driver's blinks will increase when fatigue occurs. In this paper, a closed-eye duration of 1.2 seconds is used as the threshold for judging the driver's fatigue state. Hence, driver fatigue can be detected with two indexes: the PERCLOS index and the 1.2 s duration threshold.

A driving fatigue detection system was developed using PERCLOS and the eye states, as shown in Figure 5. The system detects fatigue and rings an alarm when either of the two indexes satisfies its criterion.
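
As an illustration of this per-frame bookkeeping, the following Python sketch maintains a 30 s window of eye states and evaluates both indexes. The frame rate and the PERCLOS alarm threshold value are placeholders chosen for the example (the paper's rule based on the PERCLOS escalating rate is given in Section 3.3), and the 1.2 s closure threshold follows the reading of the text above.

from collections import deque

class FatigueJudge:
    # Sketch of the two-index fatigue judgment: simplified PERCLOS over a
    # 30 s window (equation (6)) and a continuous closed-eye duration threshold.
    def __init__(self, fps=25, perclos_limit=40.0, closed_limit_s=1.2):
        self.fps = fps
        self.window = deque(maxlen=fps * 30)   # eye states of the last 30 s
        self.perclos_limit = perclos_limit     # percent; placeholder value
        self.closed_limit_s = closed_limit_s   # seconds of continuous closure
        self.closed_run = 0                    # consecutive closed frames

    def update(self, state):
        # state is "open", "closed", or an exception label for this frame.
        self.window.append(state)
        self.closed_run = self.closed_run + 1 if state == "closed" else 0

        closed_seconds = sum(s == "closed" for s in self.window) / self.fps
        perclos = closed_seconds / 30.0 * 100.0        # equation (6)
        closed_duration = self.closed_run / self.fps

        # Fatigue is reported when either index exceeds its threshold.
        return perclos > self.perclos_limit or closed_duration > self.closed_limit_s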

3. Experiment and Data Analysis

Our system runs on a PC with an Intel Core(TM)2 Duo 2.10 GHz CPU and 2 GB RAM, using the third-party library OpenCV, and system tests were performed in Visual Studio 2008. A racing game was used to provide the driving conditions. Nine subjects took part in the experiment in the simulated driving environment. The resolution of the recorded videos is 352 × 288 pixels and the frame rate is 25 fps. Each frame contains a human face.

We recorded ten groups of video streams in this experiment. The videos are divided into two categories. The first category (i.e., the 1st–4th groups) shows the various facial expressions that occurred in three subjects' simulated driving experiments. There are 1630 frames in each group of videos. These videos are used to test the performance of face detection and eye detection.

The second category consists of the remaining 6 groups (i.e., the 5th–10th). Six subjects were asked to carry out driving tasks long enough to finally become fatigued. These videos are used to extract the fatigue index. In order to collect valid videos for the assessment of driver fatigue, subjects took part in training and experimental sessions. The whole experiment for one subject may last two or more days, depending on the subject's training performance. Before the experiment, subjects were asked not to eat chocolate or drink coffee or alcohol. The length of each video is about 70 minutes.


Figure 5: System flow chart (frame input, preprocessing, first and second face detection, setting the default face classifier, eye detection, fatigue judgment, and result output, looping until the task is over).

Table 1: Experimental results.

Detected target | Target (frames) | Correct detection (frames) | Missed detection (frames) | Detection rate (%)
Human face | 6520 | 6214 | 293 | 95
Open state | 5392 | 4912 | 480 | 91
Close state | 1128 | 1015 | 113 | 90

There are 105,000 frames in each group of videos. In the first 55 minutes of the experiment, subjects were asked to perform intensive simulated driving: many curves and steep slopes were presented in the driving conditions, and extra tasks of alert and vigilance (TAV) [23] were imposed to ensure that the subjects concentrated highly on driving during the experiment. In the last 15 minutes we relieved the subjects' stress by reducing the tasks and using a flat road with fewer curves. The monotony of driving induced the drivers to become fatigued. The obvious features of fatigue for subjects in this simulated driving can be summarized as increased blink count, blink frequency, and duration of the closed-eyes state.
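
This frame count is consistent with the stated video length and frame rate:

70 min × 60 s/min × 25 frames/s = 105,000 frames per video.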

3.1. Detection Performance of Our Method. Figure 6 shows the results of human face detection. To distinguish the detection results, the output of the front-face classifier is marked with a white rectangle and the output of the deflected-face classifiers with a green rectangle. The first row shows the results detected by the front-face classifier, marked with white rectangles; the face was lost in the 3rd frame because the rightward deflection angle of the face was too large. The second row shows the results detected by the deflected-face classifiers, marked with green rectangles; here the face was lost by the deflected classifiers in the first and fourth frames because the faces were looking straight ahead. The third row shows the results detected by the two classifiers together: all faces, in both front and deflected conditions, were detected successfully. This method can carry out real-time detection with high accuracy.

Figure 7 shows the results of eye localization and state recognition under different expressions. Eyes are marked with rectangles; specifically, the closed-eye state is marked with a yellow rectangle and the open-eye state with a red rectangle.

In this section we also report the general performance of the proposed method in terms of face detection, eye localization, and state recognition. We ran our method on the first category of videos (i.e., the 1st–4th groups), which contains 6520 frames covering both open-eye and closed-eye states. The correctly detected frames were counted manually. Detailed results are presented in Table 1, from which we can conclude that the proposed system works quite well, with an accuracy of at least 90%.
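
As a quick check, the detection rates in Table 1 follow directly from the counts of correctly detected frames:

6214 / 6520 ≈ 95%, 4912 / 5392 ≈ 91%, 1015 / 1128 ≈ 90%.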

3.2. Comparison Experiment. Compared with the traditional Adaboost method, which trains eye classifiers to detect the eye state directly, we proposed an improved strategy for detecting the eye state. The proposed strategy includes 3 steps and 2 categories of classifiers, corresponding to face and eye, respectively. Specifically, we first detect the face efficiently with classifiers for both the front face and the deflected face. Then the eye candidate region is determined according to the geometric distribution of the facial organs. Finally, the trained open-eyes and closed-eyes classifiers are used to detect the eyes in the candidate region quickly and accurately. Table 2 shows the comparison of elapsed time for the 2 methods on the PC system; the shortest and longest times for processing a single frame are also presented. Concerning our method, although it needs to detect both face and eye, the average processing speed of the system is up to 20 fps, so the system has good real-time performance. The longest processing time for a single frame is more than 2 times the shortest one, because at least 2 detection passes are needed for the face and eye state when the face is deflected.


Figure 6: Results of face detection.

Figure 7: Results of eye localization and state recognition.


Table 2: Comparison of time consumption on the PC system (ms). The middle columns give the mean time consumption of the individual detection steps.

Method | Front face | Deflected face | Open eye | Close eye | Other | Single frame (longest) | Single frame (shortest)
Our method | 16 | 18 | 5 | 4 | 12 | 73 | 33
Traditional method | — | — | 96 | 97 | — | 245 | 93

Figure 8: Comparison of our method and the traditional method. The frames in the top row are detected by the eye classifier trained directly with Adaboost, while the frames in the bottom row are detected by our improved method.

Concerning the traditional method, in spite of detecting the eye directly in only one step, it costs more time, because it has to search with the eye classifier over the whole frame instead of a small candidate region as in our method.

Figure 8 shows a comparison of the detection performance of our method and the traditional method. It illustrates that our method can detect eyes precisely within the restricted face region, whereas the traditional method, which applies the eye classifier to the whole picture, produces many false detections. Based on this comparison experiment, we can conclude that our method outperforms the traditional method in terms of both detection speed and accuracy.

3.3. Determining the Fatigue Index. In order to judge the fatigue state automatically, we must fix a threshold parameter that separates alertness from fatigue. There is an obvious change in PERCLOS values between the beginning and the end of the driving experiment: the subject's blink frequency and closed-eye duration increase after the subject becomes fatigued. Figure 9 shows the PERCLOS values of six subjects in the periods of 10~17 minutes (the beginning) and 63~70 minutes (the end). From Figure 9 we can conclude that the PERCLOS values rise significantly at the end of the experiment, and the escalating rate of the mean values is more than 200%. Additionally, the six subjects' PERCLOS values show individual differences. The mean values of SUB1, SUB2, SUB4, and SUB5 are relatively low (<0.025) both at the beginning and at the end of the experiment, but for SUB3 and SUB6 the PERCLOS mean values are relatively higher than for the other subjects. For example, the PERCLOS values of SUB3 and SUB6 at the beginning and the end are 0.016, 0.042, 0.059, and 0.146, respectively.

The variation of the PERCLOS mean value of each subject is presented in Table 3. In this table, the PERCLOS values show individual differences. Although the PERCLOS values of SUB3 and SUB6 are considerably larger, their escalating rates remain as stable as those of the other subjects. From Table 3 we find that the escalating rate of all subjects lies between 200% and 300%, except for SUB1. Thus it is a good choice to treat the escalating rate of the PERCLOS value as an index for evaluating driver fatigue. In this paper we judge the driver as fatigued when his PERCLOS escalating rate is more than 200%.
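
For illustration, the escalating rate can be computed as the relative increase of the current PERCLOS over the baseline, which reproduces the rates in Table 3 (e.g., (0.025 − 0.007)/0.007 ≈ 257% for Sub 2). The functions below are a sketch, not the authors' implementation.

def perclos_escalating_rate(perclos_now, perclos_baseline):
    # Relative increase of PERCLOS over the baseline, in percent.
    if perclos_baseline <= 0:
        return float("inf")          # degenerate baseline
    return (perclos_now - perclos_baseline) / perclos_baseline * 100.0

def is_fatigued(perclos_now, perclos_baseline, threshold_percent=200.0):
    # Fatigue is judged when the escalating rate exceeds 200% (the paper's rule).
    return perclos_escalating_rate(perclos_now, perclos_baseline) > threshold_percent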

4. Fatigue Detection on Smart Device

Nowadays, smartphones and tablet devices have quite powerful processing capacity, and most of them are also equipped with high-resolution cameras (front or rear). Such smart devices run operating systems such as Android and iOS. The Android operating system in particular is free and open, so it is very convenient for users to do secondary development. Furthermore, OpenCV is compatible with these Android-based devices. Therefore, it is feasible to transplant our machine-vision-based fatigue detection system to smart devices. Accordingly, we designed a Driving Fatigue Detection Warning System (DFDWS) on an Android smart device using machine vision. The composition of the detection system is shown in Figure 10.

This system is developed on the Android OS and the OpenCV library for Android, called OpenCV4Android. OpenCV4Android is available as a Java library on the Internet, and it is easy to install and configure the OpenCV4Android SDK on a smart handheld device by following the tutorial.


Figure 9: PERCLOS values at the beginning (10~17 min) and the end (63~70 min) of the driving experiment, one panel per subject (Sub 1–Sub 6); each panel plots PERCLOS against time (min) together with the mean values of the two periods.

Table 3: Mean values of PERCLOS at different stages for the 6 subjects.

PERCLOS | Sub 1 | Sub 2 | Sub 3 | Sub 4 | Sub 5 | Sub 6 | Mean
10~17 min (at the beginning) | 0.001 | 0.007 | 0.016 | 0.001 | 0.002 | 0.042 | 0.012
63~70 min (at the end) | 0.011 | 0.025 | 0.059 | 0.003 | 0.006 | 0.146 | 0.042
Difference | 0.010 | 0.018 | 0.043 | 0.002 | 0.004 | 0.104 | 0.030
Escalating rate (%) | 1000.0 | 257.1 | 268.8 | 200.0 | 200.0 | 247.6 | 362.3


Figure 10: Composition of the detection system. On the smartphone or tablet, the Driver Fatigue Detection Warning System invokes the camera to collect facial images, runs imaging processing and fatigue detection with the OpenCV library to obtain the PERCLOS index and closed-eye duration status, and then invokes the speaker to warn the driver or the dialer to call for emergency help.

Figure 11: Detection results on the tablet.

OpenCV4Android provides an interface named CvCameraViewListener to capture camera frames and a class CascadeClassifier to detect objects. In our application we create several CascadeClassifier instances, implemented as different classifiers corresponding to the face, the open eye, and the closed eye. When a frame is captured, the method onCameraFrame is invoked automatically; in this method the frame image is translated into the class Mat, which represents images in OpenCV. Then the method detectMultiScale is invoked on the CascadeClassifier to detect the face features.
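
For readers without an Android environment, the same capture-and-detect loop can be sketched on a desktop with Python/OpenCV. This is only an illustrative analogue of the Android flow described above (camera listener, per-frame callback, cascade detection), not the authors' Java code, and the cascade file name is a placeholder.

import cv2

# Placeholder path; the actual trained cascades are the authors' own.
face_cascade = cv2.CascadeClassifier("front_face_cascade.xml")
if face_cascade.empty():
    raise SystemExit("cascade file not found")

cap = cv2.VideoCapture(0)          # device camera, analogous to CvCameraViewListener
while True:
    ok, frame = cap.read()         # analogous to onCameraFrame delivering a Mat
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.1, 3):
        cv2.rectangle(frame, (x, y), (x + w, y + h), (255, 255, 255), 2)
    cv2.imshow("DFDWS sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()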

Thus, any Android phone or tablet with a camera, dialer, and speaker can host this system. DFDWS uses the OpenCV library to call the camera to collect facial images; the image frames in memory constitute an image stream, which the proposed algorithm then processes. After that, the system judges the driver's fatigue state according to PERCLOS or the duration of the closed-eye state. If the driver falls into a fatigue state, the system turns on the speaker to remind him/her or dials the emergency center.

The detection result is also shown on the screen in real time. The interface of the transplanted system and a detection result are shown in Figure 11.

In this paper, our system runs well on a Nexus 7 tablet with a 1 GHz CPU and 1 GB RAM. At the beginning of the experiment, the system collects data for a period of 10 min to calculate the PERCLOS index as a baseline for judging the fatigue state. During this period, drivers should adjust the device and their body posture to make sure that the system can accurately identify the face and eyes. The PERCLOS escalating rate with respect to this baseline is used to detect fatigue on the smart handheld system: when the PERCLOS escalating rate increases by more than 200%, the system on the Nexus 7 tablet judges the driver as fatigued. Table 4 shows the detection performance on the Nexus 7. It gives the average time consumption of the various classifiers and also the shortest and longest processing times for a single frame. Compared with the detection performance on the PC system (see Table 2), the time consumed by each step is longer on the Nexus 7 tablet, because the clock frequency of the tablet is much lower than that of the PC (2.1 GHz); accordingly, the longest and shortest processing times for a single frame are also longer. In the normal situation, that is, the driver facing almost straight ahead, the performance of DFDWS on the Nexus 7 tablet can reach 14 frames/s.


Table 4: Average time consumption of each step on the tablet system (ms).

Running item | Consumed time
Front face detection | 39
Left and right deflected face detection | 49
Open state detection | 8
Close state detection | 4
Other time consumption | 36
Longest processing time of a single frame | 98
Shortest processing time of a single frame | 70

That means a mobile device can smoothly run this application. In the worst case, such as a side-face situation, the system performance decreases to about 10 frames/s.
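
These frame rates are consistent with the per-frame times in Table 4:

1000 ms / 70 ms ≈ 14 frames/s (shortest, near-frontal case), 1000 ms / 98 ms ≈ 10 frames/s (longest, side-face case).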

5. Conclusions

This paper presents a practical driving fatigue detection system based on the Adaboost algorithm. We proposed a new strategy for detecting the eye state instead of detecting the eyes directly. In our detection strategy, we first detect the face efficiently using classifiers for both the front face and the deflected face. Then the eye candidate region is determined according to the geometric distribution of the facial organs. Finally, the trained open-eyes and closed-eyes classifiers are used to detect the eyes in the candidate region quickly and accurately. As a result, the PERCLOS escalating rate can be calculated and used as the fatigue index: when the PERCLOS escalating rate increases by more than 200%, the driver is considered to be in a fatigue state. Moreover, this paper implemented a Driving Fatigue Detection Warning System and transplanted it to a Nexus 7 tablet. The system decides whether the driver is fatigued according to PERCLOS and the duration of the closed-eyes state. Experiments demonstrated that the proposed system has high accuracy; meanwhile, the processing speed can reach 30 fps on the PC and 14 fps on the tablet, which meets the real-time requirement.

Of course, this system could be further improved in detection accuracy and speed by using discrete cosine coefficients [24] and the covariance feature [25], respectively. In addition, this paper has avoided conditions of poor illumination; this should be addressed in future research.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Authors' Contribution

Wanzeng Kong and Lingxiao Zhou contributed equally to this work.

Acknowledgment

This work was supported by the Major International Cooperation Project of Zhejiang Province (Grant no. 2011C14017), the National Natural Science Foundation of China (Grant no. 61102028), the International Science & Technology Cooperation Program of China (Grant no. 2014DFG12570), and the Zhejiang Provincial Natural Science Foundation of China (Grant no. LY13F020033). The authors also thank all subjects who were involved in the driving experiment.

References

[1] S. R. Paul, NHTSA's Drowsy Driver Technology Program, 2005.

[2] L. Wang, X. Wu, and M. Yu, "Review of driver fatigue/drowsiness detection methods," Journal of Biomedical Engineering, vol. 24, no. 1, pp. 245–248, 2007.

[3] G. Borghini, L. Astolfi, G. Vecchiato, D. Mattia, and F. Babiloni, "Measuring neurophysiological signals in aircraft pilots and car drivers for the assessment of mental workload, fatigue and drowsiness," Neuroscience and Biobehavioral Reviews, vol. 44, pp. 58–75, 2014.

[4] T. Myllyla, V. Korhonen, E. Vihriala et al., "Human heart pulse wave responses measured simultaneously at several sensor placements by two MR-compatible fibre optic methods," Journal of Sensors, vol. 2012, Article ID 769613, 8 pages, 2012.

[5] M. Simon, E. A. Schmidt, W. E. Kincses et al., "EEG alpha spindle measures as indicators of driver fatigue under real traffic conditions," Clinical Neurophysiology, vol. 122, no. 6, pp. 1168–1178, 2011.

[6] S. K. L. Lal and A. Craig, "Reproducibility of the spectral components of the electroencephalogram during driver fatigue," International Journal of Psychophysiology, vol. 55, no. 2, pp. 137–143, 2005.

[7] B. T. Jap, S. Lal, P. Fischer, and E. Bekiaris, "Using EEG spectral components to assess algorithms for detecting fatigue," Expert Systems with Applications, vol. 36, no. 2, pp. 2352–2359, 2009.

[8] J. Krajewski, D. Sommer, U. Trutschel et al., "Steering wheel behavior based estimation of fatigue," in Proceedings of the 5th International Driving Symposium on Human Factors in Driver Assessment, Training and Vehicle Design, pp. 118–124, 2009.

[9] Y. Wu, J. Li, D. Yu, and H. Cheng, "Research on quantitative method about driver reliability," Journal of Software, vol. 6, no. 6, pp. 1110–1116, 2011.

[10] A. Eskandarian and A. Mortazavi, "Evaluation of a smart algorithm for commercial vehicle driver drowsiness detection," in Proceedings of the IEEE Intelligent Vehicles Symposium, pp. 553–559, June 2007.

[11] C. X. Huang, W. C. Zhang, C. G. Huang, and Y. J. Zhong, "Identification of driver state based on ARX in the automobile simulator," Technology & Economy in Areas of Communications, vol. 10, no. 2, pp. 60–63, 2008.

[12] Z. Zhang and J.-S. Zhang, "Driver fatigue detection based intelligent vehicle control," in Proceedings of the 18th International Conference on Pattern Recognition (ICPR '06), vol. 2, pp. 1262–1265, IEEE, Hong Kong, August 2006.

[13] Q. Wang, J. Yang, M. Ren, and Y. Zheng, "Driver fatigue detection: a survey," in Proceedings of the 6th World Congress on Intelligent Control and Automation (WCICA '06), vol. 2, pp. 8587–8591, Dalian, China, 2006.

[14] Z. Zhang and J. Zhang, "A new real-time eye tracking for driver fatigue detection," in Proceedings of the IEEE 6th International Conference on ITS Telecommunications, pp. 8–11, June 2006.

[15] R. Parsai and P. Bajaj, "Intelligent monitoring system for driver's alertness (a vision based approach)," in Knowledge-Based Intelligent Information and Engineering Systems, vol. 4692 of Lecture Notes in Computer Science, pp. 471–477, Springer, Berlin, Germany, 2007.

[16] M. Eriksson and N. P. Papanikolopoulos, "Eye-tracking for detection of driver fatigue," in Proceedings of the IEEE Conference on Intelligent Transportation Systems (ITSC '97), pp. 314–319, 1997.

[17] S. M. Shi, L. S. Jin, R. B. Wang, and B. L. Tong, "Driver mouth monitoring method based on machine vision," Journal of JiLin University, vol. 34, no. 2, pp. 232–236, 2004.

[18] W. Wang, J. Xi, and H. Chen, "Modeling and recognizing driver behavior based on driving data: a survey," Mathematical Problems in Engineering, vol. 2014, Article ID 245641, 20 pages, 2014.

[19] Q. Tan, M. Yang, T. Luo et al., "A novel interdigital capacitor pressure sensor based on LTCC technology," Journal of Sensors, vol. 2014, Article ID 431503, 6 pages, 2014.

[20] Y. Freund and R. E. Schapire, "A decision-theoretic generalization of on-line learning and an application to boosting," in Computational Learning Theory, vol. 904 of Lecture Notes in Computer Science, pp. 23–37, Springer, Berlin, Germany, 1995.

[21] P. Viola and M. J. Jones, "Robust real-time object detection," International Journal of Computer Vision, vol. 57, no. 2, pp. 137–154, 2004.

[22] D. F. Dinges and R. Grace, PERCLOS: A Valid Psychophysiological Measure of Alertness as Assessed by Psychomotor Vigilance, Federal Highway Administration, Office of Motor Carriers, 1998.

[23] W. Kong, Z. Zhou, L. Zhou, Y. Dai, G. Borghini, and F. Babiloni, "Estimation for driver fatigue with phase locking value," International Journal of Bioelectromagnetism, vol. 14, no. 3, pp. 115–120, 2012.

[24] J. P. Dukuzumuremyi, B. Zou, C. P. Mukamakuza, D. Hanyurwimfura, and E. Masabo, "Discrete cosine coefficients as images features for fire detection based on computer vision," Journal of Computers, vol. 9, no. 2, pp. 295–300, 2014.

[25] R. Li and C. F. Li, "Adaboost face detection based on improved covariance feature," Journal of Computers, vol. 9, no. 5, pp. 1077–1082, 2014.




Page 4: Research Article A System of Driving Fatigue Detection ...Research Article A System of Driving Fatigue Detection Based on Machine Vision and Its Application on Smart Device WanzengKong,

4 Journal of Sensors

(a)

(b)

Figure 3 Estimation of eyes candidate region

(1) The height of eyes region can be constrained to lessthan 13 height of the face and more than 110 heightof the face

(2) The width of single eye region is less than 12 width ofthe face and more than 14 width of the face

To guarantee the eyes classifiersrsquo performance withoutthe influence of face segmentation we make the followingconstraints

(1) 119910 = 511986716(2) ℎ = 511986724

where 119910 and ℎ represent the highest pixel and the height ofestimated eyes region respectively and 119867 represents heightof face region

Figure 3(b) shows the result after partition and it illus-trates that the region in face which is below eyebrows andabove nostril can be used as eyes candidate region Inthis region eyes are detected by open-eyes and closed-eyesclassifiers This method avoids regions of nostril mouth andabove forehead that could affect human eyes detection Inpractical test it obviously reduces the error probability ofdetection

The open-eyes classifier is used first to detect the target area. If the open-eyes state is detected, the result is marked with a red rectangle. If it loses the target, the closed-eyes classifier is switched in, and a detected closed-eye state is marked with a yellow rectangle. If the closed-eyes classifier also loses the target, the frame is regarded as an eye-detection exception.
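The switching logic between the two eye classifiers can be sketched as follows; the code assumes OpenCV's Java CascadeClassifier, pre-trained cascade files whose paths are placeholders, and that the OpenCV native library has already been loaded, so it is an illustrative outline rather than our exact implementation.

```java
import org.opencv.core.Mat;
import org.opencv.core.MatOfRect;
import org.opencv.objdetect.CascadeClassifier;

/** Illustrative per-frame eye-state decision: open-eye classifier first, then closed-eye classifier. */
public class EyeStateDetector {

    public enum EyeState { OPEN, CLOSED, EYE_EXCEPTION }

    private final CascadeClassifier openEyeCascade;
    private final CascadeClassifier closedEyeCascade;

    /** The XML paths are placeholders for the Adaboost cascades trained in advance. */
    public EyeStateDetector(String openEyeXml, String closedEyeXml) {
        openEyeCascade = new CascadeClassifier(openEyeXml);
        closedEyeCascade = new CascadeClassifier(closedEyeXml);
    }

    /** eyeRegion is the grayscale sub-image of the candidate eye band cut from the face. */
    public EyeState detect(Mat eyeRegion) {
        MatOfRect hits = new MatOfRect();

        openEyeCascade.detectMultiScale(eyeRegion, hits);     // try the open-eye classifier first
        if (!hits.empty()) {
            return EyeState.OPEN;                             // marked with a red rectangle in the system
        }

        closedEyeCascade.detectMultiScale(eyeRegion, hits);   // fall back to the closed-eye classifier
        if (!hits.empty()) {
            return EyeState.CLOSED;                           // marked with a yellow rectangle
        }

        return EyeState.EYE_EXCEPTION;                        // neither classifier found a target
    }
}
```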

2.5. Fatigue Judgment with Multiple Indexes. PERCLOS is the percentage of time spent in the closed-eye state within a specific time interval (1 min or 30 s) [22]. PERCLOS is a well-recognized and effective measure of neurophysiological fatigue level. For each frame, one result is output at the end of detection: open-eyes state, closed-eyes state, face exception, or eyes exception. The PERCLOS curve is drawn in real time by counting the detection results over a certain number of frames in a fixed period, as shown in Figure 4. Fatigue can then be detected by analyzing PERCLOS and the duration of the closed-eyes state.

Figure 4: PERCLOS of human eye states over time (open-eye detection, closed-eye detection, face detection exception, and eye detection exception).

Only two eye states (open and closed) are detected in this system; the degree of eye openness is not analyzed. PERCLOS is therefore simplified to the percentage of time during which the eyes are completely closed within 30 seconds. The simplified PERCLOS value is

PERCLOS = (t / 30) × 100%, (6)

where t is the duration (in seconds) for which the eyes are completely closed.

The duration of the closed-eyes state during a driver's blink increases when fatigue occurs. In this paper, a duration of 1.2 seconds is used as the threshold for judging the fatigue state of drivers. Hence, driver fatigue can be detected with two indexes: the PERCLOS index and the 1.2 s duration threshold.
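A minimal sketch of this two-index judgment is given below. It assumes a frame rate of 25 fps and takes the 1.2 s duration threshold from above; the class name and the PERCLOS alarm threshold are our own illustration (Section 3.3 uses the escalating rate over a baseline instead of a fixed PERCLOS value).

```java
import java.util.ArrayDeque;
import java.util.Deque;

/** Illustrative rolling computation of simplified PERCLOS and closed-eye duration (assumes 25 fps). */
public class FatigueJudge {

    private static final int FPS = 25;
    private static final int WINDOW_FRAMES = 30 * FPS;          // the 30 s window of formula (6)
    private static final double MAX_CLOSED_SECONDS = 1.2;       // duration threshold from the text
    private static final double PERCLOS_ALARM_PERCENT = 10.0;   // illustrative only, not from the paper

    private final Deque<Boolean> window = new ArrayDeque<>();   // true = eyes completely closed
    private int closedInWindow = 0;
    private int currentClosedRun = 0;

    /** Feed the result of one frame; returns true when either fatigue index fires. */
    public boolean onFrame(boolean eyesClosed) {
        window.addLast(eyesClosed);
        if (eyesClosed) closedInWindow++;
        if (window.size() > WINDOW_FRAMES) {
            boolean removedWasClosed = window.removeFirst();    // keep only the last 30 s of results
            if (removedWasClosed) closedInWindow--;
        }

        currentClosedRun = eyesClosed ? currentClosedRun + 1 : 0;

        double perclos = 100.0 * closedInWindow / WINDOW_FRAMES; // PERCLOS = t/30 x 100%
        double closedSeconds = (double) currentClosedRun / FPS;  // length of the current closure

        return closedSeconds >= MAX_CLOSED_SECONDS || perclos >= PERCLOS_ALARM_PERCENT;
    }
}
```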

A driving fatigue detection system was developed using PERCLOS and eye states, as shown in Figure 5. The system reports fatigue and rings an alarm when either of the two indexes satisfies its condition.

3. Experiment and Data Analysis

Our system runs on a PC with an Intel Core(TM)2 Duo 2.10 GHz CPU and 2 GB RAM, using the third-party library OpenCV; system tests were performed in Visual Studio 2008. A racing game was used to provide the driving conditions. Nine subjects took part in the experiment in this simulated driving environment. The resolution of the videos is 352 × 288 pixels and the frame rate is 25 fps. Each frame contains a human face.

We recorded ten groups of video streams in this experiment. The videos are divided into two categories. The first category (i.e., the 1st–4th groups) shows various facial expressions that occurred in three subjects' simulated driving experiments. There are 1630 frames in each group of videos. These videos are used to test the performance of face detection and eye detection.

The second category consists of the remaining six groups (i.e., the 5th–10th). Six subjects were asked to carry out driving tasks long enough to finally become fatigued. These videos are used to extract the fatigue index. In order to collect valid videos for the assessment of driver fatigue, subjects took part in training and experimental sessions. The whole experiment for one subject could last two or more days, depending on his training performance. Before the experiment, subjects were asked not to eat chocolate or drink coffee or alcohol. The length of each video is about 70 minutes.


Figure 5: System flow chart (frame input, preprocessing, first and second face detection, eye detection, fatigue judgment, and result output).

Table 1: Experimental results.

Detected target | Target (frames) | Correct detection (frames) | Missed detection (frames) | Detection rate (%)
Human face      | 6520            | 6214                       | 293                        | 95
Open state      | 5392            | 4912                       | 480                        | 91
Close state     | 1128            | 1015                       | 113                        | 90

There are 105,000 frames in each group of these videos. In the first 55 minutes of the experiment, subjects were asked to perform intensive simulated driving. Many curves and steep slopes were presented in the driving conditions, and extra tasks of alert and vigilance (TAV) [23] were imposed to ensure that subjects concentrated highly on driving during the experiment. In the last 15 minutes, we relieved the subjects' stress by reducing the tasks and using a flat road with fewer curves. The monotony of driving induced the drivers to become fatigued. Obvious features of fatigue for subjects in this simulated driving can be summarized as increased blink count, blink frequency, and duration of the closed-eyes state.

3.1. Detection Performance of Our Method. Figure 6 shows the results of human face detection. To distinguish the detection results, the result of the front face classifier is marked with a white rectangle and the result of the deflected face classifier with a green rectangle. The first row shows the results detected by the front face classifier (white rectangles); the face is lost in the 3rd frame because the rightward deflection angle of the face is too large. The second row shows the results detected by the deflected face classifiers (green rectangles); here the face is lost in the first and fourth frames because the faces are looking straight ahead. The third row shows the results detected by the two classifiers together: all faces, in both front and deflected conditions, are detected successfully. This method can carry out real-time detection with high accuracy.
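The two-pass face detection can be outlined as follows (a sketch assuming OpenCV's Java bindings; the cascade file paths and the class name are placeholders, and this is not our exact implementation).

```java
import org.opencv.core.Mat;
import org.opencv.core.MatOfRect;
import org.opencv.core.Rect;
import org.opencv.objdetect.CascadeClassifier;

/** Illustrative two-pass face detection: front-face cascade first, deflected-face cascade as fallback. */
public class FaceDetector {

    private final CascadeClassifier frontCascade;       // cascade XML paths below are placeholders
    private final CascadeClassifier deflectedCascade;

    public FaceDetector(String frontXml, String deflectedXml) {
        frontCascade = new CascadeClassifier(frontXml);
        deflectedCascade = new CascadeClassifier(deflectedXml);
    }

    /** Returns the first detected face, or null when both classifiers miss (a face exception). */
    public Rect detect(Mat grayFrame) {
        MatOfRect faces = new MatOfRect();

        frontCascade.detectMultiScale(grayFrame, faces);       // first pass: front face (white box)
        if (!faces.empty()) {
            return faces.toArray()[0];
        }

        deflectedCascade.detectMultiScale(grayFrame, faces);   // second pass: deflected face (green box)
        if (!faces.empty()) {
            return faces.toArray()[0];
        }

        return null;                                           // face detection exception for this frame
    }
}
```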

Figure 7 shows the results of eye localization and state recognition under different expressions. Eyes are marked with rectangles: the closed-eye state is targeted with a yellow rectangle and the open-eye state with a red rectangle.

In this section we also report the general performance of the proposed method in terms of face detection, eye localization, and state recognition. We ran our method on the first category of videos (i.e., the 1st–4th groups), which contains 6520 frames consisting of open-eye and closed-eye states. The correctly detected frames were counted manually. Detailed results are presented in Table 1, from which we can conclude that the proposed system works quite well, with an accuracy of at least 90%.

3.2. Comparison Experiment. Compared with the traditional Adaboost method, which trains eye classifiers to detect the eye state directly, we proposed an improved strategy to detect the eye state. The proposed strategy comprises 3 steps and 2 categories of classifiers, corresponding to face and eye, respectively. Specifically, we first detect the face efficiently with classifiers for both the front face and the deflected face. Then the candidate eye region is determined according to the geometric distribution of facial organs. Finally, trained classifiers for open eyes and closed eyes are used to detect eyes in the candidate region quickly and accurately. Table 2 shows the comparison of elapsed time for the two methods on the PC system; the shortest and longest times for processing a single frame are also presented. Concerning our method, although it needs to detect both face and eyes, the average processing speed of the system is up to 20 fps, so the system performs well in real time. The longest processing time for a single frame is more than 2 times the shortest one, because face and eye detection may each have to be attempted more than once when the face is deflected.


Figure 6: Results of face detection.

Figure 7: Results of eye localization and state recognition.


Table 2: Comparison of time consumption on the PC system (ms). The first five columns give the mean time consumption of the individual detection steps.

Method             | Front face | Deflected face | Open eye | Close eye | Other | Single frame (longest) | Single frame (shortest)
Our method         | 16         | 18             | 5        | 4         | 12    | 73                     | 33
Traditional method | —          | —              | 96       | 97        | —     | 245                    | 93

Figure 8: Comparison of results between our method and the traditional method. The frames in the top row are detected by the eye classifier trained directly with Adaboost, while the frames in the bottom row are detected by our improved method.

Concerning the traditional method, although it detects the eyes directly in a single step, it costs more time, because the eye classifier has to search the whole frame instead of a small candidate region as in our method.

Figure 8 compares the detection performance of our method and the traditional method. It illustrates that our method can detect eyes precisely within the restricted face region, whereas the traditional method employs the eye classifier over the whole picture and consequently produces many false detections. Based on this comparison experiment, we can conclude that our method outperforms the traditional method in terms of both detection speed and accuracy.

3.3. Determining the Fatigue Index. In order to judge the fatigue state automatically, we have to fix a threshold parameter that separates alertness from fatigue. There is an obvious change in PERCLOS values between the beginning and the end of the driving experiment: the subjects' blink frequency and closed-eye durations increase after they become fatigued. Figure 9 shows the PERCLOS values of the six subjects in the periods of 10–17 minutes (the beginning) and 63–70 minutes (the end). From Figure 9 we can conclude that PERCLOS values rise significantly at the end of the experiment, and the escalating rate of the mean values is more than 200%. Additionally, the six subjects' PERCLOS values show individual differences. The mean values of some subjects, such as SUB1, SUB2, SUB4, and SUB5, are relatively low (<0.025) both at the beginning and at the end of the experiment, whereas for other subjects, such as SUB3 and SUB6, the PERCLOS mean values are relatively high. For example, the PERCLOS values of SUB3 and SUB6 at the beginning and the end are 0.016, 0.042, 0.059, and 0.146, respectively.

The variation of the PERCLOS mean values of each subject is presented in Table 3. In this table, the PERCLOS value shows individual differences. Although the PERCLOS values of SUB3 and SUB6 are comparatively large, their escalating rates remain as stable as those of the other subjects. From Table 3 we find that the escalating rate of all subjects lies between 200% and 300%, except for SUB1. Thus, the escalating rate of the PERCLOS value is a good choice as an index for evaluating driver fatigue. In this paper, we judge the driver as fatigued when his PERCLOS escalating rate is more than 200%.

4. Fatigue Detection on Smart Device

Nowadays, smartphones and tablets have quite powerful processing capacity, and most of them are also equipped with high-resolution cameras (front or rear). Such smart devices run operating systems such as Android or iOS. The Android operating system in particular is free and open, so it is very convenient for users to do secondary development, and OpenCV is compatible with Android-based devices. It is therefore feasible to transplant our machine-vision-based fatigue detection system onto smart devices. Accordingly, we designed a Driving Fatigue Detection Warning System (DFDWS) on an Android smart device with machine vision. The composition of the detection system is shown in Figure 10.

The system is developed on the Android OS with the OpenCV library, a combination called OpenCV4Android. OpenCV4Android is available as a Java library on the Internet, and it is easy to install and configure the OpenCV4Android SDK on a smart handheld device by following the tutorial. OpenCV4Android provides an interface named CvCameraViewListener to catch camera frames and a class CascadeClassifier to detect objects.

Figure 9: PERCLOS values in the beginning (10–17 min) and the end (63–70 min) of the driving experiment for subjects 1–6.

Table 3: Mean values of PERCLOS at different stages for the 6 subjects.

PERCLOS                      | Sub 1   | Sub 2  | Sub 3  | Sub 4  | Sub 5  | Sub 6  | Mean
10–17 min (at the beginning) | 0.001   | 0.007  | 0.016  | 0.001  | 0.002  | 0.042  | 0.012
63–70 min (at the end)       | 0.011   | 0.025  | 0.059  | 0.003  | 0.006  | 0.146  | 0.042
Difference                   | 0.010   | 0.018  | 0.043  | 0.002  | 0.004  | 0.104  | 0.030
Escalating rate              | 1000.0% | 257.1% | 268.8% | 200.0% | 200.0% | 247.6% | 362.3%

Figure 10: Detection system composition (camera, facial image processing, fatigue detection with the OpenCV library, PERCLOS index and closed-eye duration, and warning via the speaker or dialer).

Figure 11: Detection result on tablet.

In our application we create several CascadeClassifier instances, loaded with the different trained classifiers for the face, the open eye, and the closed eye. When a frame is captured, the method onCameraFrame is invoked automatically. In this method the frame is delivered as a Mat, the class that represents images in OpenCV. The method detectMultiScale of CascadeClassifier is then invoked to detect the face features.
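The following sketch shows how these OpenCV4Android classes fit together for the face detection part of the per-frame pipeline; the class name, the cascade path, and the inlined eye-region computation are illustrative, and the drawing of rectangles and the eye-state cascades are omitted.

```java
import org.opencv.android.CameraBridgeViewBase;
import org.opencv.core.Mat;
import org.opencv.core.MatOfRect;
import org.opencv.core.Rect;
import org.opencv.imgproc.Imgproc;
import org.opencv.objdetect.CascadeClassifier;

/** Illustrative OpenCV4Android frame listener: face detection inside onCameraFrame. */
public class FrameProcessor implements CameraBridgeViewBase.CvCameraViewListener {

    private final CascadeClassifier faceCascade;   // the cascade path passed in is a placeholder
    private final Mat gray = new Mat();

    public FrameProcessor(String faceCascadePath) {
        faceCascade = new CascadeClassifier(faceCascadePath);
    }

    @Override public void onCameraViewStarted(int width, int height) { }

    @Override public void onCameraViewStopped() { gray.release(); }

    /** Invoked automatically for every captured camera frame, delivered as an OpenCV Mat. */
    @Override
    public Mat onCameraFrame(Mat inputFrame) {
        Imgproc.cvtColor(inputFrame, gray, Imgproc.COLOR_RGBA2GRAY);

        MatOfRect faces = new MatOfRect();
        faceCascade.detectMultiScale(gray, faces);             // face detection on the current frame
        for (Rect face : faces.toArray()) {
            // Candidate eye band inside the face (y = 5H/16, h = 5H/24, as in Section 2).
            Rect eyeBand = new Rect(face.x, face.y + 5 * face.height / 16,
                                    face.width, 5 * face.height / 24);
            Mat eyeRegion = gray.submat(eyeBand);
            // ...run the open-eye / closed-eye cascades on eyeRegion and update PERCLOS here...
            eyeRegion.release();
        }
        return inputFrame;                                     // the returned Mat is shown on screen
    }
}
```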

Thus any Android phone or tablet with a camera, dialer, and speaker can host this system. DFDWS uses the OpenCV library to call the camera and collect facial images. The image frames in memory constitute an image stream, which is processed by the proposed algorithm. The system then judges the driver's fatigue state according to PERCLOS or the duration of the closed-eye state. If the driver falls into the fatigue state, the system turns on the speaker to remind him/her or dials the emergency center.

The detection result is also shown on the screen in real time. The interface of the transplanted system and its detection results are shown in Figure 11.

Our system runs well on a Nexus 7 tablet with a 1 GHz CPU and 1 GB RAM. At the beginning of the experiment, the system collects data for a period of 10 min to calculate the PERCLOS index as a baseline for judging the fatigue state. During this period, drivers should adjust the device and their body posture to make sure that the system can accurately identify faces and eyes. The PERCLOS escalating rate with respect to this baseline is used to detect fatigue on the smart handheld system: when the PERCLOS escalating rate increases by more than 200%, the system on the Nexus 7 tablet judges the driver as fatigued. Table 4 shows the detection performance on the Nexus 7. It gives the average time consumption of the various classifiers and also the shortest and longest processing times for a single frame. Compared to the detection performance on the PC system (see Table 2), the time consumed by each step is longer on the Nexus 7 tablet. Because the clock frequency of the tablet is much lower than that of the PC (2.1 GHz), the longest and shortest processing times for a single frame are correspondingly longer. In the normal situation, that is, with the driver almost facing the camera, DFDWS on the Nexus 7 tablet achieves a speed of 14 frames/sec.


Table 4: Average time consumption of each step on the tablet system (ms).

Running item                               | Time consumed
Front face detection                       | 39
Left and right deflected face detection    | 49
Open state detection                       | 8
Close state detection                      | 4
Other time consumption                     | 36
Longest processing time of a single frame  | 98
Shortest processing time of a single frame | 70

That means the mobile device can run this application smoothly. In the worst case, such as a side-facing driver, the system performance decreases to 10 frames/sec.
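The fatigue decision on the tablet thus reduces to comparing the current PERCLOS with the 10 min baseline. A minimal sketch follows; the 200% threshold and the escalating-rate definition follow the text and Table 3, while the class name and the example values are illustrative.

```java
/** Illustrative fatigue judgment from the PERCLOS escalating rate over a 10 min baseline. */
public class EscalatingRateJudge {

    private static final double ESCALATING_RATE_THRESHOLD = 200.0;  // percent, as used on the tablet

    private final double baselinePerclos;   // mean PERCLOS collected during the first 10 min

    public EscalatingRateJudge(double baselinePerclos) {
        this.baselinePerclos = baselinePerclos;
    }

    /** Escalating rate = (current - baseline) / baseline x 100%, as in Table 3. */
    public double escalatingRate(double currentPerclos) {
        return (currentPerclos - baselinePerclos) / baselinePerclos * 100.0;
    }

    public boolean isFatigued(double currentPerclos) {
        return escalatingRate(currentPerclos) > ESCALATING_RATE_THRESHOLD;
    }

    public static void main(String[] args) {
        // Example values in the range of Table 3: baseline 0.016, later value 0.059.
        EscalatingRateJudge judge = new EscalatingRateJudge(0.016);
        System.out.printf("rate = %.1f%%, fatigued = %b%n",
                judge.escalatingRate(0.059), judge.isFatigued(0.059));
    }
}
```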

5. Conclusions

This paper presents a practical driving fatigue detection system based on the Adaboost algorithm. We proposed a new strategy that detects the eye state instead of detecting the eyes directly. In our detection strategy, we first detect the face efficiently using classifiers for both the front face and the deflected face; then the candidate eye region is determined according to the geometric distribution of facial organs; finally, trained classifiers for open eyes and closed eyes are used to detect eyes in the candidate region quickly and accurately. On this basis, the PERCLOS escalating rate can be calculated and used as the fatigue index: when the PERCLOS escalating rate increases by more than 200%, the driver is considered to be in the fatigue state. Moreover, this paper implemented a Driving Fatigue Detection Warning System and transplanted it onto a Nexus 7 tablet. The system decides whether the driver is fatigued according to PERCLOS and the duration of the closed-eyes state. Experiments demonstrated that the proposed system has high accuracy, while the processing speed can reach 30 fps on the PC and 14 fps on the tablet, which meets the real-time requirement.

Of course, the system could be further improved in detection accuracy and speed by using discrete cosine coefficients [24] and the covariance feature [25], respectively. In addition, this paper has not addressed conditions of poor illumination; this should be remedied in future research.

Conflict of Interests

The authors declare that there is no conflict of interests regarding the publication of this paper.

Authors' Contribution

Wanzeng Kong and Lingxiao Zhou contributed equally to this work.

Acknowledgment

This work was supported by the Major International Cooperation Project of Zhejiang Province (Grant no. 2011C14017), the National Natural Science Foundation of China (Grant no. 61102028), the International Science & Technology Cooperation Program of China (Grant no. 2014DFG12570), and the Zhejiang Provincial Natural Science Foundation of China (Grant no. LY13F020033). The authors also thank all subjects who were involved in the driving experiment.

References

[1] S. R. Paul, NHTSA's Drowsy Driver Technology Program, 2005.

[2] L. Wang, X. Wu, and M. Yu, "Review of driver fatigue/drowsiness detection methods," Journal of Biomedical Engineering, vol. 24, no. 1, pp. 245–248, 2007.

[3] G. Borghini, L. Astolfi, G. Vecchiato, D. Mattia, and F. Babiloni, "Measuring neurophysiological signals in aircraft pilots and car drivers for the assessment of mental workload, fatigue and drowsiness," Neuroscience and Biobehavioral Reviews, vol. 44, pp. 58–75, 2014.

[4] T. Myllyla, V. Korhonen, E. Vihriala et al., "Human heart pulse wave responses measured simultaneously at several sensor placements by two MR-compatible fibre optic methods," Journal of Sensors, vol. 2012, Article ID 769613, 8 pages, 2012.

[5] M. Simon, E. A. Schmidt, W. E. Kincses et al., "EEG alpha spindle measures as indicators of driver fatigue under real traffic conditions," Clinical Neurophysiology, vol. 122, no. 6, pp. 1168–1178, 2011.

[6] S. K. L. Lal and A. Craig, "Reproducibility of the spectral components of the electroencephalogram during driver fatigue," International Journal of Psychophysiology, vol. 55, no. 2, pp. 137–143, 2005.

[7] B. T. Jap, S. Lal, P. Fischer, and E. Bekiaris, "Using EEG spectral components to assess algorithms for detecting fatigue," Expert Systems with Applications, vol. 36, no. 2, pp. 2352–2359, 2009.

[8] J. Krajewski, D. Sommer, U. Trutschel et al., "Steering wheel behavior based estimation of fatigue," in Proceedings of the 5th International Driving Symposium on Human Factors in Driver Assessment, Training and Vehicle Design, pp. 118–124, 2009.

[9] Y. Wu, J. Li, D. Yu, and H. Cheng, "Research on quantitative method about driver reliability," Journal of Software, vol. 6, no. 6, pp. 1110–1116, 2011.

[10] A. Eskandarian and A. Mortazavi, "Evaluation of a smart algorithm for commercial vehicle driver drowsiness detection," in Proceedings of the IEEE Intelligent Vehicles Symposium, pp. 553–559, June 2007.

[11] C. X. Huang, W. C. Zhang, C. G. Huang, and Y. J. Zhong, "Identification of driver state based on ARX in the automobile simulator," Technology & Economy in Areas of Communications, vol. 10, no. 2, pp. 60–63, 2008.

[12] Z. Zhang and J.-S. Zhang, "Driver fatigue detection based intelligent vehicle control," in Proceedings of the 18th International Conference on Pattern Recognition (ICPR '06), vol. 2, pp. 1262–1265, IEEE, Hong Kong, August 2006.

[13] Q. Wang, J. Yang, M. Ren, and Y. Zheng, "Driver fatigue detection: a survey," in Proceedings of the 6th World Congress on Intelligent Control and Automation (WCICA '06), vol. 2, pp. 8587–8591, Dalian, China, 2006.

[14] Z. Zhang and J. Zhang, "A new real-time eye tracking for driver fatigue detection," in Proceedings of the IEEE 6th International Conference on ITS Telecommunications, pp. 8–11, June 2006.

[15] R. Parsai and P. Bajaj, "Intelligent monitoring system for driver's alertness (a vision based approach)," in Knowledge-Based Intelligent Information and Engineering Systems, vol. 4692 of Lecture Notes in Computer Science, pp. 471–477, Springer, Berlin, Germany, 2007.

[16] M. Eriksson and N. P. Papanikolopoulos, "Eye-tracking for detection of driver fatigue," in Proceedings of the IEEE Conference on Intelligent Transportation Systems (ITSC '97), pp. 314–319, 1997.

[17] S. M. Shi, L. S. Jin, R. B. Wang, and B. L. Tong, "Driver mouth monitoring method based on machine vision," Journal of JiLin University, vol. 34, no. 2, pp. 232–236, 2004.

[18] W. Wang, J. Xi, and H. Chen, "Modeling and recognizing driver behavior based on driving data: a survey," Mathematical Problems in Engineering, vol. 2014, Article ID 245641, 20 pages, 2014.

[19] Q. Tan, M. Yang, T. Luo et al., "A novel interdigital capacitor pressure sensor based on LTCC technology," Journal of Sensors, vol. 2014, Article ID 431503, 6 pages, 2014.

[20] Y. Freund and R. E. Schapire, "A decision-theoretic generalization of on-line learning and an application to boosting," in Computational Learning Theory, vol. 904 of Lecture Notes in Computer Science, pp. 23–37, Springer, Berlin, Germany, 1995.

[21] P. Viola and M. J. Jones, "Robust real-time object detection," International Journal of Computer Vision, vol. 57, no. 2, pp. 137–154, 2004.

[22] D. F. Dinges and R. Grace, PERCLOS: A Valid Psychophysiological Measure of Alertness as Assessed by Psychomotor Vigilance, Federal Highway Administration, Office of Motor Carriers, 1998.

[23] W. Kong, Z. Zhou, L. Zhou, Y. Dai, G. Borghini, and F. Babiloni, "Estimation for driver fatigue with phase locking value," International Journal of Bioelectromagnetism, vol. 14, no. 3, pp. 115–120, 2012.

[24] J. P. Dukuzumuremyi, B. Zou, C. P. Mukamakuza, D. Hanyurwimfura, and E. Masabo, "Discrete cosine coefficients as images features for fire detection based on computer vision," Journal of Computers, vol. 9, no. 2, pp. 295–300, 2014.

[25] R. Li and C. F. Li, "Adaboost face detection based on improved covariance feature," Journal of Computers, vol. 9, no. 5, pp. 1077–1082, 2014.

International Journal of

AerospaceEngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Active and Passive Electronic Components

Control Scienceand Engineering

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

RotatingMachinery

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation httpwwwhindawicom

Journal ofEngineeringVolume 2014

Submit your manuscripts athttpwwwhindawicom

VLSI Design

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Shock and Vibration

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Civil EngineeringAdvances in

Acoustics and VibrationAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Advances inOptoElectronics

Hindawi Publishing Corporation httpwwwhindawicom

Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

SensorsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Chemical EngineeringInternational Journal of Antennas and

Propagation

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Navigation and Observation

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

DistributedSensor Networks

International Journal of

Page 5: Research Article A System of Driving Fatigue Detection ...Research Article A System of Driving Fatigue Detection Based on Machine Vision and Its Application on Smart Device WanzengKong,

Journal of Sensors 5

Frame input

PreprocessingSuccess

Start End

Eye detection Fatigue judgment Result output

Success

FailureSecond timesface detection

First timesface detection

FailureSuccess

Failure

Set default face classifier

Is task overNo

Yes

Figure 5 System flow chart

Table 1 Experimental results

Detected target Target (frames) Correct detection (frames) Missed detection (frames) Detection rate ()Human face 6520 6214 293 95Open state 5392 4912 480 91Close state 1128 1015 113 90

There are 105000 frames in each group of videos In thefirst 55 minutes of experiment subjects are asked to takeintensive simulate driving Many curves and steep slopes arepresented in driving conditions And extra tasks of alert andvigilance (TAV) [23] were exerted in order to ensure subjectsconcentrating on driving highly during the experiment Inthe last 15 minutes we relieved subjectsrsquo stress by decliningmissions and using a flat road with fewer curves Themonotony of driving induced driver to be fatigued Obviousfeatures of fatigue for subjects in this simulate driving can besummarized as increased blink times blinks frequency andduration of closed-eyes state

31 Detection Performance of Our Method Figure 6 showsthe results of human face detection In order to distinguishdetection results the result of front face classifier was markedas a white rectangle and the result of deflected face classifierwas marked as a green rectangle The first row shows theresults which were detected by the front face classifiermarked with white rectangles Face was lost because the rightdeflection angle of face is too large in 3rd frame The secondrow shows the results which were detected by deflected faceclassifiers marked with green rectangles And in the secondrow face was lost by the deflected classifier in the first andfourth frame because the faces were in the state of lookingstraight aheadThe third row shows the results detected by thetwo classifiers Hence all faces including front and deflectedconditions were detected successfully This method can carryout real-time detection with a high accuracy

Figure 7 shows the results of eyes localization andthe state recognition under different expressions Eyes aremarked with rectangles specifically closed-eye state was

targeted with yellow rectangle And open-eye state wastargeted with red rectangle

In this section we also revealed the general performancesof the proposed method in terms of face detection eyeslocalization and state recognition We perform our methodon the first category of videos (ie 1stndash4th groups) whichcontained 6520 frames consisting of open-eye state and close-eye state The correct detected frames are manually countedDetail results are presented in Table 1 From Table 1 we canconclude that the proposed system works quite well withaccuracy of at least 90

32 Comparison Experiment Compared with the traditionalAdaboost method that trains eye classifiers to detect eyestate directly we proposed an improved strategy to detecteye state The proposed strategy mainly includes 3 stepsand 2 categories of classifiers corresponding to face andeye respectively Specifically we first detect face efficientlyby classifiers of both front face and deflected face Thencandidate region of eye is determined according to geometricdistribution of facial organs Finally trained classifiers ofopen eyes and closed eyes are used to detect eyes in thecandidate region quickly and accurately Table 2 shows thecomparison of elapsed time for 2 methods in PC systemTheshortest and longest time for processing a single frame arealso presented Concerning our method although it needsto detect both face and eye the average processing speedof the system is up to 20 fps Hence the system has a goodperformance of real time The longest processing time ofsingle frame is more than 2 times of the shortest one becauseit takes at least 2 times to detect face and eye state due tothe situation of deflected face Concerning the traditional

6 Journal of Sensors

Figure 6 Results of face detection

Figure 7 Results of eyes location and state recognition

Journal of Sensors 7

Table 2 Comparison of time consumption in PC system (ms)

Methods Mean time consumption of several detection stepsFront face Deflected face Open eye Close eye Other Single Frame (longest) Single Frame (shortest)

Our method 16 18 5 4 12 73 33Traditional method mdash mdash 96 97 mdash 245 93

Figure 8 Comparison results with our method and the traditional method The frames of top row are detected by direct Adaboost trainingeye classifier while the frames of bottom row are detected by our improved method

method in spite of detecting eye with only one step directlyit costs more time to detect eyes The reason is that it needsto use eye classifier searching in the whole frame instead of asmall candidate region as our method

Figure 8 shows comparison of detection performancebetween our method and the traditional method Figure 8illustrates that our method can detect eyes precisely inthe restricted face region However the traditional methodemployed the eye classifier to detect eye in the whole pictureAs a result there are a lot of false detections Based on thecomparison experiment we can conclude that our methodoutperforms the traditional method in terms of detectionspeed and accuracy

33 Determining the Fatigue Index In order to judge thefatigue state automatically we should fix the parameter ofthreshold to determine alert or fatigue There is an obviouschange of PERCLOS values at the beginning and the endof driving experiment The subjectrsquos blink frequency andclosed-eye state duration will increase after subjects becomefatigued Figure 9 shows PERCLOS values of six subjects inthe periods of 10sim17 (the beginning) and 63sim70 minutes (theend) From Figure 9 we can conclude that PERCLOS valuesrise significantly at the end of experiment and the escalatingrate of mean values is more than 200 Additionally sixsubjectsrsquo PERCLOS values showed individual differencesSubjectsrsquo mean values such as SUB1 SUB2 SUB4 and SUB5are relatively low (lt0025) both in the beginning and in theend of experiment But for some subjects such as SUB3 andSUB6 the PERCLOS mean values are relatively higher thanother subjects For example the PERCLOS values of SUB3and SUB6 at the beginning and the end are 0016 0042 0059and 0146 respectively

Variation of PERCLOS mean values of each subject arepresented in Table 3 In this table the PERCLOS value showsindividual differences Although the PERCLOS values ofSUB3 and SUB6 are extremely large their escalating rateskeep quite stable as the other subjects From Table 3 we findthat the escalating rate of all subjects is between 20 and 30except SUB1 Thus it should be a good choice to treat theescalating rate of PERCLOS value as an index for evaluatingdriver fatigue In this paper we judge the driver as fatiguedwhen his PERCLOS increment rate is more than 2

4 Fatigue Detection on Smart Device

Nowadays smartphones and tablet devices have quite pow-erful processing capacity and most of these devices arealso equipped with high resolution camera (front or rear)Such kinds of smart devices are installed with operatingsystem such as Android and iOS Especially for Androidoperating system it is a free and open system hence itis very convenient to do secondary development for usersFurthermore OpenCV is compatible with these Android-based devicesTherefore it is feasible to transplant our fatiguedetection system based onmachine vision into smart devicesAs a result we designed aDriving FatigueDetectionWarningSystem (DFDWS) on the Android smart device withmachinevision The composition of detection system is shown inFigure 10

This system is developed based on Android OS andOpenCV Library which are called OpenCV4AndroidOpenCV4Android is available as a java lib on the InternetIt is easy to install and configure OpenCV4AndroidSDK on the smart handheld device according to thetutorial OpenCV4Android provides an interface named

8 Journal of Sensors

10sim17

63sim70

10sim17 mean value63sim70 mean value

10sim17

63sim70

10sim17 mean value63sim70 mean value

10sim17

63sim70

10sim17 mean value63sim70 mean value

times10minus3

0 1 2 3 4 5 6 70

0005

001

0015

002

0025

PERC

LOS

Sub 1

0

001

002

003

004

005

006

007

PERC

LOS

Sub 5

0

002

004

006

008

01

012

014

016

PERC

LOS

Sub 3

0

1

2

3

4

5

6

7

8

PERC

LOS

Sub 2

0

0005

001

0015

Time (min) Time (min)Time (min)

Time (min) Time (min)Time (min)

PERC

LOS

Sub 4

0

005

01

015

02

025

03

035

PERC

LOS

Sub 6

0 1 2 3 4 5 6 7 0 1 2 3 4 5 6 7

0 1 2 3 4 5 6 7 0 1 2 3 4 5 6 7 0 1 2 3 4 5 6 7

Figure 9 PERCLOS values in the beginning (10sim17m) and the end (63sim70m) of driving experiment

Table 3 Mean values of PERCLOS in different stages in 6 subjects

PERCLOS SubjectsSub 1 Sub 2 Sub 3 Sub 4 Sub 5 Sub 6 Mean

10sim17m (at the beginning) 0001 0007 0016 0001 0002 0042 001263sim70m (at the end) 0011 0025 0059 0003 0006 0146 0042Difference 0010 0018 0043 0002 0004 0104 0030Escalating rate 10000 2571 2688 2000 2000 2476 3623

Journal of Sensors 9

Smartphone and tablet device

Driver Fatigue Detect Warning System

Fatigue detectionOpenCV lib

Camera Facialimages

Dialer

Speaker

Fatiguecharacter

Call the police

Warning

Imaging

processing

PERCLOS index

Duration status

InvokeInvoke and run Run Invoke

Figure 10 Detection system composition

(a) (b)

Figure 11 Detection result on tablet

CvCameraViewListener to catch camera frames and aclass CascadeClassifier to detect object In our applicationwe create several CascadeClassifier classes which areimplemented as different classifiers corresponding to faceopen-eye or closed-eye When a frame is captured themethod onCameraFrame will be invoked automaticallyIn the method frame image will be translated into classMat which is represented as images in OpenCV Then themethod detectMultiScalewill be invoked by CascadeClassifierto detect face feature

Thus any Android phone or tablet with a camera dialerand speaker could transplant this system DFDWS useOpenCV library to call the camera to collect facial imagesThe image frames in the memory constitute an image streamthen the proposed algorithm would deal with it After thatthe system starts to judge driverrsquos fatigue state according toPERCLOS or duration time of closed-eye state If the driverfalls into fatigue state then the system would turn on thespeaker to remind himher or dial the emergency center

Also detection result would be shown on the screen intimeThe interface of the proposed transplanting system anddetection result are shown in Figure 11

In this paper our system runs well on a Nexus7 tabletwhich is with 1 GHz CPU and 1GB RAM In the beginning ofthe experiment the systemwould collect a period of 10min tocalculate the PERCLOS index as a baseline for judging fatiguestate During this period drivers should adjust device andbody posture to make sure that the system could accuratelyidentify faces and eyesThe index of PERCLOS escalating ratecorresponding to baseline is used to detect fatigue in smarthandheld system When PERCLOS escalating rate increasedmore than 200 the system on Nexus7 tablet will judge thedriver as fatigued Table 4 shows the detection performanceonNexus7 It demonstrates the average consumption of kindsof classifiers and also presents the shortest and longest timeof processing a single frame Compared to the detectionperformance on the PC system (see Table 2) we can find thatthe consumption time for each step is extended for Nexus7tablet Because the main frequency of the tablet is muchlower than the PC (21 GHz) accordingly the longest and theshortest processing time for single frame are also extendedFor normal situation that is the driver being almost infront face the performance of DFDWS on Nexus7 tablet canachieve the speed of 14 framessecThat means mobile device

10 Journal of Sensors

Table 4 Average consumption of each step on tablet system (ms)

Running items Consuming timeFront face detection 39Left and right deflected face detection 49Open state detection 8Close state detection 4Other time consumptions 36The longest processing time of single frame 98The shortest processing time of single frame 70

can smoothly run this application But for the worst casesuch as the situation of side face the system performance willdecrease to 10 framessec

5 Conclusions

This paper provides a practical driving fatigue detectionsystem based on Adaboost algorithm We proposed a newstrategy to detect eye state instead of detecting eye directlyIn our detection strategy we first detect face efficientlyusing classifiers of both front face and deflected face Thencandidate region of eye is determined according to geomet-ric distribution of facial organs Finally trained classifiersof open eyes and closed eyes are used to detect eyes inthe candidate region quickly and accurately As a resultPERCLOS escalating rate could be calculated and used asthe index of fatigue When the PERCLOS escalating rateincreased more than 200 the driver could be consideredin fatigue state Moreover this paper implemented a DrivingFatigue Detection Warning System and transplanted it onNexus7 tablet The system makes decision of fatigued ornot according to PERCLOS and duration of closed-eyesstate Experiments demonstrated that the proposed systemhas a high accuracy Meanwhile the processing speed canreach 30 fps on PC and 14 fps on tablet which meets therequirement of real time

Of course this system could make further improvementon accuracy and speed of detection by using discrete cosinecoefficients [24] and covariance feature [25] respectively Inaddition this paper has dodged the conditions under poorillumination It should be perfected in the future research

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Authorsrsquo Contribution

WanzengKong andLingxiaoZhou contributed equally to thiswork

Acknowledgment

This work was supported by the Major International Coop-eration Project of Zhejiang Province (Grant no 2011C14017)

the National Natural Science Foundation of China (Grantno 61102028) the International Science amp Technology Coop-eration Program of China (Grant no 2014DFG12570) andZhejiang Provincial Natural Science Foundation of China(Grant no LY13F020033) The authors also should thank allsubjects who were involved in the driving experiment

References

[1] S R Paul NHTSArsquos Drowsy Driver Technology Program 2005[2] L Wang X Wu and M Yu ldquoReview of driver fatiguedrowsi-

ness detectionmethodsrdquo Journal of Biomedical Engineering vol24 no 1 pp 245ndash248 2007

[3] G Borghini L Astolfi G Vecchiato D Mattia and F BabilonildquoMeasuring neurophysiological signals in aircraft pilots andcar drivers for the assessment of mental workload fatigue anddrowsinessrdquo Neuroscience and Biobehavioral Reviews vol 44pp 58ndash75 2014

[4] T Myllyla V Korhonen E Vihriala et al ldquoHuman heart pulsewave responses measured simultaneously at several sensorplacements by twoMR-compatible fibre opticmethodsrdquo Journalof Sensors vol 2012 Article ID 769613 8 pages 2012

[5] M Simon E A Schmidt W E Kincses et al ldquoEEG alphaspindlemeasures as indicators of driver fatigue under real trafficconditionsrdquo Clinical Neurophysiology vol 122 no 6 pp 1168ndash1178 2011

[6] S K L Lal and A Craig ldquoReproducibility of the spectral com-ponents of the electroencephalogram during driver fatiguerdquoInternational Journal of Psychophysiology vol 55 no 2 pp 137ndash143 2005

[7] B T Jap S Lal P Fischer and E Bekiaris ldquoUsing EEG spectralcomponents to assess algorithms for detecting fatiguerdquo ExpertSystems with Applications vol 36 no 2 pp 2352ndash2359 2009

[8] J Krajewski D Sommer U Trutschel et al ldquoSteering wheelbehavior based estimation of fatiguerdquo in Proceedings of the 5thInternational Driving Symposium on Human Factors in DriverAssessment Training and Vehicle Design pp 118ndash124 2009

[9] Y Wu J Li D Yu and H Cheng ldquoResearch on quantitativemethod about driver reliabilityrdquo Journal of Software vol 6 no6 pp 1110ndash1116 2011

[10] A Eskandarian and A Mortazavi ldquoEvaluation of a smartalgorithm for commercial vehicle driver drowsiness detectionrdquoin Proceedings of the IEEE Intelligent Vehicles Symposium pp553ndash559 June 2007

[11] C X Huang W C Zhang C G Huang and Y J ZhongldquoIdentification of driver state based on ARX in the automobilesimulatorrdquo Technology amp Economy in Areas of Communicationsvol 10 no 2 pp 60ndash63 2008

[12] Z Zhang and J-S Zhang ldquoDriver fatigue detection based intel-ligent vehicle controlrdquo in Proceedings of the 18th InternationalConference on Pattern Recognition (ICPR rsquo06) vol 2 pp 1262ndash1265 IEEE Hong Kong August 2006

[13] Q Wang J Yang M Ren and Y Zheng ldquoDriver fatiguedetection a surveyrdquo in Proceedings of the 6th World Congresson Intelligent Control and Automation (WCICA 06) vol 2 pp8587ndash8591 Dalian China 2006

[14] Z Zhang and J Zhang ldquoA new real-time eye tracking for driverfatigue detectionrdquo in Proceedings of the IEEE 6th InternationalConference on ITS Telecommunications pp 8ndash11 June 2006

Journal of Sensors 11

[15] R Parsai and P Bajaj ldquoIntelligent monitoring system fordriverrsquos alertness (a vision based approach)rdquo in Knowledge-Based Intelligent Information and Engineering Systems vol 4692of Lecture Notes in Computer Science pp 471ndash477 SpringerBerlin Germany 2007

[16] M Eriksson and N P Papanikolopoulos ldquoEye-tracking fordetection of driver fatiguerdquo in Proceedings of the IEEE Confer-ence on Intelligent Transportation Systems (ITSC rsquo97) pp 314ndash319 1997

[17] S M Shi L S Jin R B Wang and B L Tong ldquoDriver mouthmonitoring method based on machine visionrdquo Journal of JiLinUniversity vol 34 no 2 pp 232ndash236 2004

[18] W Wang J Xi and H Chen ldquoModeling and recognizingdriver behavior based on driving data a surveyrdquoMathematicalProblems in Engineering vol 2014 Article ID 245641 20 pages2014

[19] Q Tan M Yang T Luo et al ldquoA novel interdigital capacitorpressure sensor based on LTCC technologyrdquo Journal of Sensorsvol 2014 Article ID 431503 6 pages 2014

[20] Y Freund and R E Schapire ldquoA desicion-theoretic general-ization of on-line learning and an application to boostingrdquo inComputational Learning Theory vol 904 of Lecture Notes inComputer Science pp 23ndash37 Springer Berlin Germany 1995

[21] P Viola and M J Jones ldquoRobust real-time object detectionrdquoInternational Journal of Computer Vision vol 57 no 2 pp 137ndash154 2004

[22] D F Dinges and R Grace PERCLOS A Valid Psychophysiolog-ical Mesure of Alertness As Assessed by Psychomotor VigilanceFederal Highway Administration Office of Motor Carriers1998

[23] W Kong Z Zhou L Zhou Y Dai G Borghini and FBabiloni ldquoEstimation for driver fatigue with phase lockingvaluerdquo International Journal of Bioelectromagnetism vol 14 no3 pp 115ndash120 2012

[24] J P Dukuzumuremyi B Zou C P Mukamakuza D Hanyur-wimfura and EMasabo ldquoDiscrete cosine coefficients as imagesfeatures for fire detection based on computer visionrdquo Journal ofComputers vol 9 no 2 pp 295ndash300 2014

[25] R Li and C F Li ldquoAdaboost face detection based on improvedcovariance featurerdquo Journal of Computers vol 9 no 5 pp 1077ndash1082 2014

International Journal of

AerospaceEngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Active and Passive Electronic Components

Control Scienceand Engineering

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

RotatingMachinery

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation httpwwwhindawicom

Journal ofEngineeringVolume 2014

Submit your manuscripts athttpwwwhindawicom

VLSI Design

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Shock and Vibration

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Civil EngineeringAdvances in

Acoustics and VibrationAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Advances inOptoElectronics

Hindawi Publishing Corporation httpwwwhindawicom

Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

SensorsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Chemical EngineeringInternational Journal of Antennas and

Propagation

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Navigation and Observation

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

DistributedSensor Networks

International Journal of

Page 6: Research Article A System of Driving Fatigue Detection ...Research Article A System of Driving Fatigue Detection Based on Machine Vision and Its Application on Smart Device WanzengKong,

6 Journal of Sensors

Figure 6 Results of face detection

Figure 7 Results of eyes location and state recognition

Journal of Sensors 7

Table 2 Comparison of time consumption in PC system (ms)

Methods Mean time consumption of several detection stepsFront face Deflected face Open eye Close eye Other Single Frame (longest) Single Frame (shortest)

Our method 16 18 5 4 12 73 33Traditional method mdash mdash 96 97 mdash 245 93

Figure 8 Comparison results with our method and the traditional method The frames of top row are detected by direct Adaboost trainingeye classifier while the frames of bottom row are detected by our improved method

method in spite of detecting eye with only one step directlyit costs more time to detect eyes The reason is that it needsto use eye classifier searching in the whole frame instead of asmall candidate region as our method

Figure 8 shows comparison of detection performancebetween our method and the traditional method Figure 8illustrates that our method can detect eyes precisely inthe restricted face region However the traditional methodemployed the eye classifier to detect eye in the whole pictureAs a result there are a lot of false detections Based on thecomparison experiment we can conclude that our methodoutperforms the traditional method in terms of detectionspeed and accuracy

33 Determining the Fatigue Index In order to judge thefatigue state automatically we should fix the parameter ofthreshold to determine alert or fatigue There is an obviouschange of PERCLOS values at the beginning and the endof driving experiment The subjectrsquos blink frequency andclosed-eye state duration will increase after subjects becomefatigued Figure 9 shows PERCLOS values of six subjects inthe periods of 10sim17 (the beginning) and 63sim70 minutes (theend) From Figure 9 we can conclude that PERCLOS valuesrise significantly at the end of experiment and the escalatingrate of mean values is more than 200 Additionally sixsubjectsrsquo PERCLOS values showed individual differencesSubjectsrsquo mean values such as SUB1 SUB2 SUB4 and SUB5are relatively low (lt0025) both in the beginning and in theend of experiment But for some subjects such as SUB3 andSUB6 the PERCLOS mean values are relatively higher thanother subjects For example the PERCLOS values of SUB3and SUB6 at the beginning and the end are 0016 0042 0059and 0146 respectively

Variation of PERCLOS mean values of each subject arepresented in Table 3 In this table the PERCLOS value showsindividual differences Although the PERCLOS values ofSUB3 and SUB6 are extremely large their escalating rateskeep quite stable as the other subjects From Table 3 we findthat the escalating rate of all subjects is between 20 and 30except SUB1 Thus it should be a good choice to treat theescalating rate of PERCLOS value as an index for evaluatingdriver fatigue In this paper we judge the driver as fatiguedwhen his PERCLOS increment rate is more than 2

4 Fatigue Detection on Smart Device

Nowadays smartphones and tablet devices have quite pow-erful processing capacity and most of these devices arealso equipped with high resolution camera (front or rear)Such kinds of smart devices are installed with operatingsystem such as Android and iOS Especially for Androidoperating system it is a free and open system hence itis very convenient to do secondary development for usersFurthermore OpenCV is compatible with these Android-based devicesTherefore it is feasible to transplant our fatiguedetection system based onmachine vision into smart devicesAs a result we designed aDriving FatigueDetectionWarningSystem (DFDWS) on the Android smart device withmachinevision The composition of detection system is shown inFigure 10

This system is developed based on Android OS andOpenCV Library which are called OpenCV4AndroidOpenCV4Android is available as a java lib on the InternetIt is easy to install and configure OpenCV4AndroidSDK on the smart handheld device according to thetutorial OpenCV4Android provides an interface named

8 Journal of Sensors

10sim17

63sim70

10sim17 mean value63sim70 mean value

10sim17

63sim70

10sim17 mean value63sim70 mean value

10sim17

63sim70

10sim17 mean value63sim70 mean value

times10minus3

0 1 2 3 4 5 6 70

0005

001

0015

002

0025

PERC

LOS

Sub 1

0

001

002

003

004

005

006

007

PERC

LOS

Sub 5

0

002

004

006

008

01

012

014

016

PERC

LOS

Sub 3

0

1

2

3

4

5

6

7

8

PERC

LOS

Sub 2

0

0005

001

0015

Time (min) Time (min)Time (min)

Time (min) Time (min)Time (min)

PERC

LOS

Sub 4

0

005

01

015

02

025

03

035

PERC

LOS

Sub 6

0 1 2 3 4 5 6 7 0 1 2 3 4 5 6 7

0 1 2 3 4 5 6 7 0 1 2 3 4 5 6 7 0 1 2 3 4 5 6 7

Figure 9 PERCLOS values in the beginning (10sim17m) and the end (63sim70m) of driving experiment

Table 3 Mean values of PERCLOS in different stages in 6 subjects

PERCLOS SubjectsSub 1 Sub 2 Sub 3 Sub 4 Sub 5 Sub 6 Mean

10sim17m (at the beginning) 0001 0007 0016 0001 0002 0042 001263sim70m (at the end) 0011 0025 0059 0003 0006 0146 0042Difference 0010 0018 0043 0002 0004 0104 0030Escalating rate 10000 2571 2688 2000 2000 2476 3623

Journal of Sensors 9

Smartphone and tablet device

Driver Fatigue Detect Warning System

Fatigue detectionOpenCV lib

Camera Facialimages

Dialer

Speaker

Fatiguecharacter

Call the police

Warning

Imaging

processing

PERCLOS index

Duration status

InvokeInvoke and run Run Invoke

Figure 10 Detection system composition

(a) (b)

Figure 11 Detection result on tablet

CvCameraViewListener to catch camera frames and aclass CascadeClassifier to detect object In our applicationwe create several CascadeClassifier classes which areimplemented as different classifiers corresponding to faceopen-eye or closed-eye When a frame is captured themethod onCameraFrame will be invoked automaticallyIn the method frame image will be translated into classMat which is represented as images in OpenCV Then themethod detectMultiScalewill be invoked by CascadeClassifierto detect face feature

Thus any Android phone or tablet with a camera dialerand speaker could transplant this system DFDWS useOpenCV library to call the camera to collect facial imagesThe image frames in the memory constitute an image streamthen the proposed algorithm would deal with it After thatthe system starts to judge driverrsquos fatigue state according toPERCLOS or duration time of closed-eye state If the driverfalls into fatigue state then the system would turn on thespeaker to remind himher or dial the emergency center

Also detection result would be shown on the screen intimeThe interface of the proposed transplanting system anddetection result are shown in Figure 11

In this paper our system runs well on a Nexus7 tabletwhich is with 1 GHz CPU and 1GB RAM In the beginning ofthe experiment the systemwould collect a period of 10min tocalculate the PERCLOS index as a baseline for judging fatiguestate During this period drivers should adjust device andbody posture to make sure that the system could accuratelyidentify faces and eyesThe index of PERCLOS escalating ratecorresponding to baseline is used to detect fatigue in smarthandheld system When PERCLOS escalating rate increasedmore than 200 the system on Nexus7 tablet will judge thedriver as fatigued Table 4 shows the detection performanceonNexus7 It demonstrates the average consumption of kindsof classifiers and also presents the shortest and longest timeof processing a single frame Compared to the detectionperformance on the PC system (see Table 2) we can find thatthe consumption time for each step is extended for Nexus7tablet Because the main frequency of the tablet is muchlower than the PC (21 GHz) accordingly the longest and theshortest processing time for single frame are also extendedFor normal situation that is the driver being almost infront face the performance of DFDWS on Nexus7 tablet canachieve the speed of 14 framessecThat means mobile device

10 Journal of Sensors

Table 4 Average consumption of each step on tablet system (ms)

Running items Consuming timeFront face detection 39Left and right deflected face detection 49Open state detection 8Close state detection 4Other time consumptions 36The longest processing time of single frame 98The shortest processing time of single frame 70

can smoothly run this application But for the worst casesuch as the situation of side face the system performance willdecrease to 10 framessec

5 Conclusions

This paper provides a practical driving fatigue detectionsystem based on Adaboost algorithm We proposed a newstrategy to detect eye state instead of detecting eye directlyIn our detection strategy we first detect face efficientlyusing classifiers of both front face and deflected face Thencandidate region of eye is determined according to geomet-ric distribution of facial organs Finally trained classifiersof open eyes and closed eyes are used to detect eyes inthe candidate region quickly and accurately As a resultPERCLOS escalating rate could be calculated and used asthe index of fatigue When the PERCLOS escalating rateincreased more than 200 the driver could be consideredin fatigue state Moreover this paper implemented a DrivingFatigue Detection Warning System and transplanted it onNexus7 tablet The system makes decision of fatigued ornot according to PERCLOS and duration of closed-eyesstate Experiments demonstrated that the proposed systemhas a high accuracy Meanwhile the processing speed canreach 30 fps on PC and 14 fps on tablet which meets therequirement of real time

Of course this system could make further improvementon accuracy and speed of detection by using discrete cosinecoefficients [24] and covariance feature [25] respectively Inaddition this paper has dodged the conditions under poorillumination It should be perfected in the future research

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Authorsrsquo Contribution

WanzengKong andLingxiaoZhou contributed equally to thiswork

Acknowledgment

This work was supported by the Major International Coop-eration Project of Zhejiang Province (Grant no 2011C14017)

the National Natural Science Foundation of China (Grantno 61102028) the International Science amp Technology Coop-eration Program of China (Grant no 2014DFG12570) andZhejiang Provincial Natural Science Foundation of China(Grant no LY13F020033) The authors also should thank allsubjects who were involved in the driving experiment

References

[1] S. R. Paul, NHTSA's Drowsy Driver Technology Program, 2005.
[2] L. Wang, X. Wu, and M. Yu, "Review of driver fatigue/drowsiness detection methods," Journal of Biomedical Engineering, vol. 24, no. 1, pp. 245–248, 2007.
[3] G. Borghini, L. Astolfi, G. Vecchiato, D. Mattia, and F. Babiloni, "Measuring neurophysiological signals in aircraft pilots and car drivers for the assessment of mental workload, fatigue and drowsiness," Neuroscience and Biobehavioral Reviews, vol. 44, pp. 58–75, 2014.
[4] T. Myllyla, V. Korhonen, E. Vihriala, et al., "Human heart pulse wave responses measured simultaneously at several sensor placements by two MR-compatible fibre optic methods," Journal of Sensors, vol. 2012, Article ID 769613, 8 pages, 2012.
[5] M. Simon, E. A. Schmidt, W. E. Kincses, et al., "EEG alpha spindle measures as indicators of driver fatigue under real traffic conditions," Clinical Neurophysiology, vol. 122, no. 6, pp. 1168–1178, 2011.
[6] S. K. L. Lal and A. Craig, "Reproducibility of the spectral components of the electroencephalogram during driver fatigue," International Journal of Psychophysiology, vol. 55, no. 2, pp. 137–143, 2005.
[7] B. T. Jap, S. Lal, P. Fischer, and E. Bekiaris, "Using EEG spectral components to assess algorithms for detecting fatigue," Expert Systems with Applications, vol. 36, no. 2, pp. 2352–2359, 2009.
[8] J. Krajewski, D. Sommer, U. Trutschel, et al., "Steering wheel behavior based estimation of fatigue," in Proceedings of the 5th International Driving Symposium on Human Factors in Driver Assessment, Training and Vehicle Design, pp. 118–124, 2009.
[9] Y. Wu, J. Li, D. Yu, and H. Cheng, "Research on quantitative method about driver reliability," Journal of Software, vol. 6, no. 6, pp. 1110–1116, 2011.
[10] A. Eskandarian and A. Mortazavi, "Evaluation of a smart algorithm for commercial vehicle driver drowsiness detection," in Proceedings of the IEEE Intelligent Vehicles Symposium, pp. 553–559, June 2007.
[11] C. X. Huang, W. C. Zhang, C. G. Huang, and Y. J. Zhong, "Identification of driver state based on ARX in the automobile simulator," Technology & Economy in Areas of Communications, vol. 10, no. 2, pp. 60–63, 2008.
[12] Z. Zhang and J.-S. Zhang, "Driver fatigue detection based intelligent vehicle control," in Proceedings of the 18th International Conference on Pattern Recognition (ICPR '06), vol. 2, pp. 1262–1265, IEEE, Hong Kong, August 2006.
[13] Q. Wang, J. Yang, M. Ren, and Y. Zheng, "Driver fatigue detection: a survey," in Proceedings of the 6th World Congress on Intelligent Control and Automation (WCICA '06), vol. 2, pp. 8587–8591, Dalian, China, 2006.
[14] Z. Zhang and J. Zhang, "A new real-time eye tracking for driver fatigue detection," in Proceedings of the IEEE 6th International Conference on ITS Telecommunications, pp. 8–11, June 2006.
[15] R. Parsai and P. Bajaj, "Intelligent monitoring system for driver's alertness (a vision based approach)," in Knowledge-Based Intelligent Information and Engineering Systems, vol. 4692 of Lecture Notes in Computer Science, pp. 471–477, Springer, Berlin, Germany, 2007.
[16] M. Eriksson and N. P. Papanikolopoulos, "Eye-tracking for detection of driver fatigue," in Proceedings of the IEEE Conference on Intelligent Transportation Systems (ITSC '97), pp. 314–319, 1997.
[17] S. M. Shi, L. S. Jin, R. B. Wang, and B. L. Tong, "Driver mouth monitoring method based on machine vision," Journal of Jilin University, vol. 34, no. 2, pp. 232–236, 2004.
[18] W. Wang, J. Xi, and H. Chen, "Modeling and recognizing driver behavior based on driving data: a survey," Mathematical Problems in Engineering, vol. 2014, Article ID 245641, 20 pages, 2014.
[19] Q. Tan, M. Yang, T. Luo, et al., "A novel interdigital capacitor pressure sensor based on LTCC technology," Journal of Sensors, vol. 2014, Article ID 431503, 6 pages, 2014.
[20] Y. Freund and R. E. Schapire, "A decision-theoretic generalization of on-line learning and an application to boosting," in Computational Learning Theory, vol. 904 of Lecture Notes in Computer Science, pp. 23–37, Springer, Berlin, Germany, 1995.
[21] P. Viola and M. J. Jones, "Robust real-time object detection," International Journal of Computer Vision, vol. 57, no. 2, pp. 137–154, 2004.
[22] D. F. Dinges and R. Grace, PERCLOS: A Valid Psychophysiological Measure of Alertness as Assessed by Psychomotor Vigilance, Federal Highway Administration, Office of Motor Carriers, 1998.
[23] W. Kong, Z. Zhou, L. Zhou, Y. Dai, G. Borghini, and F. Babiloni, "Estimation for driver fatigue with phase locking value," International Journal of Bioelectromagnetism, vol. 14, no. 3, pp. 115–120, 2012.
[24] J. P. Dukuzumuremyi, B. Zou, C. P. Mukamakuza, D. Hanyurwimfura, and E. Masabo, "Discrete cosine coefficients as images features for fire detection based on computer vision," Journal of Computers, vol. 9, no. 2, pp. 295–300, 2014.
[25] R. Li and C. F. Li, "Adaboost face detection based on improved covariance feature," Journal of Computers, vol. 9, no. 5, pp. 1077–1082, 2014.



In this paper our system runs well on a Nexus7 tabletwhich is with 1 GHz CPU and 1GB RAM In the beginning ofthe experiment the systemwould collect a period of 10min tocalculate the PERCLOS index as a baseline for judging fatiguestate During this period drivers should adjust device andbody posture to make sure that the system could accuratelyidentify faces and eyesThe index of PERCLOS escalating ratecorresponding to baseline is used to detect fatigue in smarthandheld system When PERCLOS escalating rate increasedmore than 200 the system on Nexus7 tablet will judge thedriver as fatigued Table 4 shows the detection performanceonNexus7 It demonstrates the average consumption of kindsof classifiers and also presents the shortest and longest timeof processing a single frame Compared to the detectionperformance on the PC system (see Table 2) we can find thatthe consumption time for each step is extended for Nexus7tablet Because the main frequency of the tablet is muchlower than the PC (21 GHz) accordingly the longest and theshortest processing time for single frame are also extendedFor normal situation that is the driver being almost infront face the performance of DFDWS on Nexus7 tablet canachieve the speed of 14 framessecThat means mobile device

10 Journal of Sensors

Table 4 Average consumption of each step on tablet system (ms)

Running items Consuming timeFront face detection 39Left and right deflected face detection 49Open state detection 8Close state detection 4Other time consumptions 36The longest processing time of single frame 98The shortest processing time of single frame 70

can smoothly run this application But for the worst casesuch as the situation of side face the system performance willdecrease to 10 framessec

5 Conclusions

This paper provides a practical driving fatigue detectionsystem based on Adaboost algorithm We proposed a newstrategy to detect eye state instead of detecting eye directlyIn our detection strategy we first detect face efficientlyusing classifiers of both front face and deflected face Thencandidate region of eye is determined according to geomet-ric distribution of facial organs Finally trained classifiersof open eyes and closed eyes are used to detect eyes inthe candidate region quickly and accurately As a resultPERCLOS escalating rate could be calculated and used asthe index of fatigue When the PERCLOS escalating rateincreased more than 200 the driver could be consideredin fatigue state Moreover this paper implemented a DrivingFatigue Detection Warning System and transplanted it onNexus7 tablet The system makes decision of fatigued ornot according to PERCLOS and duration of closed-eyesstate Experiments demonstrated that the proposed systemhas a high accuracy Meanwhile the processing speed canreach 30 fps on PC and 14 fps on tablet which meets therequirement of real time

Of course this system could make further improvementon accuracy and speed of detection by using discrete cosinecoefficients [24] and covariance feature [25] respectively Inaddition this paper has dodged the conditions under poorillumination It should be perfected in the future research

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Authorsrsquo Contribution

WanzengKong andLingxiaoZhou contributed equally to thiswork

Acknowledgment

This work was supported by the Major International Coop-eration Project of Zhejiang Province (Grant no 2011C14017)

the National Natural Science Foundation of China (Grantno 61102028) the International Science amp Technology Coop-eration Program of China (Grant no 2014DFG12570) andZhejiang Provincial Natural Science Foundation of China(Grant no LY13F020033) The authors also should thank allsubjects who were involved in the driving experiment

References

[1] S R Paul NHTSArsquos Drowsy Driver Technology Program 2005[2] L Wang X Wu and M Yu ldquoReview of driver fatiguedrowsi-

ness detectionmethodsrdquo Journal of Biomedical Engineering vol24 no 1 pp 245ndash248 2007

[3] G Borghini L Astolfi G Vecchiato D Mattia and F BabilonildquoMeasuring neurophysiological signals in aircraft pilots andcar drivers for the assessment of mental workload fatigue anddrowsinessrdquo Neuroscience and Biobehavioral Reviews vol 44pp 58ndash75 2014

[4] T Myllyla V Korhonen E Vihriala et al ldquoHuman heart pulsewave responses measured simultaneously at several sensorplacements by twoMR-compatible fibre opticmethodsrdquo Journalof Sensors vol 2012 Article ID 769613 8 pages 2012

[5] M Simon E A Schmidt W E Kincses et al ldquoEEG alphaspindlemeasures as indicators of driver fatigue under real trafficconditionsrdquo Clinical Neurophysiology vol 122 no 6 pp 1168ndash1178 2011

[6] S K L Lal and A Craig ldquoReproducibility of the spectral com-ponents of the electroencephalogram during driver fatiguerdquoInternational Journal of Psychophysiology vol 55 no 2 pp 137ndash143 2005

[7] B T Jap S Lal P Fischer and E Bekiaris ldquoUsing EEG spectralcomponents to assess algorithms for detecting fatiguerdquo ExpertSystems with Applications vol 36 no 2 pp 2352ndash2359 2009

[8] J Krajewski D Sommer U Trutschel et al ldquoSteering wheelbehavior based estimation of fatiguerdquo in Proceedings of the 5thInternational Driving Symposium on Human Factors in DriverAssessment Training and Vehicle Design pp 118ndash124 2009

[9] Y Wu J Li D Yu and H Cheng ldquoResearch on quantitativemethod about driver reliabilityrdquo Journal of Software vol 6 no6 pp 1110ndash1116 2011

[10] A Eskandarian and A Mortazavi ldquoEvaluation of a smartalgorithm for commercial vehicle driver drowsiness detectionrdquoin Proceedings of the IEEE Intelligent Vehicles Symposium pp553ndash559 June 2007

[11] C X Huang W C Zhang C G Huang and Y J ZhongldquoIdentification of driver state based on ARX in the automobilesimulatorrdquo Technology amp Economy in Areas of Communicationsvol 10 no 2 pp 60ndash63 2008

[12] Z Zhang and J-S Zhang ldquoDriver fatigue detection based intel-ligent vehicle controlrdquo in Proceedings of the 18th InternationalConference on Pattern Recognition (ICPR rsquo06) vol 2 pp 1262ndash1265 IEEE Hong Kong August 2006

[13] Q Wang J Yang M Ren and Y Zheng ldquoDriver fatiguedetection a surveyrdquo in Proceedings of the 6th World Congresson Intelligent Control and Automation (WCICA 06) vol 2 pp8587ndash8591 Dalian China 2006

[14] Z Zhang and J Zhang ldquoA new real-time eye tracking for driverfatigue detectionrdquo in Proceedings of the IEEE 6th InternationalConference on ITS Telecommunications pp 8ndash11 June 2006

Journal of Sensors 11

[15] R Parsai and P Bajaj ldquoIntelligent monitoring system fordriverrsquos alertness (a vision based approach)rdquo in Knowledge-Based Intelligent Information and Engineering Systems vol 4692of Lecture Notes in Computer Science pp 471ndash477 SpringerBerlin Germany 2007

[16] M Eriksson and N P Papanikolopoulos ldquoEye-tracking fordetection of driver fatiguerdquo in Proceedings of the IEEE Confer-ence on Intelligent Transportation Systems (ITSC rsquo97) pp 314ndash319 1997

[17] S M Shi L S Jin R B Wang and B L Tong ldquoDriver mouthmonitoring method based on machine visionrdquo Journal of JiLinUniversity vol 34 no 2 pp 232ndash236 2004

[18] W Wang J Xi and H Chen ldquoModeling and recognizingdriver behavior based on driving data a surveyrdquoMathematicalProblems in Engineering vol 2014 Article ID 245641 20 pages2014

[19] Q Tan M Yang T Luo et al ldquoA novel interdigital capacitorpressure sensor based on LTCC technologyrdquo Journal of Sensorsvol 2014 Article ID 431503 6 pages 2014

[20] Y Freund and R E Schapire ldquoA desicion-theoretic general-ization of on-line learning and an application to boostingrdquo inComputational Learning Theory vol 904 of Lecture Notes inComputer Science pp 23ndash37 Springer Berlin Germany 1995

[21] P Viola and M J Jones ldquoRobust real-time object detectionrdquoInternational Journal of Computer Vision vol 57 no 2 pp 137ndash154 2004

[22] D F Dinges and R Grace PERCLOS A Valid Psychophysiolog-ical Mesure of Alertness As Assessed by Psychomotor VigilanceFederal Highway Administration Office of Motor Carriers1998

[23] W Kong Z Zhou L Zhou Y Dai G Borghini and FBabiloni ldquoEstimation for driver fatigue with phase lockingvaluerdquo International Journal of Bioelectromagnetism vol 14 no3 pp 115ndash120 2012

[24] J P Dukuzumuremyi B Zou C P Mukamakuza D Hanyur-wimfura and EMasabo ldquoDiscrete cosine coefficients as imagesfeatures for fire detection based on computer visionrdquo Journal ofComputers vol 9 no 2 pp 295ndash300 2014

[25] R Li and C F Li ldquoAdaboost face detection based on improvedcovariance featurerdquo Journal of Computers vol 9 no 5 pp 1077ndash1082 2014

International Journal of

AerospaceEngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Active and Passive Electronic Components

Control Scienceand Engineering

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

RotatingMachinery

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation httpwwwhindawicom

Journal ofEngineeringVolume 2014

Submit your manuscripts athttpwwwhindawicom

VLSI Design

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Shock and Vibration

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Civil EngineeringAdvances in

Acoustics and VibrationAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Advances inOptoElectronics

Hindawi Publishing Corporation httpwwwhindawicom

Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

SensorsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Chemical EngineeringInternational Journal of Antennas and

Propagation

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Navigation and Observation

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

DistributedSensor Networks

International Journal of

Page 8: Research Article A System of Driving Fatigue Detection ...Research Article A System of Driving Fatigue Detection Based on Machine Vision and Its Application on Smart Device WanzengKong,

8 Journal of Sensors

10sim17

63sim70

10sim17 mean value63sim70 mean value

10sim17

63sim70

10sim17 mean value63sim70 mean value

10sim17

63sim70

10sim17 mean value63sim70 mean value

times10minus3

0 1 2 3 4 5 6 70

0005

001

0015

002

0025

PERC

LOS

Sub 1

0

001

002

003

004

005

006

007

PERC

LOS

Sub 5

0

002

004

006

008

01

012

014

016

PERC

LOS

Sub 3

0

1

2

3

4

5

6

7

8

PERC

LOS

Sub 2

0

0005

001

0015

Time (min) Time (min)Time (min)

Time (min) Time (min)Time (min)

PERC

LOS

Sub 4

0

005

01

015

02

025

03

035

PERC

LOS

Sub 6

0 1 2 3 4 5 6 7 0 1 2 3 4 5 6 7

0 1 2 3 4 5 6 7 0 1 2 3 4 5 6 7 0 1 2 3 4 5 6 7

Figure 9 PERCLOS values in the beginning (10sim17m) and the end (63sim70m) of driving experiment

Table 3 Mean values of PERCLOS in different stages in 6 subjects

PERCLOS SubjectsSub 1 Sub 2 Sub 3 Sub 4 Sub 5 Sub 6 Mean

10sim17m (at the beginning) 0001 0007 0016 0001 0002 0042 001263sim70m (at the end) 0011 0025 0059 0003 0006 0146 0042Difference 0010 0018 0043 0002 0004 0104 0030Escalating rate 10000 2571 2688 2000 2000 2476 3623

Journal of Sensors 9

Smartphone and tablet device

Driver Fatigue Detect Warning System

Fatigue detectionOpenCV lib

Camera Facialimages

Dialer

Speaker

Fatiguecharacter

Call the police

Warning

Imaging

processing

PERCLOS index

Duration status

InvokeInvoke and run Run Invoke

Figure 10 Detection system composition

(a) (b)

Figure 11 Detection result on tablet

CvCameraViewListener to catch camera frames and aclass CascadeClassifier to detect object In our applicationwe create several CascadeClassifier classes which areimplemented as different classifiers corresponding to faceopen-eye or closed-eye When a frame is captured themethod onCameraFrame will be invoked automaticallyIn the method frame image will be translated into classMat which is represented as images in OpenCV Then themethod detectMultiScalewill be invoked by CascadeClassifierto detect face feature

Thus any Android phone or tablet with a camera dialerand speaker could transplant this system DFDWS useOpenCV library to call the camera to collect facial imagesThe image frames in the memory constitute an image streamthen the proposed algorithm would deal with it After thatthe system starts to judge driverrsquos fatigue state according toPERCLOS or duration time of closed-eye state If the driverfalls into fatigue state then the system would turn on thespeaker to remind himher or dial the emergency center

Also detection result would be shown on the screen intimeThe interface of the proposed transplanting system anddetection result are shown in Figure 11

In this paper our system runs well on a Nexus7 tabletwhich is with 1 GHz CPU and 1GB RAM In the beginning ofthe experiment the systemwould collect a period of 10min tocalculate the PERCLOS index as a baseline for judging fatiguestate During this period drivers should adjust device andbody posture to make sure that the system could accuratelyidentify faces and eyesThe index of PERCLOS escalating ratecorresponding to baseline is used to detect fatigue in smarthandheld system When PERCLOS escalating rate increasedmore than 200 the system on Nexus7 tablet will judge thedriver as fatigued Table 4 shows the detection performanceonNexus7 It demonstrates the average consumption of kindsof classifiers and also presents the shortest and longest timeof processing a single frame Compared to the detectionperformance on the PC system (see Table 2) we can find thatthe consumption time for each step is extended for Nexus7tablet Because the main frequency of the tablet is muchlower than the PC (21 GHz) accordingly the longest and theshortest processing time for single frame are also extendedFor normal situation that is the driver being almost infront face the performance of DFDWS on Nexus7 tablet canachieve the speed of 14 framessecThat means mobile device

10 Journal of Sensors

Table 4 Average consumption of each step on tablet system (ms)

Running items Consuming timeFront face detection 39Left and right deflected face detection 49Open state detection 8Close state detection 4Other time consumptions 36The longest processing time of single frame 98The shortest processing time of single frame 70

can smoothly run this application But for the worst casesuch as the situation of side face the system performance willdecrease to 10 framessec

5 Conclusions

This paper provides a practical driving fatigue detectionsystem based on Adaboost algorithm We proposed a newstrategy to detect eye state instead of detecting eye directlyIn our detection strategy we first detect face efficientlyusing classifiers of both front face and deflected face Thencandidate region of eye is determined according to geomet-ric distribution of facial organs Finally trained classifiersof open eyes and closed eyes are used to detect eyes inthe candidate region quickly and accurately As a resultPERCLOS escalating rate could be calculated and used asthe index of fatigue When the PERCLOS escalating rateincreased more than 200 the driver could be consideredin fatigue state Moreover this paper implemented a DrivingFatigue Detection Warning System and transplanted it onNexus7 tablet The system makes decision of fatigued ornot according to PERCLOS and duration of closed-eyesstate Experiments demonstrated that the proposed systemhas a high accuracy Meanwhile the processing speed canreach 30 fps on PC and 14 fps on tablet which meets therequirement of real time

Of course this system could make further improvementon accuracy and speed of detection by using discrete cosinecoefficients [24] and covariance feature [25] respectively Inaddition this paper has dodged the conditions under poorillumination It should be perfected in the future research

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Authorsrsquo Contribution

WanzengKong andLingxiaoZhou contributed equally to thiswork

Acknowledgment

This work was supported by the Major International Coop-eration Project of Zhejiang Province (Grant no 2011C14017)

the National Natural Science Foundation of China (Grantno 61102028) the International Science amp Technology Coop-eration Program of China (Grant no 2014DFG12570) andZhejiang Provincial Natural Science Foundation of China(Grant no LY13F020033) The authors also should thank allsubjects who were involved in the driving experiment

References

[1] S R Paul NHTSArsquos Drowsy Driver Technology Program 2005[2] L Wang X Wu and M Yu ldquoReview of driver fatiguedrowsi-

ness detectionmethodsrdquo Journal of Biomedical Engineering vol24 no 1 pp 245ndash248 2007

[3] G Borghini L Astolfi G Vecchiato D Mattia and F BabilonildquoMeasuring neurophysiological signals in aircraft pilots andcar drivers for the assessment of mental workload fatigue anddrowsinessrdquo Neuroscience and Biobehavioral Reviews vol 44pp 58ndash75 2014

[4] T Myllyla V Korhonen E Vihriala et al ldquoHuman heart pulsewave responses measured simultaneously at several sensorplacements by twoMR-compatible fibre opticmethodsrdquo Journalof Sensors vol 2012 Article ID 769613 8 pages 2012

[5] M Simon E A Schmidt W E Kincses et al ldquoEEG alphaspindlemeasures as indicators of driver fatigue under real trafficconditionsrdquo Clinical Neurophysiology vol 122 no 6 pp 1168ndash1178 2011

[6] S K L Lal and A Craig ldquoReproducibility of the spectral com-ponents of the electroencephalogram during driver fatiguerdquoInternational Journal of Psychophysiology vol 55 no 2 pp 137ndash143 2005

[7] B T Jap S Lal P Fischer and E Bekiaris ldquoUsing EEG spectralcomponents to assess algorithms for detecting fatiguerdquo ExpertSystems with Applications vol 36 no 2 pp 2352ndash2359 2009

[8] J Krajewski D Sommer U Trutschel et al ldquoSteering wheelbehavior based estimation of fatiguerdquo in Proceedings of the 5thInternational Driving Symposium on Human Factors in DriverAssessment Training and Vehicle Design pp 118ndash124 2009

[9] Y Wu J Li D Yu and H Cheng ldquoResearch on quantitativemethod about driver reliabilityrdquo Journal of Software vol 6 no6 pp 1110ndash1116 2011

[10] A Eskandarian and A Mortazavi ldquoEvaluation of a smartalgorithm for commercial vehicle driver drowsiness detectionrdquoin Proceedings of the IEEE Intelligent Vehicles Symposium pp553ndash559 June 2007

[11] C X Huang W C Zhang C G Huang and Y J ZhongldquoIdentification of driver state based on ARX in the automobilesimulatorrdquo Technology amp Economy in Areas of Communicationsvol 10 no 2 pp 60ndash63 2008

[12] Z Zhang and J-S Zhang ldquoDriver fatigue detection based intel-ligent vehicle controlrdquo in Proceedings of the 18th InternationalConference on Pattern Recognition (ICPR rsquo06) vol 2 pp 1262ndash1265 IEEE Hong Kong August 2006

[13] Q Wang J Yang M Ren and Y Zheng ldquoDriver fatiguedetection a surveyrdquo in Proceedings of the 6th World Congresson Intelligent Control and Automation (WCICA 06) vol 2 pp8587ndash8591 Dalian China 2006

[14] Z Zhang and J Zhang ldquoA new real-time eye tracking for driverfatigue detectionrdquo in Proceedings of the IEEE 6th InternationalConference on ITS Telecommunications pp 8ndash11 June 2006

Journal of Sensors 11

[15] R Parsai and P Bajaj ldquoIntelligent monitoring system fordriverrsquos alertness (a vision based approach)rdquo in Knowledge-Based Intelligent Information and Engineering Systems vol 4692of Lecture Notes in Computer Science pp 471ndash477 SpringerBerlin Germany 2007

[16] M Eriksson and N P Papanikolopoulos ldquoEye-tracking fordetection of driver fatiguerdquo in Proceedings of the IEEE Confer-ence on Intelligent Transportation Systems (ITSC rsquo97) pp 314ndash319 1997

[17] S M Shi L S Jin R B Wang and B L Tong ldquoDriver mouthmonitoring method based on machine visionrdquo Journal of JiLinUniversity vol 34 no 2 pp 232ndash236 2004

[18] W Wang J Xi and H Chen ldquoModeling and recognizingdriver behavior based on driving data a surveyrdquoMathematicalProblems in Engineering vol 2014 Article ID 245641 20 pages2014

[19] Q Tan M Yang T Luo et al ldquoA novel interdigital capacitorpressure sensor based on LTCC technologyrdquo Journal of Sensorsvol 2014 Article ID 431503 6 pages 2014

[20] Y Freund and R E Schapire ldquoA desicion-theoretic general-ization of on-line learning and an application to boostingrdquo inComputational Learning Theory vol 904 of Lecture Notes inComputer Science pp 23ndash37 Springer Berlin Germany 1995

[21] P Viola and M J Jones ldquoRobust real-time object detectionrdquoInternational Journal of Computer Vision vol 57 no 2 pp 137ndash154 2004

[22] D F Dinges and R Grace PERCLOS A Valid Psychophysiolog-ical Mesure of Alertness As Assessed by Psychomotor VigilanceFederal Highway Administration Office of Motor Carriers1998

[23] W Kong Z Zhou L Zhou Y Dai G Borghini and FBabiloni ldquoEstimation for driver fatigue with phase lockingvaluerdquo International Journal of Bioelectromagnetism vol 14 no3 pp 115ndash120 2012

[24] J P Dukuzumuremyi B Zou C P Mukamakuza D Hanyur-wimfura and EMasabo ldquoDiscrete cosine coefficients as imagesfeatures for fire detection based on computer visionrdquo Journal ofComputers vol 9 no 2 pp 295ndash300 2014

[25] R Li and C F Li ldquoAdaboost face detection based on improvedcovariance featurerdquo Journal of Computers vol 9 no 5 pp 1077ndash1082 2014

International Journal of

AerospaceEngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Active and Passive Electronic Components

Control Scienceand Engineering

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

RotatingMachinery

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation httpwwwhindawicom

Journal ofEngineeringVolume 2014

Submit your manuscripts athttpwwwhindawicom

VLSI Design

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Shock and Vibration

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Civil EngineeringAdvances in

Acoustics and VibrationAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Advances inOptoElectronics

Hindawi Publishing Corporation httpwwwhindawicom

Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

SensorsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Chemical EngineeringInternational Journal of Antennas and

Propagation

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Navigation and Observation

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

DistributedSensor Networks

International Journal of

Page 9: Research Article A System of Driving Fatigue Detection ...Research Article A System of Driving Fatigue Detection Based on Machine Vision and Its Application on Smart Device WanzengKong,

Journal of Sensors 9

Smartphone and tablet device

Driver Fatigue Detect Warning System

Fatigue detectionOpenCV lib

Camera Facialimages

Dialer

Speaker

Fatiguecharacter

Call the police

Warning

Imaging

processing

PERCLOS index

Duration status

InvokeInvoke and run Run Invoke

Figure 10 Detection system composition

(a) (b)

Figure 11 Detection result on tablet

CvCameraViewListener to catch camera frames and aclass CascadeClassifier to detect object In our applicationwe create several CascadeClassifier classes which areimplemented as different classifiers corresponding to faceopen-eye or closed-eye When a frame is captured themethod onCameraFrame will be invoked automaticallyIn the method frame image will be translated into classMat which is represented as images in OpenCV Then themethod detectMultiScalewill be invoked by CascadeClassifierto detect face feature

Thus any Android phone or tablet with a camera dialerand speaker could transplant this system DFDWS useOpenCV library to call the camera to collect facial imagesThe image frames in the memory constitute an image streamthen the proposed algorithm would deal with it After thatthe system starts to judge driverrsquos fatigue state according toPERCLOS or duration time of closed-eye state If the driverfalls into fatigue state then the system would turn on thespeaker to remind himher or dial the emergency center

Also detection result would be shown on the screen intimeThe interface of the proposed transplanting system anddetection result are shown in Figure 11

In this paper our system runs well on a Nexus7 tabletwhich is with 1 GHz CPU and 1GB RAM In the beginning ofthe experiment the systemwould collect a period of 10min tocalculate the PERCLOS index as a baseline for judging fatiguestate During this period drivers should adjust device andbody posture to make sure that the system could accuratelyidentify faces and eyesThe index of PERCLOS escalating ratecorresponding to baseline is used to detect fatigue in smarthandheld system When PERCLOS escalating rate increasedmore than 200 the system on Nexus7 tablet will judge thedriver as fatigued Table 4 shows the detection performanceonNexus7 It demonstrates the average consumption of kindsof classifiers and also presents the shortest and longest timeof processing a single frame Compared to the detectionperformance on the PC system (see Table 2) we can find thatthe consumption time for each step is extended for Nexus7tablet Because the main frequency of the tablet is muchlower than the PC (21 GHz) accordingly the longest and theshortest processing time for single frame are also extendedFor normal situation that is the driver being almost infront face the performance of DFDWS on Nexus7 tablet canachieve the speed of 14 framessecThat means mobile device

10 Journal of Sensors

Table 4 Average consumption of each step on tablet system (ms)

Running items Consuming timeFront face detection 39Left and right deflected face detection 49Open state detection 8Close state detection 4Other time consumptions 36The longest processing time of single frame 98The shortest processing time of single frame 70

can smoothly run this application But for the worst casesuch as the situation of side face the system performance willdecrease to 10 framessec

5 Conclusions

This paper provides a practical driving fatigue detectionsystem based on Adaboost algorithm We proposed a newstrategy to detect eye state instead of detecting eye directlyIn our detection strategy we first detect face efficientlyusing classifiers of both front face and deflected face Thencandidate region of eye is determined according to geomet-ric distribution of facial organs Finally trained classifiersof open eyes and closed eyes are used to detect eyes inthe candidate region quickly and accurately As a resultPERCLOS escalating rate could be calculated and used asthe index of fatigue When the PERCLOS escalating rateincreased more than 200 the driver could be consideredin fatigue state Moreover this paper implemented a DrivingFatigue Detection Warning System and transplanted it onNexus7 tablet The system makes decision of fatigued ornot according to PERCLOS and duration of closed-eyesstate Experiments demonstrated that the proposed systemhas a high accuracy Meanwhile the processing speed canreach 30 fps on PC and 14 fps on tablet which meets therequirement of real time

Of course this system could make further improvementon accuracy and speed of detection by using discrete cosinecoefficients [24] and covariance feature [25] respectively Inaddition this paper has dodged the conditions under poorillumination It should be perfected in the future research

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Authorsrsquo Contribution

WanzengKong andLingxiaoZhou contributed equally to thiswork

Acknowledgment

This work was supported by the Major International Coop-eration Project of Zhejiang Province (Grant no 2011C14017)

the National Natural Science Foundation of China (Grantno 61102028) the International Science amp Technology Coop-eration Program of China (Grant no 2014DFG12570) andZhejiang Provincial Natural Science Foundation of China(Grant no LY13F020033) The authors also should thank allsubjects who were involved in the driving experiment

References

[1] S R Paul NHTSArsquos Drowsy Driver Technology Program 2005[2] L Wang X Wu and M Yu ldquoReview of driver fatiguedrowsi-

ness detectionmethodsrdquo Journal of Biomedical Engineering vol24 no 1 pp 245ndash248 2007

[3] G Borghini L Astolfi G Vecchiato D Mattia and F BabilonildquoMeasuring neurophysiological signals in aircraft pilots andcar drivers for the assessment of mental workload fatigue anddrowsinessrdquo Neuroscience and Biobehavioral Reviews vol 44pp 58ndash75 2014

[4] T Myllyla V Korhonen E Vihriala et al ldquoHuman heart pulsewave responses measured simultaneously at several sensorplacements by twoMR-compatible fibre opticmethodsrdquo Journalof Sensors vol 2012 Article ID 769613 8 pages 2012

[5] M Simon E A Schmidt W E Kincses et al ldquoEEG alphaspindlemeasures as indicators of driver fatigue under real trafficconditionsrdquo Clinical Neurophysiology vol 122 no 6 pp 1168ndash1178 2011

[6] S K L Lal and A Craig ldquoReproducibility of the spectral com-ponents of the electroencephalogram during driver fatiguerdquoInternational Journal of Psychophysiology vol 55 no 2 pp 137ndash143 2005

[7] B T Jap S Lal P Fischer and E Bekiaris ldquoUsing EEG spectralcomponents to assess algorithms for detecting fatiguerdquo ExpertSystems with Applications vol 36 no 2 pp 2352ndash2359 2009

[8] J Krajewski D Sommer U Trutschel et al ldquoSteering wheelbehavior based estimation of fatiguerdquo in Proceedings of the 5thInternational Driving Symposium on Human Factors in DriverAssessment Training and Vehicle Design pp 118ndash124 2009

[9] Y Wu J Li D Yu and H Cheng ldquoResearch on quantitativemethod about driver reliabilityrdquo Journal of Software vol 6 no6 pp 1110ndash1116 2011

[10] A Eskandarian and A Mortazavi ldquoEvaluation of a smartalgorithm for commercial vehicle driver drowsiness detectionrdquoin Proceedings of the IEEE Intelligent Vehicles Symposium pp553ndash559 June 2007

[11] C X Huang W C Zhang C G Huang and Y J ZhongldquoIdentification of driver state based on ARX in the automobilesimulatorrdquo Technology amp Economy in Areas of Communicationsvol 10 no 2 pp 60ndash63 2008

[12] Z Zhang and J-S Zhang ldquoDriver fatigue detection based intel-ligent vehicle controlrdquo in Proceedings of the 18th InternationalConference on Pattern Recognition (ICPR rsquo06) vol 2 pp 1262ndash1265 IEEE Hong Kong August 2006

[13] Q Wang J Yang M Ren and Y Zheng ldquoDriver fatiguedetection a surveyrdquo in Proceedings of the 6th World Congresson Intelligent Control and Automation (WCICA 06) vol 2 pp8587ndash8591 Dalian China 2006

[14] Z Zhang and J Zhang ldquoA new real-time eye tracking for driverfatigue detectionrdquo in Proceedings of the IEEE 6th InternationalConference on ITS Telecommunications pp 8ndash11 June 2006

Journal of Sensors 11

[15] R Parsai and P Bajaj ldquoIntelligent monitoring system fordriverrsquos alertness (a vision based approach)rdquo in Knowledge-Based Intelligent Information and Engineering Systems vol 4692of Lecture Notes in Computer Science pp 471ndash477 SpringerBerlin Germany 2007

[16] M Eriksson and N P Papanikolopoulos ldquoEye-tracking fordetection of driver fatiguerdquo in Proceedings of the IEEE Confer-ence on Intelligent Transportation Systems (ITSC rsquo97) pp 314ndash319 1997

[17] S M Shi L S Jin R B Wang and B L Tong ldquoDriver mouthmonitoring method based on machine visionrdquo Journal of JiLinUniversity vol 34 no 2 pp 232ndash236 2004

[18] W Wang J Xi and H Chen ldquoModeling and recognizingdriver behavior based on driving data a surveyrdquoMathematicalProblems in Engineering vol 2014 Article ID 245641 20 pages2014

[19] Q Tan M Yang T Luo et al ldquoA novel interdigital capacitorpressure sensor based on LTCC technologyrdquo Journal of Sensorsvol 2014 Article ID 431503 6 pages 2014

[20] Y Freund and R E Schapire ldquoA desicion-theoretic general-ization of on-line learning and an application to boostingrdquo inComputational Learning Theory vol 904 of Lecture Notes inComputer Science pp 23ndash37 Springer Berlin Germany 1995

[21] P Viola and M J Jones ldquoRobust real-time object detectionrdquoInternational Journal of Computer Vision vol 57 no 2 pp 137ndash154 2004

[22] D F Dinges and R Grace PERCLOS A Valid Psychophysiolog-ical Mesure of Alertness As Assessed by Psychomotor VigilanceFederal Highway Administration Office of Motor Carriers1998

[23] W Kong Z Zhou L Zhou Y Dai G Borghini and FBabiloni ldquoEstimation for driver fatigue with phase lockingvaluerdquo International Journal of Bioelectromagnetism vol 14 no3 pp 115ndash120 2012

[24] J P Dukuzumuremyi B Zou C P Mukamakuza D Hanyur-wimfura and EMasabo ldquoDiscrete cosine coefficients as imagesfeatures for fire detection based on computer visionrdquo Journal ofComputers vol 9 no 2 pp 295ndash300 2014

[25] R Li and C F Li ldquoAdaboost face detection based on improvedcovariance featurerdquo Journal of Computers vol 9 no 5 pp 1077ndash1082 2014

International Journal of

AerospaceEngineeringHindawi Publishing Corporationhttpwwwhindawicom Volume 2014

RoboticsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Active and Passive Electronic Components

Control Scienceand Engineering

Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

International Journal of

RotatingMachinery

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporation httpwwwhindawicom

Journal ofEngineeringVolume 2014

Submit your manuscripts athttpwwwhindawicom

VLSI Design

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Shock and Vibration

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Civil EngineeringAdvances in

Acoustics and VibrationAdvances in

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Electrical and Computer Engineering

Journal of

Advances inOptoElectronics

Hindawi Publishing Corporation httpwwwhindawicom

Volume 2014

The Scientific World JournalHindawi Publishing Corporation httpwwwhindawicom Volume 2014

SensorsJournal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Modelling amp Simulation in EngineeringHindawi Publishing Corporation httpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Chemical EngineeringInternational Journal of Antennas and

Propagation

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

Navigation and Observation

International Journal of

Hindawi Publishing Corporationhttpwwwhindawicom Volume 2014

DistributedSensor Networks

International Journal of

Page 10: Research Article A System of Driving Fatigue Detection ...Research Article A System of Driving Fatigue Detection Based on Machine Vision and Its Application on Smart Device WanzengKong,

10 Journal of Sensors

Table 4 Average consumption of each step on tablet system (ms)

Running items Consuming timeFront face detection 39Left and right deflected face detection 49Open state detection 8Close state detection 4Other time consumptions 36The longest processing time of single frame 98The shortest processing time of single frame 70

can smoothly run this application But for the worst casesuch as the situation of side face the system performance willdecrease to 10 framessec

5 Conclusions

This paper provides a practical driving fatigue detectionsystem based on Adaboost algorithm We proposed a newstrategy to detect eye state instead of detecting eye directlyIn our detection strategy we first detect face efficientlyusing classifiers of both front face and deflected face Thencandidate region of eye is determined according to geomet-ric distribution of facial organs Finally trained classifiersof open eyes and closed eyes are used to detect eyes inthe candidate region quickly and accurately As a resultPERCLOS escalating rate could be calculated and used asthe index of fatigue When the PERCLOS escalating rateincreased more than 200 the driver could be consideredin fatigue state Moreover this paper implemented a DrivingFatigue Detection Warning System and transplanted it onNexus7 tablet The system makes decision of fatigued ornot according to PERCLOS and duration of closed-eyesstate Experiments demonstrated that the proposed systemhas a high accuracy Meanwhile the processing speed canreach 30 fps on PC and 14 fps on tablet which meets therequirement of real time

Of course this system could make further improvementon accuracy and speed of detection by using discrete cosinecoefficients [24] and covariance feature [25] respectively Inaddition this paper has dodged the conditions under poorillumination It should be perfected in the future research

Conflict of Interests

The authors declare that there is no conflict of interestsregarding the publication of this paper

Authorsrsquo Contribution

WanzengKong andLingxiaoZhou contributed equally to thiswork

Acknowledgment

This work was supported by the Major International Coop-eration Project of Zhejiang Province (Grant no 2011C14017)

the National Natural Science Foundation of China (Grantno 61102028) the International Science amp Technology Coop-eration Program of China (Grant no 2014DFG12570) andZhejiang Provincial Natural Science Foundation of China(Grant no LY13F020033) The authors also should thank allsubjects who were involved in the driving experiment

References

[1] S R Paul NHTSArsquos Drowsy Driver Technology Program 2005[2] L Wang X Wu and M Yu ldquoReview of driver fatiguedrowsi-

ness detectionmethodsrdquo Journal of Biomedical Engineering vol24 no 1 pp 245ndash248 2007

[3] G Borghini L Astolfi G Vecchiato D Mattia and F BabilonildquoMeasuring neurophysiological signals in aircraft pilots andcar drivers for the assessment of mental workload fatigue anddrowsinessrdquo Neuroscience and Biobehavioral Reviews vol 44pp 58ndash75 2014

[4] T Myllyla V Korhonen E Vihriala et al ldquoHuman heart pulsewave responses measured simultaneously at several sensorplacements by twoMR-compatible fibre opticmethodsrdquo Journalof Sensors vol 2012 Article ID 769613 8 pages 2012

[5] M Simon E A Schmidt W E Kincses et al ldquoEEG alphaspindlemeasures as indicators of driver fatigue under real trafficconditionsrdquo Clinical Neurophysiology vol 122 no 6 pp 1168ndash1178 2011

[6] S K L Lal and A Craig ldquoReproducibility of the spectral com-ponents of the electroencephalogram during driver fatiguerdquoInternational Journal of Psychophysiology vol 55 no 2 pp 137ndash143 2005

[7] B T Jap S Lal P Fischer and E Bekiaris ldquoUsing EEG spectralcomponents to assess algorithms for detecting fatiguerdquo ExpertSystems with Applications vol 36 no 2 pp 2352ndash2359 2009

[8] J Krajewski D Sommer U Trutschel et al ldquoSteering wheelbehavior based estimation of fatiguerdquo in Proceedings of the 5thInternational Driving Symposium on Human Factors in DriverAssessment Training and Vehicle Design pp 118ndash124 2009

[9] Y Wu J Li D Yu and H Cheng ldquoResearch on quantitativemethod about driver reliabilityrdquo Journal of Software vol 6 no6 pp 1110ndash1116 2011

[10] A Eskandarian and A Mortazavi ldquoEvaluation of a smartalgorithm for commercial vehicle driver drowsiness detectionrdquoin Proceedings of the IEEE Intelligent Vehicles Symposium pp553ndash559 June 2007

[11] C X Huang W C Zhang C G Huang and Y J ZhongldquoIdentification of driver state based on ARX in the automobilesimulatorrdquo Technology amp Economy in Areas of Communicationsvol 10 no 2 pp 60ndash63 2008

[12] Z Zhang and J-S Zhang ldquoDriver fatigue detection based intel-ligent vehicle controlrdquo in Proceedings of the 18th InternationalConference on Pattern Recognition (ICPR rsquo06) vol 2 pp 1262ndash1265 IEEE Hong Kong August 2006

[13] Q Wang J Yang M Ren and Y Zheng ldquoDriver fatiguedetection a surveyrdquo in Proceedings of the 6th World Congresson Intelligent Control and Automation (WCICA 06) vol 2 pp8587ndash8591 Dalian China 2006

[14] Z Zhang and J Zhang ldquoA new real-time eye tracking for driverfatigue detectionrdquo in Proceedings of the IEEE 6th InternationalConference on ITS Telecommunications pp 8ndash11 June 2006

Journal of Sensors 11

[15] R. Parsai and P. Bajaj, "Intelligent monitoring system for driver's alertness (a vision based approach)," in Knowledge-Based Intelligent Information and Engineering Systems, vol. 4692 of Lecture Notes in Computer Science, pp. 471–477, Springer, Berlin, Germany, 2007.

[16] M. Eriksson and N. P. Papanikolopoulos, "Eye-tracking for detection of driver fatigue," in Proceedings of the IEEE Conference on Intelligent Transportation Systems (ITSC '97), pp. 314–319, 1997.

[17] S. M. Shi, L. S. Jin, R. B. Wang, and B. L. Tong, "Driver mouth monitoring method based on machine vision," Journal of Jilin University, vol. 34, no. 2, pp. 232–236, 2004.

[18] W. Wang, J. Xi, and H. Chen, "Modeling and recognizing driver behavior based on driving data: a survey," Mathematical Problems in Engineering, vol. 2014, Article ID 245641, 20 pages, 2014.

[19] Q. Tan, M. Yang, T. Luo et al., "A novel interdigital capacitor pressure sensor based on LTCC technology," Journal of Sensors, vol. 2014, Article ID 431503, 6 pages, 2014.

[20] Y. Freund and R. E. Schapire, "A decision-theoretic generalization of on-line learning and an application to boosting," in Computational Learning Theory, vol. 904 of Lecture Notes in Computer Science, pp. 23–37, Springer, Berlin, Germany, 1995.

[21] P. Viola and M. J. Jones, "Robust real-time object detection," International Journal of Computer Vision, vol. 57, no. 2, pp. 137–154, 2004.

[22] D. F. Dinges and R. Grace, PERCLOS: A Valid Psychophysiological Measure of Alertness as Assessed by Psychomotor Vigilance, Federal Highway Administration, Office of Motor Carriers, 1998.

[23] W. Kong, Z. Zhou, L. Zhou, Y. Dai, G. Borghini, and F. Babiloni, "Estimation for driver fatigue with phase locking value," International Journal of Bioelectromagnetism, vol. 14, no. 3, pp. 115–120, 2012.

[24] J. P. Dukuzumuremyi, B. Zou, C. P. Mukamakuza, D. Hanyurwimfura, and E. Masabo, "Discrete cosine coefficients as images features for fire detection based on computer vision," Journal of Computers, vol. 9, no. 2, pp. 295–300, 2014.

[25] R. Li and C. F. Li, "Adaboost face detection based on improved covariance feature," Journal of Computers, vol. 9, no. 5, pp. 1077–1082, 2014.
