[Figure 1.1: bar chart of smartphone users in millions, per year from 2010 to 2022: 62.6, 92.8, 122, 144.5, 171, 190.64, 208.61, 224.3, 237.6, 248.68, 257.76, 264.85, 270.66; values from 2017 onwards are estimates.]
Figure 1.1: Estimated growth of the number of smartphone users in the United States [1]. From 2010 to 2016 the graph shows exact data; from 2017 to 2022 estimates are provided.
1.1 Gait analysis formalization
[Figure 1.2: timeline of the gait cycle. The Stance Phase (0% to 62% of the cycle) comprises Initial Contact (0%), Loading Response (0% to 12%), Midstance, Terminal Stance and Pre-Swing (ending at 62%); the Swing Phase (62% to 100%) comprises Initial Swing, Mid-Swing and Terminal Swing.]
Figure 1.2: Gait phases in a normal gait cycle.
1.2 Human Activity Recognition
Table 1.1: Types of human activities studied in the literature [2, 3].
1.3 Motivations and Contributions
3.1 Data gathering
Figure 3.1: Roll, pitch and yaw angles.
3.2 Video information extraction
Formalization
The Lucas and Kanade method assumes that the optical flow is constant within an $n \times n$ neighborhood of each tracked point; the Kanade, Lucas and Tomasi Feature Tracker applies this assumption to small windows (e.g. $3 \times 3$). Input: two consecutive frames $I(t)$ and $I(t+1)$. Output: the displacement $v$ of each $n \times n$ tracked window from $I(t)$ to $I(t+1)$.
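As an illustration, the following is a minimal sketch of KLT-style tracking using OpenCV; the corner-detection parameters and the 15-pixel window are arbitrary choices, not the settings used in this work.

```python
import cv2

# Minimal KLT tracking sketch: find corners in frame I(t), then estimate
# their displacement v in frame I(t+1) with the pyramidal Lucas-Kanade method.
def track_features(frame_t, frame_t1, window=15):
    gray_t = cv2.cvtColor(frame_t, cv2.COLOR_BGR2GRAY)
    gray_t1 = cv2.cvtColor(frame_t1, cv2.COLOR_BGR2GRAY)

    # Shi-Tomasi corners: the "good features to track" of the KLT tracker.
    pts_t = cv2.goodFeaturesToTrack(gray_t, maxCorners=100,
                                    qualityLevel=0.3, minDistance=7)

    # Lucas-Kanade optical flow over an n x n integration window.
    pts_t1, status, _ = cv2.calcOpticalFlowPyrLK(
        gray_t, gray_t1, pts_t, None, winSize=(window, window), maxLevel=2)

    good_old = pts_t[status == 1]
    good_new = pts_t1[status == 1]
    return good_old, good_new - good_old  # positions and displacements v
```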
3.3 Data preprocessing
Interpolation
Filtering
Cycles Extraction
Figure 3.2: Stride, stance and swing times.
Gait cycles are extracted from the vertical acceleration signal $a_y$.
Signals detrending
Normalization
A learning algorithm receives a training set $S$, drawn from an unknown distribution $D$ and labeled by a target function $f$, and outputs a hypothesis $h_S : X \to Y$ from a hypothesis class $H$ that approximates $f$ on $D$ beyond the examples seen in $S$.
[Figure 4.1: taxonomy splitting Machine Learning problems into Supervised Learning and Unsupervised Learning, each further divided by Continuous or Discrete output.]
Figure 4.1: Categorization of Machine Learning problems.
In the regression setting, each input $x_i \in \mathbb{R}^n$ is paired with a label $y_i$, and a linear model predicts
$$ y_i = \sum_j w_j x_{ij} , $$
where $w$ is the vector of weights to be learned.
4.1 Notable Algorithms
4.1.1 Linear Regression
The domain is $X = \mathbb{R}^d$, where $d$ is the number of features, and the label set is $Y = \mathbb{R}$. The hypothesis class is the set of linear functions $h : \mathbb{R}^d \to \mathbb{R}$, parameterized by $w \in \mathbb{R}^{d+1}$:
$$ h(x) = f(x) = w_0 + \sum_{j=1}^{d} x_j w_j , $$
where the coefficients $w_j$, $j = 0, \dots, d$, weigh the contribution of each feature of $x$ to $f(x)$. The quality of a hypothesis is measured by a loss function, e.g. the squared loss $L(h, (x, y)) = (h(x) - y)^2$, the absolute loss $L(h, (x, y)) = |h(x) - y|$, or the 0-1 loss $L(h, (x, y)) = \mathbb{1}[h(x) \neq y]$, where $\mathbb{1}[A] = 1$ if the event $A$ holds and $0$ otherwise.
The least squares solution minimizes the empirical risk
$$ \min_w \frac{1}{m} \sum_{i=1}^{m} \left( \langle w, x_i \rangle - y_i \right)^2 , $$
where $m$ is the number of training examples in $X$ and $\langle \cdot, \cdot \rangle$ denotes the inner product. Setting the gradient with respect to $w$ to zero gives
$$ \frac{2}{m} \sum_{i=1}^{m} \left( \langle w, x_i \rangle - y_i \right) x_i = 0 , $$
which, defining
$$ A = \left( \sum_{i=1}^{m} x_i x_i^\top \right) \quad \text{and} \quad b = \sum_{i=1}^{m} y_i x_i , $$
yields the closed-form solution $w^* = A^{-1} b$ whenever $A$ is invertible.
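The closed-form solution above translates directly into a few lines of NumPy; this sketch adds a constant feature to absorb the intercept $w_0$ and solves the linear system rather than inverting $A$ explicitly.

```python
import numpy as np

# Closed-form least squares: w* = A^{-1} b, with A = sum_i x_i x_i^T and
# b = sum_i y_i x_i. A column of ones absorbs the intercept w_0.
def fit_linear_regression(X, y):
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])  # prepend bias feature
    A = Xb.T @ Xb                                  # sum of outer products
    b = Xb.T @ y                                   # sum of y_i * x_i
    return np.linalg.solve(A, b)                   # solves A w = b

# Usage: w = fit_linear_regression(X, y); y_hat = np.hstack([1, x_new]) @ w
```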
4.1.2 Support Vector Machines
[Figure 4.2: separating hyperplane $\langle w, x \rangle + b = 0$ with margin hyperplanes $\langle w, x \rangle + b = \pm 1$ at distance $2/\|w\|$ from each other; slack values $\xi = 0$, $\xi < 1$ and $\xi > 1$ mark points correctly classified, inside the margin, and misclassified; the kernel is $K(x_i, x_j) = \varphi(x_i)^\top \varphi(x_j)$.]
Figure 4.2: Example of margin in SVM hyperplane separation.
The separating hyperplane is $H = \{x \mid \langle w, x \rangle + b = 0\}$, whose distance from the origin is $|b| / \|w\|$. The margin is bounded by the two parallel hyperplanes $H_1 = \{x \mid \langle w, x \rangle + b = 1\}$ and $H_2 = \{x \mid \langle w, x \rangle + b = -1\}$. Given a training set $(x_1, y_1), \dots, (x_m, y_m)$ of size $m$ with labels $y_i \in \{-1, 1\}$, $i \in \{1, \dots, m\}$, the hard-margin SVM solves
$$ (w_0, b_0) = \operatorname*{argmin}_{(w, b)} \|w\|^2 \quad \text{s.t.} \quad \forall i,\; y_i \left( \langle w, x_i \rangle + b \right) \geq 1 , $$
and returns the normalized solution $\hat{w} = w_0 / \|w_0\|$, $\hat{b} = b_0 / \|w_0\|$.
When the data are not linearly separable, slack variables $\xi_i \geq 0$ allow some margin violations. Given the training set $(x_1, y_1), \dots, (x_m, y_m)$ and a regularization parameter $\lambda > 0$, the soft-margin SVM solves
$$ \min_{(w, b, \xi)} \left( \lambda \|w\|^2 + \frac{1}{m} \sum_{i=1}^{m} \xi_i \right) \quad \text{s.t.} \quad \forall i,\; y_i \left( \langle w, x_i \rangle + b \right) \geq 1 - \xi_i \;\text{ and }\; \xi_i \geq 0 , $$
and outputs the optimal $w$ and $b$.
Kernel trick
A kernel function $K(x, x') = \varphi(x)^\top \varphi(x')$ computes inner products in a feature space $\varphi(x)$ without constructing it explicitly. Common choices are:
- linear: $K(x, x') = x^\top x'$;
- polynomial: $K(x, x') = (\gamma\, x^\top x' + \zeta)^d$, with $d, \gamma, \zeta > 0$;
- sigmoid: $K(x, x') = \tanh(\gamma\, x^\top x' + \zeta)$;
- radial basis function (RBF): $K(x, x') = e^{-\gamma \|x - x'\|^2}$, with $\gamma > 0$.
The hyperparameters $\gamma, \zeta, d$ are selected by model selection.
Figure 4.3: Mapping of non-linearly separable training data from $\mathbb{R}^2$ into $\mathbb{R}^3$.
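As a sketch of the soft-margin SVM with the RBF kernel, here is an illustrative scikit-learn example on synthetic data; the values of $C$ and $\gamma$ are placeholders (scikit-learn's $C$ plays the role of the inverse of the regularization weight $\lambda$).

```python
import numpy as np
from sklearn.svm import SVC

# Synthetic data that is non-linearly separable: a disk vs. its complement.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] ** 2 + X[:, 1] ** 2 > 1).astype(int)

# Soft-margin SVM with RBF kernel K(x, x') = exp(-gamma * ||x - x'||^2).
clf = SVC(kernel="rbf", C=1.0, gamma=0.5)
clf.fit(X, y)
print(clf.score(X, y))  # training accuracy
```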
4.1.3 Random Forests
A random forest is a collection of tree predictors $\{h(x, \theta_k),\ k = 1, 2, \dots\}$, where the $\{\theta_k\}$ are independent, identically distributed random vectors, and each tree casts a vote for the most popular class of the input $x$. If $M$ is the total number of features, at each split only a random subset of $m \ll M$ features is considered, which decorrelates the individual trees.
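A minimal scikit-learn sketch of this idea on synthetic data; the parameter values here are illustrative only.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Each tree sees a bootstrap sample of the data and, at every split,
# a random subset of m ~ sqrt(M) of the M features (max_features="sqrt").
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 18))
y = (X.sum(axis=1) > 0).astype(int)

rf = RandomForestClassifier(n_estimators=100, max_features="sqrt",
                            random_state=0)
rf.fit(X, y)
print(rf.predict(X[:5]))  # majority vote over the trees' predictions
```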
4.1.4 XGBoost
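A minimal sketch of a gradient-boosted tree classifier using the common xgboost Python package, on synthetic data; all parameter values below are illustrative assumptions, not the configuration used in this work.

```python
import numpy as np
from xgboost import XGBClassifier

# Gradient boosting fits each new tree to the residual errors of the
# ensemble built so far; parameters below are placeholders.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 18))
y = (X[:, 0] * X[:, 1] > 0).astype(int)

xgb = XGBClassifier(n_estimators=90, max_depth=6, subsample=0.7,
                    learning_rate=0.1)
xgb.fit(X, y)
print(xgb.predict(X[:5]))
```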
4.1.5 Recurrent Neural Network
Feed Forward (FF)
Figure 4.4: Example of feed-forward network.
Let $X_i$, $i = 1, \dots, p$, denote the $p$ input units and $Y_k$, $k = 1, \dots, K$, the $K$ output units. Hidden layer $l$ contains $d^{(l)}$ units $Z^{(l)}_i$. Each weight $w^{(l)}_{i,j}$ connects unit $i$ of layer $(l-1)$ to unit $j$ of layer $(l)$, and the activation of unit $j$ in layer $l$ is
$$ Z^{(l)}_j = \varphi\!\left( w^{(l)}_{0,j} + \sum_{i=1}^{d^{(l-1)}} w^{(l)}_{i,j}\, Z^{(l-1)}_i \right) , $$
where $\varphi$ is the activation function. Common choices are:
- identity: $\varphi(x) = x$;
- sigmoid: $\varphi(x) = \dfrac{1}{1 + e^{-x}}$;
- hyperbolic tangent: $\varphi(x) = \dfrac{2}{1 + e^{-2x}} - 1$;
- rectified linear unit (ReLU): $\varphi(x) = \begin{cases} 0 & \text{for } x \leq 0 \\ x & \text{for } x > 0 \end{cases}$
A network with $L$ layers is thus characterized by its architecture $[d^{(1)}, d^{(2)}, \dots, d^{(L-1)}]$ and by its parameters $\Theta$, i.e. the weights $w$, which select one hypothesis from the class $H$.
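The layer equation above can be sketched in a few lines of NumPy; the architecture below ($p = 3$ inputs, one hidden layer of 5 units, $K = 2$ outputs) and the random weights are purely illustrative.

```python
import numpy as np

# Forward pass of one fully connected layer:
# Z^(l)_j = phi(w0_j + sum_i w_ij * Z^(l-1)_i), vectorized over units j.
def layer_forward(z_prev, W, w0, phi):
    return phi(w0 + z_prev @ W)  # W has shape (d^(l-1), d^(l))

relu = lambda x: np.maximum(0.0, x)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

# Tiny network with architecture [d1 = 5], mapping p = 3 inputs to K = 2 outputs.
rng = np.random.default_rng(0)
x = rng.normal(size=3)
z1 = layer_forward(x, rng.normal(size=(3, 5)), np.zeros(5), relu)
y = layer_forward(z1, rng.normal(size=(5, 2)), np.zeros(2), sigmoid)
print(y)
```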
Recurrent Neural Network (RNN)
Figure 4.5: Example of recurrent neural network.
In a recurrent network the hidden state $h$ evolves over time as
$$ h_t = \Theta(W x_t + U h_{t-1}) , $$
where $h_t$ is the hidden state at time $t$, $x_t$ is the current input, $W$ is the input weight matrix, $h_{t-1}$ is the previous hidden state, and $U$ is the recurrent weight matrix.
[Figure 4.6: a recurrent cell unrolled through time; copies of the cell pass the hidden states $h_0, h_1, h_2, \dots, h_{t-1}$ forward.]
Figure 4.6: Example of unrolled BPTT.
The unrolled view makes explicit how the hidden state $h_t$ at time $t$ depends on all previous inputs.
Gated Recurrent Unit (GRU)
Figure 4.7: Example of recurrent neural network with Gated Recurrent Units.
[Figure 4.8: internals of a GRU cell, showing the hidden state $h$, the update gate $z_t$, the reset gate $r_t$, and the weight matrices $W$ and $U$.]
Figure 4.8: Example of Gated Recurrent Unit.
At each time step the cell receives the input $x_t$ and produces the new hidden state $h_t$, controlled by the update gate $z_t$ and the reset gate $r_t$; $W, U$ denote weight matrices and $b$ bias terms. The activation $h^j_t$ of the $j$-th unit at time $t$ is a linear interpolation between the previous activation $h^j_{t-1}$ and the candidate activation $\tilde{h}^j_t$:
$$ h^j_t = \left( 1 - z^j_t \right) h^j_{t-1} + z^j_t\, \tilde{h}^j_t , $$
where the update gate $z^j_t$ decides how much the unit updates its content:
$$ z^j_t = \sigma \left( W_z x_t + U_z h_{t-1} \right)^j . $$
The candidate activation $\tilde{h}^j_t$ is computed as
$$ \tilde{h}^j_t = \tanh \left( W x_t + U (r_t \odot h_{t-1}) \right)^j , $$
where $\odot$ is the element-wise product and $r_t$ is the reset gate, whose $j$-th component is
$$ r^j_t = \sigma \left( W_r x_t + U_r h_{t-1} \right)^j . $$
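A direct NumPy transcription of these equations, as a sketch of a single GRU step; the weight matrices are random placeholders (in a trained network they come from backpropagation) and bias terms are omitted, as in the equations above.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# One GRU step following the equations above.
def gru_step(x_t, h_prev, Wz, Uz, Wr, Ur, W, U):
    z = sigmoid(Wz @ x_t + Uz @ h_prev)           # update gate z_t
    r = sigmoid(Wr @ x_t + Ur @ h_prev)           # reset gate r_t
    h_cand = np.tanh(W @ x_t + U @ (r * h_prev))  # candidate activation
    return (1.0 - z) * h_prev + z * h_cand        # new hidden state h_t

d_in, d_h = 9, 16  # e.g. 9 input signals, 16 hidden units
rng = np.random.default_rng(0)
Wz, Uz, Wr, Ur, W, U = [rng.normal(scale=0.1, size=s)
                        for s in [(d_h, d_in), (d_h, d_h)] * 3]
h = gru_step(rng.normal(size=d_in), np.zeros(d_h), Wz, Uz, Wr, Ur, W, U)
print(h.shape)  # (16,)
```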
[Figure 5.1: pipeline blocks. 1. Data acquisition: accelerometer, gyroscope, video. 2. Preprocessing: gait cycles, nine features. 3. Regression: linear regression, recurrent neural network, support vector machine, random forest, XGBoost. 4. Prediction error extraction: mean squared error, standard deviation. 5. Classification: standard/anomalous, accuracy. 6. Final performance.]
Figure 5.1: General scheme of the Gait Anomaly Detection System.
5.1 Data Acquisition
Figure 5.2: Chest support for smartphone.
The accelerometer measures in $\mathrm{m/s^2}$ and the gyroscope in $^{\circ}/\mathrm{s}$; the camera records video at 30 frames per second with a resolution of $720 \times 576$ pixels.
Figure 5.3: Example of recording application home screen.
5.2 Data preprocessing
Interpolation
Since the smartphone delivers samples at an irregular rate, the sensor data are interpolated onto a uniform time grid with sampling frequency $f_s = 200$ Hz.
Figure 5.4: Comparison of the sampling frequency distribution of the smartphone employed in the data acquisition (Asus Zenfone 2) and another smartphone (LG Nexus 5X).
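A minimal sketch of this resampling step; linear interpolation via SciPy is an assumption here, not necessarily the interpolation kind used in this work.

```python
import numpy as np
from scipy.interpolate import interp1d

# Resample irregularly-timed sensor samples onto a uniform fs = 200 Hz grid.
def to_uniform_grid(timestamps, values, fs=200.0):
    t_uniform = np.arange(timestamps[0], timestamps[-1], 1.0 / fs)
    return t_uniform, interp1d(timestamps, values)(t_uniform)
```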
Filtering
The power spectral density of the accelerometer data shows that the signal energy is concentrated below 40 Hz.
Figure 5.5: Power spectral density of the three-axial accelerometer data.
A low-pass Butterworth filter of order 10 with cutoff frequency $f_c = 40$ Hz is therefore applied.
Figure 5.6: Frequency response of the Butterworth filter in blue, cutoff frequency in green.
Figure 5.7: Comparison between the raw signal and its filtered version. The considered signal is the yaw angle evolution.
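A sketch of such a filter with SciPy; the second-order-sections form keeps the order-10 design numerically stable, and the zero-phase (forward-backward) application is an assumption, not necessarily the choice made here.

```python
from scipy.signal import butter, sosfiltfilt

# Low-pass Butterworth, cutoff fc = 40 Hz at fs = 200 Hz, order 10.
def lowpass(signal, fc=40.0, fs=200.0, order=10):
    sos = butter(order, fc / (fs / 2.0), btype="low", output="sos")
    return sosfiltfilt(sos, signal)  # forward-backward: zero phase distortion
```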
Cycles extraction
[Figure 5.8: detrended vertical acceleration $a_y$ (in $\mathrm{m/s^2}$, samples 1000 to 2000) together with its first and second wavelet transforms ("1st gcwt", "2nd gcwt") and the detected IC and FC events.]
Figure 5.8: Example of IC (circles) and FC (triangles) detection.
[Figure 5.9: 2 Hz-filtered and detrended gyroscope signal $g_y$ (in deg/s, samples 5000 to 7000) with each IC assigned to the left or right foot.]
Figure 5.9: Example of estimation of left or right step.
To assign each IC to the left or right foot, the gyroscope signal is low-pass filtered with cutoff frequency $f_c = 2$ Hz; the $i$-th gait cycle then spans from $IC(i)$ to $IC(i+2)$.
[Figure 5.10: detrended input (samples 0 to 700) with its "1st gcwt" and "2nd gcwt" transforms and IC/FC markers, showing the transient at the beginning of the recording.]
Figure 5.10: Example of initial signal transient.
Each cycle is resampled to a fixed length of $N = 200$ samples. Since a cycle lasts at most $\tau = 2$ s and the signal bandwidth is $B = 40$ Hz, the sampling theorem requires $N > 2B\tau = 160$, which $N = 200$ satisfies.
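A sketch of the fixed-length resampling, using linear interpolation as an assumption:

```python
import numpy as np

# Resample one gait cycle to a fixed length N = 200; with cycles shorter
# than tau = 2 s and bandwidth B = 40 Hz, N > 2*B*tau = 160 is satisfied.
def resample_cycle(cycle, N=200):
    old_grid = np.linspace(0.0, 1.0, len(cycle))
    new_grid = np.linspace(0.0, 1.0, N)
    return np.interp(new_grid, old_grid, cycle)
```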
De-trending
Figure 5.11: Example of trends in data extracted from video. Green vertical lines separate different acquisition sessions.
Figure 5.12: Example of detrended video data. Green vertical lines separate different acquisition sessions.
Figure 5.13: First acquisition session; on the left the trend is present, on the right it is removed. Vertical lines separate different gait cycles.
Figure 5.14: Example of underlying data trend, visible after detrending and normalization are performed.
Normalization
All signals are finally normalized to the $[0, 1]$ range.
Dataset division
The dataset contains 2975 standard and 2744 anomalous gait cycles, an approximately 50%/50% balance; both classes are split into training and test sets with a (75%/25%) proportion.
5.3 Regression
RNN Architecture: the network has a total of 1,489,209 trainable parameters.
Figure 5.15: Neural network structure shown using the TensorBoard tool.
Training
The weights are initialized with standard deviation $\mathrm{std} = \sqrt{1/n}$, where $n$ is the number of inputs of the layer. Training uses a batch size of 100 for 10 epochs: the network receives the previous 10 cycles, a $(10 \times 9)$ feature block, and predicts the next cycle's $(1 \times 9)$ features. The regression dataset comprises $N_1 = 3301$ training and $N_2 = 1087$ test cycles, corresponding to $s_1 = 659620$ and $s_2 = 217200$ samples respectively.
Figure 5.16: Evolution of the MSE score throughout several training epochs.
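A hypothetical Keras sketch consistent with these settings; only the $(10 \times 9)$ input shape, the 9 outputs, the batch size, the epoch count and the $N_1$ window count come from the text, while the layer type, its width, the optimizer and the random placeholder data are assumptions.

```python
import numpy as np
import tensorflow as tf

# Hypothetical GRU regressor: the last 10 cycles (10 x 9 features)
# predict the next cycle's 9 features.
model = tf.keras.Sequential([
    tf.keras.layers.GRU(256, input_shape=(10, 9)),  # width is a placeholder
    tf.keras.layers.Dense(9),
])
model.compile(optimizer="adam", loss="mse")

# Placeholder data standing in for the N1 = 3301 training windows.
X = np.random.randn(3301, 10, 9).astype("float32")
Y = np.random.randn(3301, 9).astype("float32")
model.fit(X, Y, batch_size=100, epochs=10)
```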
Comparison: the regressors are compared in terms of the mean square of the prediction error and the coefficient of determination ($R^2$).
Table 5.1: Comparison of the mean square of the prediction error for the different regressors.
Figure 5.17: Comparison of the regressors' performance on two cycles of the x-axis of the accelerometer signal.
Figure 5.18: Comparison of the regressors' performance on two cycles of the z-axis of the accelerometer signal.
5.4 Prediction error statistics extraction
For each of the nine signals, the prediction error is $E_i(t) = s_i(t) - \hat{s}_i(t)$, $i \in \{1, \dots, 9\}$, where $s_i(t)$ is the measured signal and $\hat{s}_i(t)$ its prediction. For every gait cycle, the mean squared error and the standard deviation $\sigma$ of $E_i(t)$ are computed for each signal, producing a $(2 \times 9)$ descriptor per cycle.
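This descriptor extraction is easy to state in NumPy; the sketch assumes each cycle is an array of $T$ samples by 9 signals.

```python
import numpy as np

# Per-cycle descriptor: for each of the 9 signals, the MSE and the standard
# deviation of the prediction error E_i(t) = s_i(t) - s_hat_i(t),
# giving a (2 x 9) matrix per gait cycle.
def cycle_descriptor(s, s_hat):
    E = s - s_hat                  # error, shape (T, 9)
    mse = np.mean(E ** 2, axis=0)  # one value per signal
    std = np.std(E, axis=0)
    return np.stack([mse, std])    # shape (2, 9)
```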
Figure 5.19: Comparison between the distributions (number of instances) of the MSE estimate across all cycles. The considered signal is the x-axis of the accelerometer.
Figure 5.20: Comparison between the distributions (number of instances) of the standard deviation of the prediction error across all cycles. The considered signal is the x-axis of the accelerometer.
5.5 Classification
The SVM hyperparameters $C$ and $\gamma$ are selected by a grid search with 3-fold cross-validation, using the $F_1$ score as the selection metric.
Figure 5.21: Visualization of grid search scores for the SVM classification algorithm.
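An illustrative scikit-learn version of this selection procedure on synthetic data; the grid values are placeholders.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Grid search over (C, gamma), scored with F1 under cross-validation.
rng = np.random.default_rng(0)
X = rng.normal(size=(300, 18))
y = (X[:, 0] > 0).astype(int)

grid = GridSearchCV(SVC(kernel="rbf"),
                    param_grid={"C": [10, 100, 1000],
                                "gamma": [0.5, 1.0, 2.0, 4.0]},
                    scoring="f1", cv=3)
grid.fit(X, y)
print(grid.best_params_)
```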
The confusion matrix is
$$ CM = \begin{pmatrix} TN & FN \\ FP & TP \end{pmatrix} , $$
from which the evaluation metrics are derived:
$$ \text{accuracy} = \frac{TP + TN}{TP + TN + FP + FN} , \quad \text{precision} = \frac{TP}{TP + FP} , \quad \text{recall} = \frac{TP}{TP + FN} , $$
$$ F_1 = \frac{2}{\frac{1}{\text{precision}} + \frac{1}{\text{recall}}} . $$
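A small sketch computing these metrics from a confusion matrix in the layout above; the example input is the $\mathrm{SVM}^{RNN}_{TEST}$ matrix reported in the results.

```python
import numpy as np

# Metrics from the confusion matrix, laid out as CM = [[TN, FN], [FP, TP]].
def metrics(cm):
    (tn, fn), (fp, tp) = cm
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2.0 / (1.0 / precision + 1.0 / recall)
    return accuracy, precision, recall, f1

print(metrics(np.array([[669, 25], [5, 731]])))
```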
Using the RNN-based descriptors improves the SVM test accuracy by about 6.5% with respect to the LR-based ones.
Table 6.1: Performance comparison of the SVM classifier applied to the cycle descriptors obtained using the LR and RNN regression algorithms.
The grid search selects $C = 710$, $\gamma = 2.10$ for the LR-based descriptors and $C = 570$, $\gamma = 2.45$ for the RNN-based ones; the classifiers are compared in terms of the $F_1$ score.
$$ \mathrm{SVM}^{LR}_{TRAIN} = \begin{pmatrix} 1811 & 233 \\ 70 & 2169 \end{pmatrix} , \quad \mathrm{SVM}^{LR}_{TEST} = \begin{pmatrix} 611 & 83 \\ 41 & 695 \end{pmatrix} $$
$$ \mathrm{SVM}^{RNN}_{TRAIN} = \begin{pmatrix} 1987 & 63 \\ 18 & 2221 \end{pmatrix} , \quad \mathrm{SVM}^{RNN}_{TEST} = \begin{pmatrix} 669 & 25 \\ 5 & 731 \end{pmatrix} $$
On the test set, the false negatives drop from 83 with the LR-based descriptors to 25 with the RNN-based ones, roughly 1/3 of the original count, while the false positive rate decreases from 5.6% to 0.7%.
Table 6.2: Performance comparison of the RF classifier applied to the cycle descriptors obtained using the LR and RNN regression algorithms.
With RF, the RNN-based descriptors improve the test accuracy by about 2% with respect to the LR-based ones.
$$ \mathrm{RF}^{LR}_{TRAIN} = \begin{pmatrix} 2050 & 0 \\ 0 & 2239 \end{pmatrix} , \quad \mathrm{RF}^{LR}_{TEST} = \begin{pmatrix} 668 & 26 \\ 22 & 714 \end{pmatrix} $$
The selected number of trees is 230 for the LR-based descriptors and 270 for the RNN-based ones.
$$ \mathrm{RF}^{RNN}_{TRAIN} = \begin{pmatrix} 2050 & 0 \\ 0 & 2239 \end{pmatrix} , \quad \mathrm{RF}^{RNN}_{TEST} = \begin{pmatrix} 687 & 7 \\ 12 & 724 \end{pmatrix} $$
The XGB classifier with RNN-based descriptors achieves the best overall result, with a test accuracy of 98.88%.
Table 6.3: Performance comparison of the XGB classifier applied to the cycle descriptors obtained using the LR and RNN regression algorithms.
$$ \mathrm{XGB}^{LR}_{TRAIN} = \begin{pmatrix} 2050 & 0 \\ 0 & 2239 \end{pmatrix} , \quad \mathrm{XGB}^{LR}_{TEST} = \begin{pmatrix} 673 & 21 \\ 21 & 715 \end{pmatrix} $$
The selected hyperparameter triplets are $(90, 6, 0.7)$ for the LR-based descriptors and $(80, 6, 0.9)$ for the RNN-based ones.
$$ \mathrm{XGB}^{RNN}_{TRAIN} = \begin{pmatrix} 2050 & 0 \\ 0 & 2239 \end{pmatrix} , \quad \mathrm{XGB}^{RNN}_{TEST} = \begin{pmatrix} 689 & 5 \\ 11 & 725 \end{pmatrix} $$