Hidden Markov Model & Stock Prediction

DESCRIPTION

Introducing how to apply a hidden Markov model (HMM) to stock prediction.

TRANSCRIPT
HMM & Stock Prediction
David Chiu @ ML/DM Monday
http://ywchiu-tw.appspot.com/
HIDDEN MARKOV MODEL
• A finite state machine with a fixed number of states
• Provides a probabilistic framework for modeling a time series of multivariate observations
STOCK PRICE PREDICTION
Every time, history repeats itself
• The stock's past behavior is similar to its current-day behavior
• The next day's stock price should follow roughly the same pattern as the past data
BENEFIT OF USING HMM
• Handle new data robustly
• Computationally efficient to develop and evaluate
• Able to predict similar patterns efficiently
HMM ON STOCK PREDICTION
• Using the trained HMM, the likelihood value P of the current day's data is calculated
• From the past dataset, the HMM is used to locate the instances that produce likelihood values nearest to P; the behavior that followed those instances suggests the prediction
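The lookup step above can be sketched directly: assuming each past day's log-likelihood under the trained model has already been computed, the forecast reuses the next-day change of the nearest-likelihood instance (all numbers below are invented for illustration, not real market data):

```python
import numpy as np

# Hypothetical log-likelihoods P(O_t | lambda) of past days under a trained
# HMM, and the next-day price changes that followed them (illustrative).
past_loglik = np.array([-4.2, -3.1, -5.0, -3.5, -4.8])
next_day_change = np.array([12.0, -5.0, 3.0, 8.0, -1.0])

today_loglik = -3.2  # likelihood of the current day's data

# Locate the past instance whose likelihood is nearest to today's and
# reuse its observed next-day change as the forecast.
nearest = int(np.argmin(np.abs(past_loglik - today_loglik)))
forecast = next_day_change[nearest]
```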
CHARACTERIZE HMM
• Number of states in the model: N
• Number of observation symbols: M
• Transition matrix A = {aij} , where aij represents the transition probability from state i to state j
• Observation emission matrix B = {bj(Ot)}, where bj(Ot) represents the probability of observing Ot at state j
• Initial state distribution π = {πi}
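For concreteness, these quantities can be written out for a toy model; the two states and three observation symbols below are invented for illustration (e.g. up/flat/down daily moves), not taken from the slides:

```python
import numpy as np

# Toy HMM: N = 2 hidden states, M = 3 observation symbols.
A = np.array([[0.8, 0.2],        # a_ij = P(state j at t+1 | state i at t)
              [0.3, 0.7]])
B = np.array([[0.6, 0.3, 0.1],   # b_j(O_t) = P(observing symbol O_t | state j)
              [0.1, 0.3, 0.6]])
pi = np.array([0.5, 0.5])        # initial state distribution

# Every row of A and B, and pi itself, is a probability distribution.
assert np.allclose(A.sum(axis=1), 1)
assert np.allclose(B.sum(axis=1), 1)
assert np.isclose(pi.sum(), 1)
```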
MODELING HMM
PROBLEM OF HMM
1. The Evaluation Problem (Forward) – What is the probability P(O|λ) that the given observations O = o1, o2, ..., oT are generated by a given model λ?
2. The Decoding Problem (Viterbi) – What is the most likely state sequence in the given model λ that produced the observations O = o1, o2, ..., oT?
3. The Learning Problem (Baum-Welch) – How should we adjust the model parameters {A, B, π} to maximize P(O|λ), given a model λ and a sequence of observations O = o1, o2, ..., oT?
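The first two problems can be sketched in a few lines of NumPy; the toy model and observation sequence below are illustrative assumptions, not from the slides:

```python
import numpy as np

def forward(pi, A, B, obs):
    """Evaluation problem: P(O | lambda) via the forward algorithm."""
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    return alpha.sum()

def viterbi(pi, A, B, obs):
    """Decoding problem: most likely hidden state sequence (log domain)."""
    delta = np.log(pi) + np.log(B[:, obs[0]])
    backpointers = []
    for o in obs[1:]:
        scores = delta[:, None] + np.log(A)    # scores[i, j]: move from i to j
        backpointers.append(scores.argmax(axis=0))
        delta = scores.max(axis=0) + np.log(B[:, o])
    path = [int(delta.argmax())]
    for ptr in reversed(backpointers):
        path.append(int(ptr[path[-1]]))
    return path[::-1]

# Toy 2-state, 2-symbol model (illustrative numbers)
pi = np.array([0.5, 0.5])
A = np.array([[0.8, 0.2], [0.3, 0.7]])
B = np.array([[0.6, 0.4], [0.2, 0.8]])
obs = [0, 0, 1]

p = forward(pi, A, B, obs)       # P(O | lambda)
states = viterbi(pi, A, B, obs)  # most likely hidden state sequence
```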
BAUM-WELCH ALGORITHM
• Finds the unknown parameters of a hidden Markov model (HMM)
• A generalized expectation-maximization (GEM) algorithm
• Computes maximum likelihood estimates and posterior mode estimates for the parameters (transition and emission probabilities) of an HMM, given only emissions as training data
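As a sketch of what a single Baum-Welch re-estimation step does for a discrete-emission HMM (the toy model and numbers are illustrative; libraries such as RHmm or hmmlearn iterate this until convergence):

```python
import numpy as np

def forward_probs(pi, A, B, obs):
    alpha = np.zeros((len(obs), len(pi)))
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, len(obs)):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
    return alpha

def backward_probs(A, B, obs):
    beta = np.ones((len(obs), A.shape[0]))
    for t in range(len(obs) - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
    return beta

def baum_welch_step(pi, A, B, obs):
    """One EM re-estimation of {A, B, pi} from emissions alone."""
    alpha, beta = forward_probs(pi, A, B, obs), backward_probs(A, B, obs)
    p_obs = alpha[-1].sum()       # P(O | lambda)
    gamma = alpha * beta / p_obs  # gamma[t, i] = P(state_t = i | O)
    # xi[t, i, j] = P(state_t = i, state_{t+1} = j | O)
    xi = alpha[:-1, :, None] * A[None] * (B[:, obs[1:]].T * beta[1:])[:, None, :] / p_obs
    new_pi = gamma[0]
    new_A = xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]
    new_B = np.zeros_like(B)
    for k in range(B.shape[1]):
        new_B[:, k] = gamma[np.array(obs) == k].sum(axis=0)
    new_B /= gamma.sum(axis=0)[:, None]
    return new_pi, new_A, new_B

# One step on a toy 2-state, 2-symbol model.
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.5, 0.5], [0.1, 0.9]])
obs = [0, 1, 1, 0, 1]
new_pi, new_A, new_B = baum_welch_step(pi, A, B, obs)
```

Each step keeps the parameters row-stochastic and, by the EM guarantee, never decreases the data likelihood P(O|λ).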
FIREARM
TOOL KIT
• R packages – HMM, RHmm
• Java – JHMM
• Python – scikit-learn
DEMO
GET DATASET
library(quantmod)
getSymbols("^TWII")
chartSeries(TWII)
TWII_Subset <- window(TWII, start = as.Date("2012-01-01"))
TWII_Train <- cbind(TWII_Subset$TWII.Close - TWII_Subset$TWII.Open, TWII_Subset$TWII.Volume)
BUILD HMM MODEL
# Include the RHmm library
library(RHmm)
# Baum-Welch algorithm
hm_model <- HMMFit(obs = TWII_Train, nStates = 5)
# Viterbi algorithm
VitPath <- viterbi(hm_model, TWII_Train)
SCATTER PLOT
TWII_Predict <- cbind(TWII_Subset$TWII.Close, VitPath$states)
chartSeries(TWII_Predict[,1])
addTA(TWII_Predict[TWII_Predict[,2]==1,1], on=1, type="p", col=5, pch=25)
addTA(TWII_Predict[TWII_Predict[,2]==2,1], on=1, type="p", col=6, pch=24)
addTA(TWII_Predict[TWII_Predict[,2]==3,1], on=1, type="p", col=7, pch=23)
addTA(TWII_Predict[TWII_Predict[,2]==4,1], on=1, type="p", col=8, pch=22)
addTA(TWII_Predict[TWII_Predict[,2]==5,1], on=1, type="p", col=10, pch=21)
DATA VISUALIZATION
SCIKIT-LEARN
# Baum-Welch algorithm (GaussianHMM has since moved from scikit-learn to the hmmlearn package)
from hmmlearn.hmm import GaussianHMM
model = GaussianHMM(n_components=5, covariance_type="diag", n_iter=1000)
model.fit(X)
# Predict the optimal sequence of internal hidden states
hidden_states = model.predict(X)
MODELING SAMPLE
PREDICTION
# State prediction using scikit-learn
data_vec = [diff[last_day], volume[last_day]]
state = model.predict([data_vec])
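One simple way to read the predicted state is to look up that state's learned mean; this is a sketch under assumptions (a fitted Gaussian HMM exposes its per-state means, e.g. as model.means_ in hmmlearn, and the numbers below are invented for illustration):

```python
import numpy as np

# Hypothetical means of a 5-state Gaussian HMM trained on
# (close - open, volume) pairs; column 0 is each state's average
# close-open difference (illustrative numbers only).
state_means = np.array([[ 25.0, 1.2e9],
                        [ -5.0, 0.9e9],
                        [ 60.0, 2.1e9],
                        [-40.0, 1.8e9],
                        [  2.0, 0.7e9]])

predicted_state = 2  # e.g. the state returned by model.predict for the last day

# Read the state's mean close-open difference as the expected next-day move.
expected_move = state_means[predicted_state, 0]
direction = "up" if expected_move > 0 else "down"
```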
REFERENCE
• Hassan, M. (2009). A combination of hidden Markov model and fuzzy model for stock market forecasting. Neurocomputing, 72(16), 3439-3446.
• Gupta, A., & Dhingra, B. (2012, March). Stock Market Prediction Using Hidden Markov Models. In Engineering and Systems (SCES), 2012 Students Conference on (pp. 1-4). IEEE.