Package HMMCont. February 19, 2015


Type: Package
Title: Hidden Markov Model for Continuous Observations Processes
Version: 1.0
Date: 2014-02-11
Maintainer: <mikhail.beketov@gmx.de>
Description: The package includes the functions designed to analyse continuous observation processes with the Hidden Markov Model approach. They include the Baum-Welch and Viterbi algorithms and additional visualisation functions. The observations are assumed to have a Gaussian distribution and to be weakly stationary processes. The package was created for analyses of financial time series, but can also be applied to any continuous observation process.
LazyData: yes
License: GPL-3
NeedsCompilation: no
Repository: CRAN
Date/Publication: 2014-02-11 17:15:51

R topics documented:
  HMMCont-package
  baumwelchcont
  hmmcontsimul
  hmmsetcont
  logreturns
  Prices
  statesdistributionsplot
  viterbicont

HMMCont-package        Hidden Markov Model for Continuous Observations Processes

Description

  The package includes the functions designed to analyse continuous observation
  processes with the Hidden Markov Model approach. They include the Baum-Welch
  and Viterbi algorithms and additional visualisation functions. The
  observations are assumed to have a Gaussian distribution and to be weakly
  stationary processes. The package was created for analyses of financial time
  series, but can also be applied to any continuous observation process.

Details

  Package: HMMCont
  Type: Package
  Version: 1.0
  Date: 2014-02-11
  License: GPL-3

  A Hidden Markov Model (HMM) is a statistical model in which the process being
  modelled is assumed to be a Markov process with unobserved (i.e. hidden)
  states. This unobserved Markov process can be revealed from an observable
  process that depends on the states of the underlying Markov process. The
  HMMCont package compiles the functions that analyse continuous observable
  processes (i.e. continuous in space, discrete in time) and identify the
  underlying two-state Markov processes. The observable process should be
  weakly stationary (e.g. for financial time series the returns, not the
  prices, should be analysed). The state-dependent probabilities of the
  observations are modelled with Gaussian probability density functions
  (Rabiner, 1989). The implemented analysis procedure includes: (i) setting the
  initial model parameters and loading the data (function hmmsetcont), (ii)
  repeated execution of the Baum-Welch algorithm (function baumwelchcont), and
  (iii) execution of the Viterbi algorithm (function viterbicont). The function
  baumwelchcont allows the user to inspect the model parameters after each
  Baum-Welch iteration, and accumulates the information on the model evolution.
  The model object can be analysed with tailored print, summary, and plot
  functions (S3 methods). For details on HMMs see the publications by Viterbi
  (1967), Baum et al. (1970), and Rabiner (1989).
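To make the model concrete, the following is a minimal sketch, in plain Python rather than R, of how a two-state HMM with Gaussian state-dependent densities assigns a likelihood to an observed series via the forward recursion. The parameter values mirror the package's defaults (Pi = 0.5/0.5, A11 = A22 = 0.7, Mu = 5/-5, Var = 10), but the code itself is illustrative and is not part of HMMCont.

```python
import math

def gauss_pdf(x, mu, var):
    """Gaussian density: the state-dependent observation probability."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def forward_likelihood(obs, pi, A, mu, var):
    """P(obs | model) for a two-state HMM via the forward recursion."""
    # Initialisation: alpha_1(j) = pi_j * b_j(o_1)
    alpha = [pi[j] * gauss_pdf(obs[0], mu[j], var[j]) for j in range(2)]
    # Induction: alpha_t(j) = b_j(o_t) * sum_i alpha_{t-1}(i) * a_ij
    for x in obs[1:]:
        alpha = [gauss_pdf(x, mu[j], var[j]) *
                 sum(alpha[i] * A[i][j] for i in range(2))
                 for j in range(2)]
    # Termination: sum over final states
    return sum(alpha)

obs = [4.8, 5.3, -4.9, -5.2]            # illustrative scaled "returns"
pi = [0.5, 0.5]                          # initial state probabilities
A = [[0.7, 0.3], [0.3, 0.7]]             # transition matrix
mu, var = [5.0, -5.0], [10.0, 10.0]      # state-dependent Gaussians
L = forward_likelihood(obs, pi, A, mu, var)
```

The forward variable is exactly the quantity the Baum-Welch algorithm maximises over iterations; scaling each step (as a real implementation must for long series) is omitted here for clarity.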
Maintainer

  <mikhail.beketov@gmx.de>

References

  Baum, L.E., Petrie, T., Soules, G., and Weiss, N. 1970. A maximization
  technique occurring in the statistical analysis of probabilistic functions of
  Markov chains. The Annals of Mathematical Statistics 41: 164-171.

  Rabiner, L.R. 1989. A tutorial on hidden Markov models and selected
  applications in speech recognition. Proceedings of the IEEE 77: 257-286.

  Viterbi, A.J. 1967. Error bounds for convolutional codes and an
  asymptotically optimum decoding algorithm. IEEE Transactions on Information
  Theory 13: 260-269.

See Also

  Functions: hmmsetcont, baumwelchcont, viterbicont, statesdistributionsplot,
  and logreturns.

Examples

  # Step-by-step analysis example.
  Returns <- logreturns(Prices)      # Getting a stationary process
  Returns <- Returns * 10            # Scaling the values
  hmm <- hmmsetcont(Returns)         # Creating an HMM object
  print(hmm)                         # Checking the initial parameters
  for (i in 1:6) { hmm <- baumwelchcont(hmm) }   # Baum-Welch is executed
                                     # 6 times and results are accumulated
  print(hmm)                         # Checking the accumulated parameters
  summary(hmm)                       # Getting more detailed information
  hmmcomplete <- viterbicont(hmm)    # Viterbi execution
  statesdistributionsplot(hmmcomplete, sc = 10)  # PDFs of the whole data set
                                     # and of the two states are plotted
  par(mfrow = c(2, 1))
  plot(hmmcomplete, Prices, ylabel = "price")
  plot(hmmcomplete, ylabel = "returns")  # The revealed Markov chain and the
                                     # observations are plotted

baumwelchcont          Baum-Welch Algorithm

Description

  The function performs the Baum-Welch algorithm with Gaussian PDFs (Baum et
  al., 1970; Rabiner, 1989). It allows the user to inspect the model parameters
  after each iteration, and accumulates the information on the model evolution.
  The intended use is to perform repeated executions and to save the returned
  object back into the argument object (see the examples below).

Usage

  baumwelchcont(hmm)

Arguments

  hmm    An object of the class ContObservHMM.

Value

  An object of the class ContObservHMM (see the section on the function
  hmmsetcont). After a sufficient number of iterations the object can be used
  to derive the Markov state sequence via the Viterbi algorithm (function
  viterbicont). The object can be analysed with the class-specific functions
  print, summary, and plot.

References

  Baum, L.E., Petrie, T., Soules, G., and Weiss, N. 1970. A maximization
  technique occurring in the statistical analysis of probabilistic functions of
  Markov chains. The Annals of Mathematical Statistics 41: 164-171.

  Rabiner, L.R. 1989. A tutorial on hidden Markov models and selected
  applications in speech recognition. Proceedings of the IEEE 77: 257-286.

See Also

  Functions: hmmsetcont, viterbicont, and statesdistributionsplot.

Examples

  Returns <- logreturns(Prices)      # Getting a stationary process
  Returns <- Returns * 10            # Scaling the values
  hmm <- hmmsetcont(Returns)         # Creating an HMM object
  print(hmm)                         # Checking the initial parameters
  hmm <- baumwelchcont(hmm)          # First iteration
  print(hmm)                         # Inspecting
  for (i in 1:5) { hmm <- baumwelchcont(hmm) }   # Subsequent iterations
  print(hmm)                         # Inspecting
  hmmcomplete <- viterbicont(hmm)    # Viterbi execution
  par(mfrow = c(2, 1))
  plot(hmmcomplete, Prices, ylabel = "price")
  plot(hmmcomplete, ylabel = "returns")  # The revealed Markov chain and the
                                     # observations are plotted

hmmcontsimul           Simulation of the observation and underlying Markov
                       processes according to a given model
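The kind of update that a single Baum-Welch iteration performs can be sketched as follows: forward and backward passes yield state responsibilities, which then re-estimate the initial probabilities, transition matrix, means, and variances. This is a generic Python illustration of the EM step for a two-state Gaussian HMM under my own illustrative parameter values, not the package's implementation.

```python
import math

def gauss(x, mu, var):
    """Gaussian density used as the state-dependent observation probability."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def baum_welch_step(obs, pi, A, mu, var):
    """One EM iteration for a two-state Gaussian HMM; returns updated parameters."""
    T, N = len(obs), 2
    B = [[gauss(x, mu[j], var[j]) for j in range(N)] for x in obs]
    # Forward pass, normalised at every step for numerical stability.
    alpha, scale = [], []
    for t in range(T):
        if t == 0:
            a = [pi[j] * B[0][j] for j in range(N)]
        else:
            a = [B[t][j] * sum(alpha[t - 1][i] * A[i][j] for i in range(N))
                 for j in range(N)]
        c = sum(a)
        scale.append(c)
        alpha.append([v / c for v in a])
    # Backward pass with the same scaling.
    beta = [[1.0] * N for _ in range(T)]
    for t in range(T - 2, -1, -1):
        beta[t] = [sum(A[i][j] * B[t + 1][j] * beta[t + 1][j] for j in range(N))
                   / scale[t + 1] for i in range(N)]
    # State responsibilities gamma_t(i), normalised per time step.
    gamma = []
    for t in range(T):
        row = [alpha[t][i] * beta[t][i] for i in range(N)]
        s = sum(row)
        gamma.append([g / s for g in row])
    # Accumulated transition responsibilities xi(i, j).
    xi_sum = [[0.0] * N for _ in range(N)]
    for t in range(T - 1):
        num = [[alpha[t][i] * A[i][j] * B[t + 1][j] * beta[t + 1][j]
                for j in range(N)] for i in range(N)]
        s = sum(sum(r) for r in num)
        for i in range(N):
            for j in range(N):
                xi_sum[i][j] += num[i][j] / s
    # Re-estimation (M-step).
    new_pi = gamma[0][:]
    new_A = [[xi_sum[i][j] / sum(xi_sum[i]) for j in range(N)] for i in range(N)]
    g_tot = [sum(gamma[t][j] for t in range(T)) for j in range(N)]
    new_mu = [sum(gamma[t][j] * obs[t] for t in range(T)) / g_tot[j]
              for j in range(N)]
    new_var = [sum(gamma[t][j] * (obs[t] - new_mu[j]) ** 2 for t in range(T))
               / g_tot[j] for j in range(N)]
    return new_pi, new_A, new_mu, new_var

obs = [4.2, 5.1, 6.0, -4.4, -5.5, -4.8, 5.2, 4.6]  # illustrative scaled returns
pi, A = [0.5, 0.5], [[0.7, 0.3], [0.3, 0.7]]
mu, var = [5.0, -5.0], [10.0, 10.0]
pi, A, mu, var = baum_welch_step(obs, pi, A, mu, var)
```

Repeating this step, as the for-loop around baumwelchcont does in the examples, monotonically improves the likelihood until a local optimum is reached.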

Description

  The function simulates (i) two observation processes that correspond to the
  distributions of the two states of an HMM, (ii) the underlying Markov
  process, and (iii) an observation process that corresponds to the HMM in
  terms of both the underlying Markov process and the observation
  distributions. If the HMM object has previously been modified by the function
  baumwelchcont, the function uses the parameters from the last iteration.

Usage

  hmmcontsimul(hmm, n)

Arguments

  hmm    An object of the class ContObservHMM.
  n      Number of observations to be simulated.

Value

  An object of the class SimulContHMM: a list comprising the two observation
  processes (each corresponding to the distribution of one of the two states),
  the Markov chain (i.e. the underlying two-state process), and the observation
  process that corresponds to that Markov chain and the two distributions. The
  latter is therefore the process simulated according to the given HMM.

See Also

  Functions: hmmsetcont, baumwelchcont, and viterbicont.

Examples

  Returns <- (logreturns(Prices)) * 10
  hmm <- hmmsetcont(Returns)
  for (i in 1:6) { hmm <- baumwelchcont(hmm) }
  hmmcomplete <- viterbicont(hmm)
  sim <- hmmcontsimul(hmmcomplete, n = 100)   # Simulating the processes
  plot(sim$stateprocess1, type = "l", ylab = "State 1 process")
  plot(sim$stateprocess2, type = "l", ylab = "State 2 process")
  plot(sim$markovchain, type = "l", ylab = "Markov chain")
  plot(sim$simulatedobservation, type = "l", ylab = "Full HMM process")
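The simulation itself reduces to two draws per time step: advance the hidden state with the transition matrix, then draw an observation from that state's Gaussian. The following is a generic Python sketch of such a simulator; it does not reproduce the package's exact return structure, and all parameter values are illustrative.

```python
import random

def simulate_hmm(n, pi, A, mu, sd, seed=1):
    """Simulate a two-state Markov chain and its Gaussian observations."""
    rng = random.Random(seed)
    states, obs = [], []
    s = 0 if rng.random() < pi[0] else 1      # draw the initial state
    for _ in range(n):
        states.append(s)
        obs.append(rng.gauss(mu[s], sd[s]))   # state-dependent observation
        s = 0 if rng.random() < A[s][0] else 1  # transition to the next state
    return states, obs

states, obs = simulate_hmm(100, pi=[0.5, 0.5],
                           A=[[0.7, 0.3], [0.3, 0.7]],
                           mu=[5.0, -5.0], sd=[3.0, 3.0])
```

Note that sd here is a standard deviation; with the package's variance parameterisation (Var1, Var2) one would pass the square roots.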

hmmsetcont             Setting an initial HMM object

Description

  The function sets up an initial Hidden Markov Model object with an initial
  set of model parameters. It returns an object of the class ContObservHMM
  that can be analysed with the Baum-Welch (function baumwelchcont) and Viterbi
  (function viterbicont) algorithms.

Usage

  hmmsetcont(Observations, Pi1 = 0.5, Pi2 = 0.5,
             A11 = 0.7, A12 = 0.3, A21 = 0.3, A22 = 0.7,
             Mu1 = 5, Mu2 = (-5), Var1 = 10, Var2 = 10)

  ## S3 method for class 'ContObservHMM'
  print(x, ...)
  ## S3 method for class 'ContObservHMM'
  summary(object, ...)
  ## S3 method for class 'ContObservHMM'
  plot(x, Series = x$Observations, ylabel = "observation series",
       xlabel = "time", ...)

Arguments

  Observations   Vector of observations (class "numeric"); a weakly stationary
                 process (e.g. a returns time series).
  Pi1            Initial probability of state 1.
  Pi2            Initial probability of state 2.
  A11            Initial transition probability from state 1 to state 1.
  A12            Initial transition probability from state 1 to state 2.
  A21            Initial transition probability from state 2 to state 1.
  A22            Initial transition probability from state 2 to state 2.
  Mu1            Initial mean of the Gaussian PDF for state 1.
  Mu2            Initial mean of the Gaussian PDF for state 2.
  Var1           Initial variance of the Gaussian PDF for state 1.
  Var2           Initial variance of the Gaussian PDF for state 2.
  x              An object returned by the function hmmsetcont.
  object         An object returned by the function hmmsetcont.
  Series         Observation time series to be plotted along the Markov states.
  ylabel         Y axis label.
  xlabel         X axis label.
  ...            Not used.

Value

  The function returns an object of the class ContObservHMM: a list comprising
  the observations, tables accumulating the model parameters and results after
  each Baum-Welch iteration (i.e. after each execution of the function
  baumwelchcont), a table for the state sequence derived by the Viterbi
  algorithm (function viterbicont), and a table of the b-probabilities. The
  object can be analysed with the class-specific functions print, summary, and
  plot.

See Also

  Functions: baumwelchcont, viterbicont, and statesdistributionsplot.

Examples

  Returns <- logreturns(Prices)      # Getting a stationary process
  Returns <- Returns * 10            # Scaling the values
  hmm <- hmmsetcont(Returns)         # Creating an HMM object
  print(hmm)                         # Checking the initial parameters
  for (i in 1:6) { hmm <- baumwelchcont(hmm) }   # Baum-Welch is executed
                                     # 6 times and results are accumulated
  hmmcomplete <- viterbicont(hmm)    # Viterbi execution
  print(hmm)                         # Checking the accumulated parameters
  summary(hmm)                       # Getting more detailed information
  par(mfrow = c(2, 1))
  plot(hmmcomplete, Prices, ylabel = "price")
  plot(hmmcomplete, ylabel = "returns")  # The revealed Markov chain and the
                                     # observations are plotted

logreturns             Calculating Log-returns

Description

  A simple function that calculates log-returns from a prices time series.

Usage

  logreturns(x)

Arguments

  x    Vector of prices or index values (class "numeric").

Value

  Vector of log-returns (class "numeric").

See Also

  Functions: hmmsetcont, baumwelchcont, viterbicont, and
  statesdistributionsplot.

Examples

  Returns <- logreturns(Prices)
  par(mfrow = c(2, 1))
  plot(Prices, type = "l")
  plot(Returns, type = "l")

Prices                 A dummy data set of prices

Description

  A dummy data set of prices, i.e. a non-stationary random process.

Usage

  data(Prices)

Format

  num [1:168] 1394 1366 1499 1452 1421 ...

Source

  A dummy data set generated by the author.

Examples

  plot(Prices, type = "l")
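The log-return transformation that logreturns applies is r_t = log(p_t) - log(p_{t-1}), which turns a non-stationary price series into an approximately weakly stationary returns series one element shorter than the input. A plain-Python sketch (not the package's implementation), using the first values of the dummy Prices data shown above:

```python
import math

def log_returns(prices):
    """Log-returns of a price series; the result has length len(prices) - 1."""
    return [math.log(p1 / p0) for p0, p1 in zip(prices, prices[1:])]

prices = [1394.0, 1366.0, 1499.0, 1452.0, 1421.0]  # first dummy Prices values
returns = log_returns(prices)
```

A price drop (1394 to 1366) gives a negative log-return and a rise (1366 to 1499) a positive one, which is why the scaled returns, not the prices, are the suitable input for hmmsetcont.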

statesdistributionsplot   Probability Density Functions of the States

Description

  The function plots the Gaussian probability density functions computed from
  the means and variances of the whole data set, of the two sub-sets
  corresponding to the two Markov chain states, and additionally of the HMM
  model (i.e. the means and variances taken from the last Baum-Welch
  iteration).

Usage

  statesdistributionsplot(hmm, sc = 1)

Arguments

  hmm    An object of the class ContObservHMM.
  sc     Scaling factor used when the initial HMM object was set.

Value

  A plot of the probability density functions.

See Also

  Functions: hmmsetcont, baumwelchcont, and viterbicont.

Examples

  Returns <- logreturns(Prices)      # Getting a stationary process
  Returns <- Returns * 10            # Scaling the values
  hmm <- hmmsetcont(Returns)         # Creating an HMM object
  for (i in 1:6) { hmm <- baumwelchcont(hmm) }   # Baum-Welch is executed
                                     # 6 times and results are accumulated
  hmmcomplete <- viterbicont(hmm)    # Viterbi execution
  statesdistributionsplot(hmmcomplete, sc = 10)  # PDFs of the whole data set
                                     # and of the two states are plotted

viterbicont            Viterbi Algorithm

Description

  The function performs the Viterbi algorithm (Viterbi, 1967). It can be
  applied to a ContObservHMM object after a sufficient number of Baum-Welch
  iterations (function baumwelchcont).

Usage

  viterbicont(hmm)

Arguments

  hmm    An object of the class ContObservHMM.

Value

  An object of the class ContObservHMM (see the section on the function
  hmmsetcont). The object can be analysed with the class-specific functions
  print, summary, and plot.

References

  Viterbi, A.J. 1967. Error bounds for convolutional codes and an
  asymptotically optimum decoding algorithm. IEEE Transactions on Information
  Theory 13: 260-269.

See Also

  Functions: hmmsetcont, baumwelchcont, and statesdistributionsplot.

Examples

  Returns <- logreturns(Prices)      # Getting a stationary process
  Returns <- Returns * 10            # Scaling the values
  hmm <- hmmsetcont(Returns)         # Creating an HMM object
  for (i in 1:6) { hmm <- baumwelchcont(hmm) }   # Baum-Welch is executed
                                     # 6 times and results are accumulated
  hmmcomplete <- viterbicont(hmm)    # Viterbi execution
  par(mfrow = c(2, 1))
  plot(hmmcomplete, Prices, ylabel = "price")
  plot(hmmcomplete, ylabel = "returns")  # The revealed Markov chain and the
                                     # observations are plotted
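The Viterbi recursion that recovers the most probable state sequence can be sketched in a few lines of Python (a generic log-domain illustration, not the package's code; observations and parameters are illustrative):

```python
import math

def gauss_logpdf(x, mu, var):
    """Log of the Gaussian state-dependent observation density."""
    return -(x - mu) ** 2 / (2 * var) - 0.5 * math.log(2 * math.pi * var)

def viterbi(obs, pi, A, mu, var):
    """Most probable two-state sequence via log-domain dynamic programming."""
    N = 2
    logA = [[math.log(A[i][j]) for j in range(N)] for i in range(N)]
    # Initialisation: delta_1(j) = log pi_j + log b_j(o_1)
    delta = [math.log(pi[j]) + gauss_logpdf(obs[0], mu[j], var[j])
             for j in range(N)]
    back = []  # backpointers
    for x in obs[1:]:
        prev, step, delta = delta, [], []
        for j in range(N):
            best = max(range(N), key=lambda i: prev[i] + logA[i][j])
            step.append(best)
            delta.append(prev[best] + logA[best][j] +
                         gauss_logpdf(x, mu[j], var[j]))
        back.append(step)
    # Backtrack from the best final state.
    path = [max(range(N), key=lambda j: delta[j])]
    for step in reversed(back):
        path.append(step[path[-1]])
    return path[::-1]

path = viterbi([5.1, 4.7, -5.0, -4.6], pi=[0.5, 0.5],
               A=[[0.7, 0.3], [0.3, 0.7]],
               mu=[5.0, -5.0], var=[10.0, 10.0])  # -> [0, 0, 1, 1]
```

With the two Gaussians centred at 5 and -5, the positive observations are assigned to state 0 and the negative ones to state 1, which is the kind of regime labelling viterbicont attaches to the returns series.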

Index

  Topic Baum-Welch: baumwelchcont, statesdistributionsplot
  Topic Continuous observations
  Topic Econometrics: logreturns
  Topic Hidden Markov Model: baumwelchcont, hmmcontsimul, hmmsetcont,
    viterbicont
  Topic Quantitative Finance
  Topic Returns: logreturns
  Topic Simulation: hmmcontsimul
  Topic Time series: hmmsetcont
  Topic Viterbi: statesdistributionsplot, viterbicont
  Topic datasets: Prices

  baumwelchcont
  HMMCont (HMMCont-package)
  hmmcontsimul
  hmmsetcont
  logreturns
  plot.ContObservHMM (hmmsetcont)
  Prices
  print.ContObservHMM (hmmsetcont)
  statesdistributionsplot
  summary.ContObservHMM (hmmsetcont)
  viterbicont