Implementing Hidden Markov Models - Retail Example

Learn Aster

INTRODUCTION:

Please take a look at the following video on Hidden Markov Models (HMM), presented by Michelle Tanco. Michelle Tanco is a member of Teradata’s Advanced Analytics Practice who loves solving analytical problems. She works primarily with Teradata’s Aster technology and has expertise in time series analysis, specifically using Aster’s Hidden Markov Model functions. Attached is the source code she put together for the video so you can follow along; the data (Retail_Web_Clicks) is also included in Aster Express.

Video Link: 1056

A bit more about HMM:

A hidden Markov model (HMM) is a statistical Markov model in which the system being modeled is assumed to be a Markov process with unobserved (hidden) states. An HMM can be presented as the simplest dynamic Bayesian network. The mathematics behind the HMM were developed by L. E. Baum and coworkers.[1][2][3][4][5] It is closely related to an earlier work on the optimal nonlinear filtering problem by Ruslan L. Stratonovich,[6] who was the first to describe the forward-backward procedure.

In simpler Markov models (like a Markov chain), the state is directly visible to the observer, and therefore the state transition probabilities are the only parameters. In a hidden Markov model, the state is not directly visible, but the output, dependent on the state, is visible. Each state has a probability distribution over the possible output tokens. Therefore, the sequence of tokens generated by an HMM gives some information about the sequence of states. The adjective 'hidden' refers to the state sequence through which the model passes, not to the parameters of the model; the model is still referred to as a 'hidden' Markov model even if these parameters are known exactly.

Hidden Markov models are especially known for their application in temporal pattern recognition such as speech, handwriting, gesture recognition,[7] part-of-speech tagging, musical score following,[8] partial discharges[9] and bioinformatics.

A hidden Markov model can be considered a generalization of a mixture model where the hidden variables (or latent variables), which control the mixture component to be selected for each observation, are related through a Markov process rather than independent of each other. Recently, hidden Markov models have been generalized to pairwise Markov models and triplet Markov models which allow consideration of more complex data structures [10][11] and the modelling of nonstationary data.[12][13]

SOURCE: Hidden Markov model - Wikipedia, the free encyclopedia 
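To make the hidden-versus-observed distinction above concrete, here is a minimal Python/NumPy sketch of a toy two-state HMM in the spirit of the retail clickstream example. The state names, page types, and all probabilities are invented for illustration; they are not taken from the Retail_Web_Clicks data or the video.

```python
import numpy as np

# Toy two-state HMM: the shopper's intent (0 = "browse", 1 = "buy") is hidden;
# only the kind of page clicked (0 = "home", 1 = "product", 2 = "checkout") is
# observed. All numbers here are invented for illustration.
rng = np.random.default_rng(42)

pi = np.array([0.8, 0.2])                # initial hidden-state distribution
A = np.array([[0.7, 0.3],                # hidden-state transition probabilities
              [0.4, 0.6]])
B = np.array([[0.5, 0.4, 0.1],           # per-state distribution over output tokens
              [0.1, 0.3, 0.6]])

hidden_states, observed_clicks = [], []
state = rng.choice(2, p=pi)
for _ in range(10):
    observed_clicks.append(rng.choice(3, p=B[state]))  # the output depends on the state
    hidden_states.append(state)                        # the state itself is never seen
    state = rng.choice(2, p=A[state])

print("hidden states  :", hidden_states)
print("observed clicks:", observed_clicks)
```

Only the click sequence would ever be visible to an analyst; the parameters (pi, A, B) and the hidden intent sequence are exactly what the HMM functions described below estimate and infer.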

Additional Reads on the Topic:

Please check out this great blog post on Hidden Markov Models from Karthik Guruswamy:

Data Science - Hidden Markov & the Elusive Genie 

 

How Aster Implements HMM:

Hidden Markov Model Functions
   • HMMUnsupervisedLearner
   • HMMSupervisedLearner
   • HMMEvaluator
   • HMMDecoder

Unsupervised Learning: Given an observation sequence and the number of states, find the model that maximizes the probability of the observed sequence.
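Inside Aster this is the job of HMMUnsupervisedLearner. As a rough illustration of the underlying idea only, here is a NumPy sketch of classic Baum-Welch (EM) re-estimation for a single short sequence; the function name and random initialization are my own, and the unscaled forward/backward passes will underflow on long sequences, so treat it as a teaching sketch rather than a substitute for the Aster function.

```python
import numpy as np

def baum_welch(obs, n_states, n_symbols, n_iter=50, seed=0):
    """Illustrative Baum-Welch (EM) for one discrete observation sequence.
    Returns re-estimated initial probabilities pi, transition matrix A,
    and emission matrix B. Unscaled, so only suitable for short sequences."""
    obs = np.asarray(obs)
    rng = np.random.default_rng(seed)
    # Random row-stochastic starting guesses
    pi = rng.random(n_states); pi /= pi.sum()
    A = rng.random((n_states, n_states)); A /= A.sum(axis=1, keepdims=True)
    B = rng.random((n_states, n_symbols)); B /= B.sum(axis=1, keepdims=True)
    T = len(obs)
    for _ in range(n_iter):
        # E-step: forward (alpha) and backward (beta) probabilities
        alpha = np.zeros((T, n_states))
        beta = np.zeros((T, n_states))
        alpha[0] = pi * B[:, obs[0]]
        for t in range(1, T):
            alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        beta[-1] = 1.0
        for t in range(T - 2, -1, -1):
            beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
        gamma = alpha * beta
        gamma /= gamma.sum(axis=1, keepdims=True)          # P(state at t | obs)
        xi = (alpha[:-1, :, None] * A[None, :, :] *
              (B[:, obs[1:]].T * beta[1:])[:, None, :])     # P(state pair at t, t+1 | obs)
        xi /= xi.sum(axis=(1, 2), keepdims=True)
        # M-step: re-estimate parameters from expected counts
        pi = gamma[0]
        A = xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]
        for k in range(n_symbols):
            B[:, k] = gamma[obs == k].sum(axis=0)
        B /= gamma.sum(axis=0)[:, None]
    return pi, A, B
```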


Supervised Learning: Given observation sequences and their corresponding state sequences, find the model that maximizes the probability of the observed sequences.
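This is the HMMSupervisedLearner case. When the hidden states are labeled, training reduces to counting: the maximum-likelihood estimates of the initial, transition, and emission probabilities are just normalized counts. Below is a minimal sketch for one labeled sequence, with a small pseudo-count so unseen events do not produce zero-probability rows; the function name and smoothing value are my own.

```python
import numpy as np

def train_supervised_hmm(states, obs, n_states, n_symbols, pseudo=1e-6):
    """Illustrative supervised HMM training for a single labeled sequence:
    with the hidden states given, maximum-likelihood estimates are just
    normalized counts. `pseudo` is a small smoothing count so that events
    never seen in the data do not yield zero-probability rows."""
    states, obs = np.asarray(states), np.asarray(obs)
    pi = np.full(n_states, pseudo)
    A = np.full((n_states, n_states), pseudo)
    B = np.full((n_states, n_symbols), pseudo)
    pi[states[0]] += 1                                   # initial-state count
    for s, o in zip(states, obs):
        B[s, o] += 1                                     # emission counts
    for s_prev, s_next in zip(states[:-1], states[1:]):
        A[s_prev, s_next] += 1                           # transition counts
    # Normalize counts into probability distributions
    return (pi / pi.sum(),
            A / A.sum(axis=1, keepdims=True),
            B / B.sum(axis=1, keepdims=True))
```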


Decoding: Given the trained model and an observation sequence, find an optimal state sequence.
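This is what HMMDecoder produces. The standard algorithm is Viterbi, which finds the single most probable hidden-state path; a minimal log-space NumPy sketch (function name mine) follows. Working in log space avoids the underflow that multiplying many small probabilities would cause.

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Illustrative Viterbi decoder: most likely hidden-state path for an
    observation sequence, given model parameters (pi, A, B)."""
    obs = np.asarray(obs)
    T, N = len(obs), len(pi)
    with np.errstate(divide="ignore"):                 # log(0) -> -inf is fine here
        log_pi, log_A, log_B = np.log(pi), np.log(A), np.log(B)
    delta = np.zeros((T, N))                           # best log-prob ending in each state
    psi = np.zeros((T, N), dtype=int)                  # back-pointers
    delta[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1][:, None] + log_A         # rows: previous state, cols: next state
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) + log_B[:, obs[t]]
    # Backtrack from the best final state
    path = np.zeros(T, dtype=int)
    path[-1] = delta[-1].argmax()
    for t in range(T - 2, -1, -1):
        path[t] = psi[t + 1, path[t + 1]]
    return path, delta[-1].max()
```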

Evaluation: Given the trained model and an observation sequence, find the probability of the sequence.
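This is HMMEvaluator's task. The standard tool is the forward algorithm, which sums over all possible state paths; the sketch below (function name mine) uses per-step scaling and returns the log-likelihood, the numerically safe way to report sequence probabilities.

```python
import numpy as np

def sequence_log_likelihood(obs, pi, A, B):
    """Illustrative forward algorithm: log P(observation sequence | model),
    with per-step scaling to avoid numerical underflow."""
    obs = np.asarray(obs)
    alpha = pi * B[:, obs[0]]                  # forward probabilities at t = 0
    log_likelihood = 0.0
    for t in range(len(obs)):
        if t > 0:
            alpha = (alpha @ A) * B[:, obs[t]]
        scale = alpha.sum()
        log_likelihood += np.log(scale)
        alpha = alpha / scale                  # renormalize to keep values in range
    return log_likelihood
```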