HMM tutorials


News!

The final tutorial session has been rescheduled for Wednesday 2 June at the same time in the same place, 11:00-12:30 in 40b-AB-05 (CVSSP seminar room).

Hidden Markov Model tutorial sessions

Overview

The purpose of this series of tutorials is to introduce the principles of hidden Markov models in a way that can be applied to many kinds of pattern recognition problem. It is aimed at PhD students, research fellows and academics in the CVSSP group, but other researchers are welcome to attend. The three main elements are:

1. calculating the probability of a given path through the model's states,
2. finding the optimum state sequence for a set of observations, and
3. estimating the model's parameters during training.
The technique is particularly applicable to problems involving sequences of noisy measurements, as you might find in time series data (e.g., audio or video streams), but the ideas can be extended to spatial data (e.g., medical images). There are five tutorials planned, which will be approximately an hour and a half long (including time for questions):
 #   Date                 Topic                                         Slides
 1   11am Wed 21 April    Introduction to Markov models and HMMs        hmm_tut1.pdf
 2   11am Wed 28 April    Likelihood calculation and Viterbi decoding   hmm_tut2.pdf
 3   11am Wed 5 May       Maximum likelihood re-estimation              hmm_tut3.pdf
 4   11am Wed 12 May      Output probability distribution functions     hmm_tut4.pdf
 5   11am Wed 2 June      Extensions and applications                   hmm_tut5.pdf
A rough outline of the content is given below, and further details will be added as the course proceeds.

Preparation

None needed, unless you've missed the earlier sessions (in which case you should read through the slides). If you are interested in doing some background reading, I would recommend Rabiner's tutorial article:

L.R. Rabiner. "A tutorial on hidden Markov models and selected applications in speech recognition". Proceedings of the IEEE, Vol. 77, No. 2, pp. 257-286, Feb. 1989.

Tutorial 1: Introduction to Markov models and HMMs

The first session will provide a basic probabilistic framework and mathematical notation for describing finite-state models, which will introduce the Markov model and the hidden Markov model.
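As a taster, the behaviour of a (non-hidden) Markov model can be sketched in a few lines of Python. The two-state "weather" chain and its transition probabilities below are made-up illustrative values, not taken from the slides:

```python
import random

# A two-state Markov chain: states and a row-stochastic transition matrix.
# The states and probabilities are arbitrary values chosen for illustration.
states = ["Rain", "Sun"]
A = [[0.7, 0.3],   # P(next state | current = Rain)
     [0.4, 0.6]]   # P(next state | current = Sun)

def sample_path(start, length, rng=random.Random(0)):
    """Sample a state sequence of the given length from the Markov chain."""
    path = [start]
    for _ in range(length - 1):
        row = A[path[-1]]
        path.append(0 if rng.random() < row[0] else 1)
    return [states[i] for i in path]

print(sample_path(0, 5))
```

The key Markov property is visible in the sampler: the next state depends only on the current one, via a single row of the transition matrix.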

Tutorial 2: Likelihood calculation and Viterbi decoding

Using the HMM framework, we will look at how to calculate the probability of a given path through the model's states, and hence find the optimal path to explain a set of observations. I will show how the Viterbi algorithm uses dynamic programming to find the best path efficiently, and can thus be used for decoding.
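To give a feel for the dynamic programming involved, here is a minimal Viterbi decoder for a toy discrete-output HMM. All the numbers (transitions, emissions, initial probabilities) are invented for illustration only:

```python
import math

# Toy HMM: 2 hidden states, 2 observation symbols (illustrative values).
A  = [[0.7, 0.3], [0.4, 0.6]]   # transition probabilities
B  = [[0.9, 0.1], [0.2, 0.8]]   # emission probabilities
pi = [0.6, 0.4]                 # initial state probabilities

def viterbi(obs):
    """Return the most likely state sequence for obs, and its log-probability."""
    n = len(A)
    delta = [math.log(pi[i]) + math.log(B[i][obs[0]]) for i in range(n)]
    back = []
    for o in obs[1:]:
        psi, new = [], []
        for j in range(n):
            best = max(range(n), key=lambda i: delta[i] + math.log(A[i][j]))
            psi.append(best)
            new.append(delta[best] + math.log(A[best][j]) + math.log(B[j][o]))
        back.append(psi)
        delta = new
    state = max(range(n), key=lambda i: delta[i])
    path = [state]
    for psi in reversed(back):   # trace the best path backwards
        state = psi[state]
        path.append(state)
    return path[::-1], max(delta)

path, logp = viterbi([0, 0, 1, 1])
# path == [0, 0, 1, 1]: the decoder follows the symbols, as you would hope.
```

Working in log-probabilities, as here, avoids numerical underflow on long observation sequences, a point that matters in practice.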

Tutorial 3: Maximum likelihood re-estimation

The third key element shows us how to set the model's parameters. From some initial values, the parameters are updated by an Expectation Maximisation process, called Baum-Welch re-estimation. This will be presented first for discrete observations.
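One re-estimation step can be sketched directly from the forward-backward quantities. The model and observation sequence below are toy values chosen only to show the shape of the update equations, not material from the slides:

```python
# A single Baum-Welch re-estimation step for a discrete-output HMM.
# All parameter values are arbitrary illustrative numbers.
A  = [[0.7, 0.3], [0.4, 0.6]]
B  = [[0.9, 0.1], [0.2, 0.8]]
pi = [0.6, 0.4]

def baum_welch_step(obs):
    n, T = len(A), len(obs)
    # Forward pass: alpha[t][i] = P(o_1..o_t, state_t = i)
    alpha = [[pi[i] * B[i][obs[0]] for i in range(n)]]
    for t in range(1, T):
        alpha.append([sum(alpha[t-1][i] * A[i][j] for i in range(n)) * B[j][obs[t]]
                      for j in range(n)])
    # Backward pass: beta[t][i] = P(o_{t+1}..o_T | state_t = i)
    beta = [[1.0] * n for _ in range(T)]
    for t in range(T - 2, -1, -1):
        beta[t] = [sum(A[i][j] * B[j][obs[t+1]] * beta[t+1][j] for j in range(n))
                   for i in range(n)]
    like = sum(alpha[T-1][i] for i in range(n))   # P(obs | model)
    # Posterior state occupancies (gamma) and transition counts (xi).
    gamma = [[alpha[t][i] * beta[t][i] / like for i in range(n)] for t in range(T)]
    xi = [[[alpha[t][i] * A[i][j] * B[j][obs[t+1]] * beta[t+1][j] / like
            for j in range(n)] for i in range(n)] for t in range(T - 1)]
    # Re-estimated parameters: expected counts, normalised.
    new_pi = gamma[0]
    new_A = [[sum(xi[t][i][j] for t in range(T-1)) /
              sum(gamma[t][i] for t in range(T-1)) for j in range(n)]
             for i in range(n)]
    new_B = [[sum(gamma[t][i] for t in range(T) if obs[t] == k) /
              sum(gamma[t][i] for t in range(T)) for k in range(len(B[0]))]
             for i in range(n)]
    return new_pi, new_A, new_B, like
```

In training, this step is iterated until the likelihood stops improving; each iteration is guaranteed not to decrease it.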

Tutorial 4: Output probability distribution functions

The fourth session will present the corresponding results for continuous observations, beginning with a simple Gaussian probability distribution and building up towards multivariate Gaussian mixtures.
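The shape of a mixture output density is easy to see in code. Here is a one-dimensional Gaussian mixture, b(x) = sum_m c_m N(x; mu_m, sigma_m^2), with weights, means and variances that are arbitrary illustrative values:

```python
import math

def gaussian_pdf(x, mean, var):
    """Density of a 1-D Gaussian N(x; mean, var)."""
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def gmm_pdf(x, weights, means, variances):
    """Mixture density: a weighted sum of Gaussian components."""
    return sum(c * gaussian_pdf(x, m, v)
               for c, m, v in zip(weights, means, variances))

# A two-component mixture with made-up parameters.
p = gmm_pdf(0.5, [0.6, 0.4], [0.0, 2.0], [1.0, 0.5])
```

The multivariate case replaces each scalar variance with a covariance matrix, but the weighted-sum structure is unchanged.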

Tutorial 5: Extensions and applications

Having presented one particular technique, the hidden Markov model, I will discuss a number of extensions that have been developed to overcome one or other of its limitations. Although I'll use examples throughout the tutorials to illustrate how the algorithms work, the final session will show a couple of longer worked examples. Maybe you'll have tried out a little example yourself by then that you could share with the group!

Links, tips and sources of extra help

Matthew has recommended a book which has a worked example of an HMM in chapter 20:

R. Callan. Artificial intelligence. Basingstoke: Palgrave Macmillan, 2003. [ISBN 0333801369]
The library has four copies that are held at shelfmark 006.3/CAL.

If you have any other specific questions outside of the tutorials, you can email me and I'll do my best to respond promptly. I'd also be grateful for any handy links that I could add to this page, so send them to me.

© 2004, maintained by Philip Jackson, last updated on 2 June 2004.