
In Chap. 5, we described Markov chains in which each state corresponds to an observable physical event or object. However, this model is too restrictive to be applicable to many problems of interest. In this chapter, we present hidden Markov models, which extend Markov chains with additional modeling freedom while avoiding substantial complication to the basic structure of Markov chains. Hidden Markov models achieve this additional freedom by letting the states of the chain generate observable data while hiding the state sequence itself from the observer.
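To make this generative picture concrete, the following minimal sketch (our illustration, not code from the chapter) samples from a hypothetical two-state HMM; the transition matrix A, emission matrix B, and initial distribution pi are assumed values chosen only for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-state, two-symbol HMM (all values are illustrative).
A = np.array([[0.7, 0.3],    # state transition probabilities
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],    # state 0 mostly emits symbol 0
              [0.2, 0.8]])   # state 1 mostly emits symbol 1
pi = np.array([0.5, 0.5])    # initial state distribution

def sample(T):
    """Generate a hidden state sequence and its observable emissions."""
    states, obs = [], []
    s = rng.choice(2, p=pi)
    for _ in range(T):
        states.append(s)
        obs.append(rng.choice(2, p=B[s]))  # the observer sees only this
        s = rng.choice(2, p=A[s])          # the transition stays hidden
    return states, obs

states, obs = sample(10)
print("hidden: ", states)  # unknown to the observer
print("visible:", obs)
```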

We start the chapter with a sample problem that cannot be fully modeled by Markov chains. Through this example, we demonstrate why more modeling freedom is needed and how Markov chains are extended by hidden Markov models (HMMs). Following the introduction, we present the three basic problems for HMMs and describe their respective solutions. We also introduce the Expectation-Maximization (EM) algorithm and use it to derive the Baum-Welch algorithm. The EM algorithm is a very powerful, general method applicable to many training-based model estimation problems, while the Baum-Welch algorithm is a special case of the EM algorithm that is particularly useful for estimating the maximum likelihood parameters of HMMs. At the end of the chapter, we provide a case study applying HMMs to the task of baseball highlight detection from broadcast TV videos.
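As a taste of the first basic problem, evaluating the likelihood of an observation sequence under a given model, here is a sketch of the standard forward recursion; it is our illustration rather than code from the chapter, and it reuses the assumed two-state parameters from the sketch above.

```python
import numpy as np

def forward_likelihood(obs, A, B, pi):
    """Forward algorithm: compute P(obs | model) by summing over all
    hidden state paths in O(T * N^2) time instead of O(N^T)."""
    alpha = pi * B[:, obs[0]]          # initialize with the first observation
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # propagate states, weight by emission
    return alpha.sum()

# Illustrative parameters (assumed values, as in the sketch above):
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.9, 0.1], [0.2, 0.8]])
pi = np.array([0.5, 0.5])
print(forward_likelihood([0, 1, 0], A, B, pi))
```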

Keywords

State Sequence, Observation Sequence, Hidden Markov Model, Initial State Distribution


Copyright information

© Springer Science+Business Media, LLC 2007
