
Markov Chains and Hidden Markov Models

  • Chapter in Probability and Statistics for Computer Science

Abstract

There are many situations where one must work with sequences. Here is a simple, and classical, example. We see a sequence of words, but the last word is missing. I will use the sequence “I had a glass of red wine with my grilled xxxx”. What is the best guess for the missing word? You could obtain one possible answer by counting word frequencies, then replacing the missing word with the most common word. That word is “the”, which is not a particularly good guess because it doesn’t fit the previous word. Instead, you could find the most common pair of words beginning with “grilled”, and choose that pair’s second word. If you do this experiment (I used the Google Ngram viewer and searched for “grilled *”), you will find mostly quite sensible suggestions (I got “meats”, “meat”, “fish”, and “chicken”, in that order). If you want to produce random sequences of words, the next word should depend on some of the words you have already produced.


Copyright information

© 2018 Springer International Publishing AG

About this chapter

Cite this chapter

Forsyth, D. (2018). Markov Chains and Hidden Markov Models. In: Probability and Statistics for Computer Science. Springer, Cham. https://doi.org/10.1007/978-3-319-64410-3_14

  • DOI: https://doi.org/10.1007/978-3-319-64410-3_14

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-64409-7

  • Online ISBN: 978-3-319-64410-3

  • eBook Packages: Computer Science; Computer Science (R0)
