Abstract
In this chapter, I resolve two problems that you might not have noticed in the previous chapter. First, HMMs aren’t that natural for many sequences, because a model that represents (say) ink conditioned on (say) a letter is odd. Generative models like this must often do much more work than is required to solve a problem, and modelling the letter conditioned on the ink is usually much easier (this is why classifiers work). Second, in many applications you want a model that produces the right sequence of hidden states for a given sequence of observations, rather than one that maximizes likelihood.
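To make the second point concrete, here is a minimal sketch of discriminative sequence decoding in Python with NumPy. It is not the book's code, and the names and toy scores are invented: a per-step (unary) score plays the role of a classifier rating each letter given the ink directly, a transition matrix rates letter-to-letter moves, and Viterbi recovers the highest-scoring sequence of hidden states, which is the quantity we actually want right.

    # Minimal sketch (not the chapter's code): discriminative decoding.
    # unary[t, s] scores hidden state s at step t (e.g. a classifier's
    # score for a letter given the ink); transition[a, b] scores the
    # move a -> b. Viterbi returns the best-scoring state sequence.
    import numpy as np

    def viterbi(unary, transition):
        T, S = unary.shape
        score = np.empty((T, S))
        back = np.zeros((T, S), dtype=int)
        score[0] = unary[0]
        for t in range(1, T):
            # cand[a, b]: score of being in a at t-1 then moving to b
            cand = score[t - 1][:, None] + transition  # shape (S, S)
            back[t] = cand.argmax(axis=0)
            score[t] = cand.max(axis=0) + unary[t]
        # trace the argmax path back from the best final state
        path = [int(score[-1].argmax())]
        for t in range(T - 1, 0, -1):
            path.append(int(back[t, path[-1]]))
        return path[::-1]

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        unary = rng.normal(size=(6, 3))    # 6 ink slices, 3 letters
        transition = rng.normal(size=(3, 3))
        print(viterbi(unary, transition))  # e.g. [1, 0, 2, ...]

Decoding here is the same dynamic program an HMM uses; what changes under discriminative training, as this chapter develops, is that unary and transition are adjusted so that the correct sequence outscores every competing sequence, rather than to maximize the likelihood of the ink.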
Copyright information
© 2019 Springer Nature Switzerland AG
About this chapter
Cite this chapter
Forsyth, D. (2019). Learning Sequence Models Discriminatively. In: Applied Machine Learning. Springer, Cham. https://doi.org/10.1007/978-3-030-18114-7_14
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-18113-0
Online ISBN: 978-3-030-18114-7
eBook Packages: Computer Science, Computer Science (R0)