In the previous lecture, we discussed tagging problems, especially part-of-speech (POS) tagging, and introduced the classic model for this class of problems, the Hidden Markov Model (HMM). In this 13th lecture, we continue with POS tagging and HMMs and, after a short recap, introduce the classic dynamic programming algorithm for tagging with HMMs, the Viterbi algorithm. We also discuss how the algorithm can easily be modified to compute the probability of a sequence of words, i.e. how an HMM can be used as a language model. Finally, we give an intuition for how HMMs can be estimated in an unsupervised way (the forward-backward algorithm), though we will not describe it in detail.
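To make the dynamic programming idea concrete, here is a minimal sketch of the Viterbi algorithm in Python. All probability tables and the tiny example are illustrative toy numbers invented for this sketch, not from the lecture or the reading:

```python
# A minimal sketch of Viterbi decoding for an HMM tagger.
# All probabilities below are toy values for illustration only.

def viterbi(words, tags, start_p, trans_p, emit_p):
    """Return the most probable tag sequence for `words` and its probability."""
    # V[i][t]: probability of the best tag sequence for words[:i+1] ending in tag t
    V = [{}]
    back = [{}]
    for t in tags:
        V[0][t] = start_p[t] * emit_p[t].get(words[0], 0.0)
        back[0][t] = None
    for i in range(1, len(words)):
        V.append({})
        back.append({})
        for t in tags:
            # Best previous tag for reaching tag t at position i
            best_prev, best_p = max(
                ((s, V[i - 1][s] * trans_p[s][t]) for s in tags),
                key=lambda pair: pair[1],
            )
            V[i][t] = best_p * emit_p[t].get(words[i], 0.0)
            back[i][t] = best_prev
    # Follow back-pointers from the best final tag
    last = max(tags, key=lambda t: V[-1][t])
    path = [last]
    for i in range(len(words) - 1, 0, -1):
        path.append(back[i][path[-1]])
    return list(reversed(path)), V[-1][last]

# Toy example: tag "the dog" with determiner (DT) and noun (NN) tags
tags = ["DT", "NN"]
start_p = {"DT": 0.8, "NN": 0.2}
trans_p = {"DT": {"DT": 0.1, "NN": 0.9}, "NN": {"DT": 0.4, "NN": 0.6}}
emit_p = {"DT": {"the": 0.9, "dog": 0.0}, "NN": {"the": 0.0, "dog": 0.5}}

path, prob = viterbi("the dog".split(), tags, start_p, trans_p, emit_p)
print(path, prob)  # ['DT', 'NN'] 0.324  (= 0.8 * 0.9 * 0.9 * 0.5)
```

Note the connection to language modelling mentioned above: replacing the `max` over previous tags with a sum over previous tags turns this into the forward algorithm, which computes the total probability of the word sequence summed over all possible tag sequences.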
Please do the required reading and attempt the quiz.
If there is anything you don't understand, then please ask questions in the lecture or on Piazza.
Slides and required reading
Reading: Sections 6.1-6.4 in Jurafsky & Martin (2nd edition).
Quiz 13: PoS tagging / algorithms
These questions are designed to test your understanding of the above course content; doing this quiz does not contribute to your overall grade. Some questions require a text answer; you can ask your tutor for formative feedback on these, or ask on Piazza. Other questions are multiple choice or require a numeric answer, and you will get immediate feedback on these. Please don't attempt this quiz until you have watched the lecture and done the required reading.
You must be logged onto Learn to do this quiz.