This page introduces logistic regression (also known as the maximum entropy model). We begin by recalling the downsides of Naive Bayes and then develop a more powerful model. The slides in this folder describe the model, go into the details of the estimation procedure, and contrast Naive Bayes with logistic regression.
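As a small taster before the slides and reading, here is a minimal sketch of binary logistic regression trained by stochastic gradient ascent on the log-likelihood. This is not part of the course materials: the toy dataset, learning rate, and epoch count are all illustrative assumptions.

```python
import math

def sigmoid(z):
    # The logistic function: maps a real-valued score to a probability.
    return 1.0 / (1.0 + math.exp(-z))

def predict(w, b, x):
    # P(y = 1 | x) = sigma(w . x + b)
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

def train(xs, ys, lr=0.5, epochs=1000):
    # Stochastic gradient ascent on the conditional log-likelihood
    # (equivalently, gradient descent on the cross-entropy loss).
    w = [0.0] * len(xs[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            err = y - predict(w, b, x)  # gradient of the log-likelihood w.r.t. the logit
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Toy data: a single feature whose value correlates with the positive class.
xs = [[0.0], [1.0], [2.0], [3.0]]
ys = [0, 0, 1, 1]
w, b = train(xs, ys)
print(predict(w, b, [0.0]))  # low probability of class 1
print(predict(w, b, [3.0]))  # high probability of class 1
```

Unlike Naive Bayes, nothing here assumes the features are conditionally independent: the weights are fit jointly to make the conditional probabilities match the labels.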
Please do the required reading and attempt the quiz.
If there is anything you don't understand, please ask questions in the lecture or on Piazza.
Slides and required reading
Reading: Jurafsky and Martin, 3rd (online) edition, https://web.stanford.edu/~jurafsky/slp3/, Sections 5.1-5.7.
The quiz from lecture 9 also covers both lectures; if you have not done it yet, do it now.
Quiz 10: Logistic Regression
These questions are designed to test your understanding of the above course content; this quiz does not contribute to your overall grade. Some questions require a text answer, and you can ask your tutor, or post on Piazza, for formative feedback on these. The other questions are multiple choice or require a numeric answer, and you will get immediate feedback on those. Please don't attempt this quiz until you have worked through this lecture and the required reading.
You must be logged in to Learn to do this quiz.