Week 6: Welcome and checklist
Welcome to Week 6! This week is the second week of lectures about syntax and grammar in the context of natural language processing.
Overview of this week
This week, we focus on basic syntactic parsing and grammar use in NLP. We will first see how syntax is computationally modelled (for example, how do we represent all noun phrases in a sentence?), and then show an algorithm to "infer" parse trees given such models. This algorithm is related to the Viterbi algorithm and, like Viterbi, uses dynamic programming, here to infer a parse tree rather than a tag sequence.
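To give a flavour of the dynamic-programming idea before the lectures, here is a minimal sketch of a CKY-style recogniser. The toy grammar (in Chomsky Normal Form) and the sentence are illustrative examples, not taken from the lecture material; the chart-filling loop is the part to focus on.

```python
from itertools import product

# A toy grammar in Chomsky Normal Form (illustrative, not from the lectures).
# Binary rules: (left child, right child) -> set of parents;
# lexical rules: word -> set of pre-terminals.
binary_rules = {
    ("NP", "VP"): {"S"},
    ("Det", "N"): {"NP"},
    ("V", "NP"): {"VP"},
}
lexical_rules = {
    "the": {"Det"},
    "dog": {"N"},
    "cat": {"N"},
    "saw": {"V"},
}

def cky_recognise(words):
    """Return True if the toy CNF grammar above can derive `words` as an S."""
    n = len(words)
    # chart[i][j] holds every non-terminal that spans words[i:j]
    chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, word in enumerate(words):
        chart[i][i + 1] = set(lexical_rules.get(word, set()))
    # Fill longer spans from shorter ones -- the dynamic-programming step,
    # analogous to how Viterbi builds on shorter prefixes.
    for length in range(2, n + 1):
        for i in range(n - length + 1):
            j = i + length
            for k in range(i + 1, j):  # try every split point
                for left, right in product(chart[i][k], chart[k][j]):
                    chart[i][j] |= binary_rules.get((left, right), set())
    return "S" in chart[0][n]

print(cky_recognise("the dog saw the cat".split()))  # → True
```

This only answers the yes/no question of whether a parse exists; the lectures extend the same chart to recover the parse trees themselves and, in the probabilistic setting, to pick the most likely one.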
Lecture material
Slides:
- Lecture 1 (the basics of syntax) [pdf]
- Lecture 2 (parsing with CYK) [pdf]
- Lecture 3 (probabilistic parsing) [pdf]
Readings (from the online version of JM3):
- JM3 17.1-17.4
- JM3 17.6, 17.8
- JM3 Appendix C (C.1-C.3)
- Optional: JM3 17.7
Tutorial:
- You can download the tutorial here. The tutorial mostly focuses on hidden Markov models and POS tagging. The last question is a "challenge" question, and you might not have sufficient time to go over it in detail. Still, you should aim to understand the basic idea behind it.
Week 6 checklist:
- Throughout: Try out the quizzes on Gradescope, and prepare for each lecture by reading the materials above beforehand.
- Before your tutorial group meeting: Prepare for the meeting. If you did last week's reading, preparing should take around 30-60 minutes, so don't spend ages on it. But you will need to do the reading and consider the questions in advance!