Final Exam Study Guide | ENLP Spring 2019

This page lists concepts you should be familiar with and questions you should be able to answer if you are thoroughly familiar with the material in the course. It is safe to assume that if you have a good grasp of everything listed here and in the midterm study guide, you will do well on the exam. However, we cannot guarantee that only the topics mentioned here, and nothing else, will appear on the exam.

How to review

You should review the lecture slides, quizzes, and homework assignments. The readings should be helpful as well. If there are topics that you are not clear on from these resources, please ask on the discussion board or in office hours.

Assume the instructors will not be available to answer questions in the 48 hours preceding the exam.

Exam procedures

The exam will be completed without use of a laptop, calculator, or textbook/reference materials.

Scope of the final

Everything in the course is fair game. In addition to this study guide, it is therefore recommended that you review the midterm topics. The wrap-up slides from the last lecture summarize several major themes of the course.

Style of questions

The final will include a variety of question types. Be prepared for more short-answer questions than on the midterm and quizzes. These may be broadly worded to allow flexibility in which specific representations, models, or algorithms you use in your answer. Some parts of the exam may offer a choice of questions to answer.

Structured prediction algorithms

You should understand the Viterbi, CKY, and transition-based parsing algorithms well enough to simulate them by hand and discuss their asymptotic complexity. Recall that Viterbi is used for decoding in sequence taggers (the HMM and the structured perceptron), CKY is used for parsing with a CFG or PCFG, and transition-based parsing is most typically used for dependency parsing.
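To check your hand simulations of Viterbi, you can compare against a minimal decoder like the sketch below (the tag set and probability tables are invented for illustration; they are not from the course materials):

    import numpy as np

    def viterbi(obs, states, start_p, trans_p, emit_p):
        """Most probable state sequence under an HMM: O(n * K^2) time for K states."""
        n, K = len(obs), len(states)
        best = np.zeros((n, K))             # best[i, s]: max prob of a path ending in state s at position i
        back = np.zeros((n, K), dtype=int)  # backpointers to recover that path
        best[0] = start_p * emit_p[:, obs[0]]
        for i in range(1, n):
            for s in range(K):
                scores = best[i - 1] * trans_p[:, s] * emit_p[s, obs[i]]
                back[i, s] = np.argmax(scores)
                best[i, s] = scores[back[i, s]]
        path = [int(np.argmax(best[-1]))]   # best final state, then walk the backpointers
        for i in range(n - 1, 0, -1):
            path.append(int(back[i, path[-1]]))
        return [states[s] for s in reversed(path)]

    # Toy 2-tag example: word 0 ~ "the", word 1 ~ "dog" (all numbers invented).
    states = ["DET", "NOUN"]
    start = np.array([0.9, 0.1])
    trans = np.array([[0.1, 0.9],    # P(next tag | DET)
                      [0.5, 0.5]])   # P(next tag | NOUN)
    emit = np.array([[0.9, 0.1],     # P(word | DET)
                     [0.1, 0.9]])    # P(word | NOUN)
    print(viterbi([0, 1], states, start, trans, emit))  # ['DET', 'NOUN']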

(This year, we did not really talk about beam search or graph-based dependency parsing, so you will not be asked about these techniques.)

Annotation

You should be able to answer questions about the annotation concepts covered in lecture.

Grammars and syntax

We covered Hidden Markov Models (HMMs), Context-Free Grammars (CFGs), and Probabilistic Context-Free Grammars (PCFGs).

An HMM is a generative model over tagged words; a PCFG is a generative model over trees (nonterminals and terminals). As with the other generative models in this course (see midterm topics), you should be able to describe the independence assumptions and generative process, compute probabilities, etc.
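Concretely, for words w_1, ..., w_n tagged t_1, ..., t_n, the HMM joint probability factorizes as (writing t_0 for the start symbol; some formulations also multiply in a stop transition P(STOP | t_n)):

    P(w_{1:n}, t_{1:n}) = \prod_{i=1}^{n} P(t_i \mid t_{i-1}) \, P(w_i \mid t_i)

and a PCFG assigns a tree the product of the probabilities of the rules used in its derivation:

    P(\text{tree}) = \prod_{(A \to \beta) \in \text{tree}} P(A \to \beta \mid A)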

(You will not be probed extensively on the Chomsky Hierarchy, but you should be aware that CFGs are strictly more expressive than regular expressions, and computationally more expensive to parse with. Both are classes of formal grammars.)
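The classic illustration of that expressiveness gap is the language { a^n b^n : n >= 0 }, which the two-rule CFG

    S -> a S b
    S -> ε

generates, but which no regular expression can describe, because recognizing it requires matching an unbounded number of a's with b's.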

Semantic roles

Similarity and distributional representations

You should understand measures of word similarity and distributional (vector) representations of words; for example, you should be able to compute and compare the similarity of word vectors by hand.

(This year we did not cover Brown clustering, so this will not be on the test.)
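As a concrete check (assuming, as is standard for this topic, that cosine similarity between distributional vectors was among the measures covered), here is a minimal similarity computation:

    import numpy as np

    def cosine(u, v):
        """Cosine similarity: normalizes for vector length, so raw word frequency matters less."""
        return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

    # Toy co-occurrence count rows for two words (numbers invented for illustration).
    cat = np.array([5.0, 1.0, 0.0])
    dog = np.array([4.0, 2.0, 1.0])
    print(cosine(cat, dog))  # ~0.94: high, as expected for distributionally similar words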

Neural networks

You should be familiar with the neural network models and concepts covered in lecture.

You will not be asked about neural network techniques that were not covered in lecture or the readings.
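If it helps to ground the concepts, here is a minimal sketch of the forward pass of a one-hidden-layer feedforward classifier (the layer sizes and the tanh nonlinearity are illustrative assumptions, not necessarily the architecture from lecture):

    import numpy as np

    def softmax(z):
        z = z - z.max()              # subtract max for numerical stability
        e = np.exp(z)
        return e / e.sum()

    def forward(x, W1, b1, W2, b2):
        """Input -> hidden layer with tanh -> softmax distribution over classes."""
        h = np.tanh(W1 @ x + b1)     # learned hidden representation
        return softmax(W2 @ h + b2)  # class probabilities (sum to 1)

    rng = np.random.default_rng(0)
    x = rng.normal(size=4)                         # e.g., a small feature/embedding vector
    W1, b1 = rng.normal(size=(8, 4)), np.zeros(8)
    W2, b2 = rng.normal(size=(3, 8)), np.zeros(3)
    print(forward(x, W1, b1, W2, b2))              # 3 class probabilities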

Applications and other topics

Other formulas

In addition to the models and formulas discussed above, you should know the formulas for the following concepts and what they may be used for, and you should be able to apply them appropriately. Where relevant, you should be able to discuss the strengths and weaknesses of the associated method, as well as alternatives.
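For instance, if pointwise mutual information (PMI) is among the formulas on your list (an assumption here; check your notes for the actual set), recall PMI(x, y) = log [ P(x, y) / ( P(x) P(y) ) ], and a quick sanity check of a hand computation might look like:

    import math

    def pmi(p_xy, p_x, p_y):
        """log2 ratio of observed to chance co-occurrence. Weakness: unstable for
        rare events and undefined when P(x, y) = 0, motivating variants like PPMI."""
        return math.log2(p_xy / (p_x * p_y))

    print(pmi(0.01, 0.1, 0.05))  # 1.0: the pair co-occurs twice as often as chance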