CSE 490 U - Natural Language Processing

Instructor: Yejin Choi (yejin at cs dot washington dot edu). Office hours: Wed 4:30pm-5:30pm at CSE 578 (and by appointment)
TA: Luheng He (luheng at cs dot washington dot edu). Office hours: Tue 5pm-5:45pm at CSE 218
TA: Maarten Sap (msap at cs dot washington dot edu). Office hours: Thu 2pm-2:45pm at CSE 218

Week 1 (Mar 28, 30, Apr 1)
Topics: I. Introduction [Slides]; II. Words: Language Models (LMs) [Slides]
Notes (required): LM
Reading: J&M 4.1-4; M&S 6
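The Week 1 language-modeling unit comes down to estimating n-gram probabilities from counts. A minimal maximum-likelihood bigram sketch (the function names and toy corpus are illustrative, not from the course materials):

```python
from collections import Counter

def train_bigram_lm(sentences):
    """Count unigrams and bigrams with start/end markers (MLE, no smoothing)."""
    unigrams, bigrams = Counter(), Counter()
    for sent in sentences:
        toks = ["<s>"] + sent + ["</s>"]
        unigrams.update(toks)
        bigrams.update(zip(toks, toks[1:]))
    return unigrams, bigrams

def bigram_prob(unigrams, bigrams, prev, word):
    """P(word | prev) = count(prev, word) / count(prev)."""
    if unigrams[prev] == 0:
        return 0.0
    return bigrams[(prev, word)] / unigrams[prev]

corpus = [["the", "cat", "sat"], ["the", "dog", "sat"]]
uni, bi = train_bigram_lm(corpus)
print(bigram_prob(uni, bi, "the", "cat"))  # 0.5
```

Unsmoothed MLE assigns zero probability to any unseen bigram, which motivates the Week 2 smoothing material.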

Week 2 (Apr 4, 6, 8)
Topics: II. Words: Language Models (LMs), Smoothing [Slides]; III. Sequences: Hidden Markov Models (HMMs) [Slides]
Notes (required): HMM
Reading: J&M 4.5-7; M&S 6
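Week 2's smoothing topic can be sketched with add-k (Laplace) estimation, the simplest fix for zero counts; the numbers below are a made-up illustration, not course data:

```python
def laplace_prob(bigram_count, prev_count, vocab_size, k=1.0):
    """Add-k smoothed bigram estimate:
    P(w | prev) = (count(prev, w) + k) / (count(prev) + k * V)."""
    return (bigram_count + k) / (prev_count + k * vocab_size)

# Seen bigram: count(prev, w) = 1, count(prev) = 2, vocabulary size V = 5
print(laplace_prob(1, 2, 5))  # 2/7 ≈ 0.286
# Unseen bigram now gets nonzero mass instead of probability 0:
print(laplace_prob(0, 2, 5))  # 1/7 ≈ 0.143
```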

Week 3 (Apr 11, 13, 15)
Topics: III. Sequences: Hidden Markov Models (HMMs) [Slides]; III. Sequences: Part-of-Speech Tagging (skipped) [Slides]
Notes (required): Forward-backward
Reading: J&M 5.1-5.3, 6.1-6.5; M&S 9, 10.1-10.3
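The forward-backward notes for Week 3 build on the forward algorithm, which sums over all hidden state paths to score an observation sequence. A sketch with a hypothetical two-state weather HMM (all probabilities below are invented for illustration):

```python
def forward(obs, states, start_p, trans_p, emit_p):
    """Forward algorithm: total probability of the observation sequence
    under an HMM, summing over all hidden state paths."""
    alpha = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    for t in range(1, len(obs)):
        alpha.append({
            s: emit_p[s][obs[t]] * sum(alpha[-1][r] * trans_p[r][s] for r in states)
            for s in states
        })
    return sum(alpha[-1].values())

# Toy HMM (hypothetical numbers): hidden weather, observed activities
states = ["Rainy", "Sunny"]
start_p = {"Rainy": 0.6, "Sunny": 0.4}
trans_p = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
           "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
emit_p = {"Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
          "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1}}
print(forward(["walk", "shop"], states, start_p, trans_p, emit_p))  # ≈ 0.1038
```

The backward pass is the mirror image (recursing right to left); combining the two gives the posterior state probabilities used in EM training.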

Week 4 (Apr 18, 20, 22)
Topics: IV. Trees: Probabilistic Context-Free Grammars (PCFGs) [Slides]
Notes (required): PCFG
Reading: J&M 13-14; M&S 11-12
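The standard decoding algorithm for the Week 4 PCFG material is probabilistic CKY, which fills a chart with the best parse probability for each span and nonterminal. A sketch over a tiny hypothetical grammar in Chomsky normal form (the grammar and sentence are invented for illustration):

```python
def pcky(words, grammar, lexicon):
    """Probabilistic CKY: best parse probability per (span, nonterminal).
    grammar: {(B, C): [(A, prob), ...]} for binary rules A -> B C (CNF)
    lexicon: {word: [(A, prob), ...]} for terminal rules A -> word
    """
    n = len(words)
    chart = [[{} for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        for A, p in lexicon.get(w, []):
            chart[i][i + 1][A] = max(chart[i][i + 1].get(A, 0.0), p)
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):          # split point
                for B, pb in chart[i][k].items():
                    for C, pc in chart[k][j].items():
                        for A, pr in grammar.get((B, C), []):
                            cand = pr * pb * pc
                            if cand > chart[i][j].get(A, 0.0):
                                chart[i][j][A] = cand
    return chart[0][n]

# Toy CNF grammar: S -> NP VP (1.0), VP -> V NP (0.8)
grammar = {("NP", "VP"): [("S", 1.0)], ("V", "NP"): [("VP", 0.8)]}
lexicon = {"she": [("NP", 1.0)], "eats": [("V", 1.0)], "fish": [("NP", 0.5)]}
print(pcky(["she", "eats", "fish"], grammar, lexicon))  # {'S': 0.4}
```

Replacing `max` with a sum over split points turns this into the inside algorithm from the Week 5 notes.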

Week 5 (Apr 25, 27, 29)
Topics: IV. Trees: PCFG Grammar Refinement [Slides]
Notes (required): Lexicalized PCFG, Inside-outside
Reading: J&M 13-14; M&S 11-12

Week 6 (May 2, 4, 6)
Topics: IV. Trees: Dependency Grammars and Mildly Context-Sensitive Grammars [Slides]
Notes (required): Edmonds-Chu-Liu

Week 7 (May 9, 11, 13)
Topics: V. Semantics: Frame Semantics [Slides]; V. Semantics: Distributed Semantics, Embeddings [Slides]
Notes (required): J&Mv3 Vector Semantics, Dense Vectors, Frame Semantics
Reading: J&M 19.4; J&M 20.7
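The embeddings material in Week 7 rests on comparing dense word vectors by cosine similarity. A minimal sketch (the 3-dimensional vectors below are invented for illustration; real embeddings have hundreds of dimensions):

```python
import math

def cosine(u, v):
    """Cosine similarity between two dense word vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Hypothetical toy embeddings, for illustration only
cat = [0.7, 0.2, 0.1]
dog = [0.6, 0.3, 0.1]
car = [0.1, 0.1, 0.9]
print(cosine(cat, dog) > cosine(cat, car))  # True: cat is closer to dog
```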

Week 8 (May 16, 18, 20)
Topics: VI. Learning: Log-Linear Models, Conditional Random Fields (CRFs) [Slides]
Notes (required): Log-linear, MEMMs, CRFs
Reading: J&M 6.6-6.8
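The log-linear (maximum-entropy) models from Week 8 score each label by exponentiating a weighted feature sum and normalizing. A minimal local-classifier sketch (the feature template and weights are hypothetical; MEMMs and CRFs extend this to sequences):

```python
import math

def loglinear_prob(x, labels, weights, feats):
    """P(y | x) = exp(w . f(x, y)) / sum over y' of exp(w . f(x, y'))."""
    scores = {y: math.exp(sum(weights.get(f, 0.0) for f in feats(x, y)))
              for y in labels}
    z = sum(scores.values())  # partition function
    return {y: s / z for y, s in scores.items()}

# Hypothetical POS-style indicator features: word identity conjoined with tag
feats = lambda x, y: [f"word={x}^tag={y}"]
weights = {"word=run^tag=VERB": 1.0, "word=run^tag=NOUN": 0.0}
p = loglinear_prob("run", ["VERB", "NOUN"], weights, feats)
print(p["VERB"] > p["NOUN"])  # True
```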

Week 9 (May 23, 25, 27)
Topics: VI. Learning: Deep Learning [Slides]
Reading: Russell & Norvig Ch 18.7 ANNs; Bishop Ch 5 ANNs

Week 10 ((May 30), Jun 1, Jun 3)
Topics: VII. Translation: Alignment Models & Phrase-based MT [Slides]
Notes (required): IBM Models 1 and 2, Phrase MT, EM
Reading: J&M 25; M&S 13
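The Week 10 notes tie alignment models to EM: IBM Model 1 learns word-translation probabilities t(f|e) by alternating expected alignment counts with renormalization. A compact sketch on a classic-style two-sentence toy corpus (the sentence pairs are illustrative, not course data):

```python
from collections import defaultdict

def ibm_model1(pairs, iterations=10):
    """EM for IBM Model 1 translation probabilities t(f|e).
    pairs: list of (source_tokens, target_tokens) sentence pairs."""
    f_vocab = {f for fs, _ in pairs for f in fs}
    uniform = 1.0 / len(f_vocab)
    t = defaultdict(lambda: uniform)  # uniform initialization
    for _ in range(iterations):
        count = defaultdict(float)  # expected counts c(f, e)
        total = defaultdict(float)  # expected counts c(e)
        for fs, es in pairs:
            for f in fs:
                z = sum(t[(f, e)] for e in es)  # normalize over alignments
                for e in es:
                    c = t[(f, e)] / z           # E-step: expected count
                    count[(f, e)] += c
                    total[e] += c
        for (f, e) in count:
            t[(f, e)] = count[(f, e)] / total[e]  # M-step: renormalize
    return t

pairs = [(["la", "maison"], ["the", "house"]),
         (["la", "fleur"], ["the", "flower"])]
t = ibm_model1(pairs)
print(t[("la", "the")] > t[("maison", "the")])  # True: "la" pairs with "the"
```

Because "la" co-occurs with "the" in both pairs, EM concentrates t(la|the) while the competing hypotheses split their mass; this is the "pigeonhole" behavior the lectures exploit to bootstrap alignments without supervision.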

Department of Computer Science & Engineering, University of Washington, Box 352350, Seattle, WA 98195-2350. (206) 543-1695 voice, (206) 543-2969 FAX