Friday, May 27, 2016

Lecture 12: statistical machine translation

Introduction to Machine Translation. Rule-based vs. statistical MT. Statistical MT: the noisy channel model. The language model and the translation model. The phrase-based translation model. Learning the model from training data. Phrase-translation tables. Parallel corpora. Extracting phrases from word alignments. Word alignments.
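
For reference, the noisy channel formulation behind this lecture, in standard notation (f the source sentence, e the target sentence):

\hat{e} = \arg\max_e P(e \mid f) = \arg\max_e P(f \mid e) \, P(e)

where P(e) is the language model and P(f \mid e) is the translation model.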

IBM models for word alignment. Many-to-one and many-to-many alignments. IBM model 1 and the HMM alignment model. Training the alignment models: the Expectation Maximization (EM) algorithm. Symmetrizing alignments for phrase-based MT: symmetrizing by intersection; the growing heuristic. Calculating the phrase translation table. Decoding: stack decoding. Evaluation of MT systems. BLEU.
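
A minimal sketch of EM training for IBM Model 1, the simplest of the alignment models above. The toy corpus and variable names are my own illustration, not from the lecture, and the NULL word is omitted for brevity:

from collections import defaultdict

# Toy parallel corpus: (foreign sentence, English sentence) pairs.
corpus = [("das Haus".split(), "the house".split()),
          ("das Buch".split(), "the book".split()),
          ("ein Buch".split(), "a book".split())]

e_vocab = {e for _, es in corpus for e in es}
t = defaultdict(lambda: 1.0 / len(e_vocab))   # t(f|e), uniform init

for _ in range(10):
    count = defaultdict(float)                # expected counts c(f, e)
    total = defaultdict(float)                # expected counts c(e)
    # E-step: collect fractional alignment counts from all sentence pairs
    for fs, es in corpus:
        for f in fs:
            z = sum(t[(f, e)] for e in es)    # normalization for word f
            for e in es:
                c = t[(f, e)] / z
                count[(f, e)] += c
                total[e] += c
    # M-step: re-estimate the translation probabilities
    for (f, e), c in count.items():
        t[(f, e)] = c / total[e]

print(sorted(((p, f, e) for (f, e), p in t.items()), reverse=True)[:4])

On this toy corpus the learned table should concentrate probability on pairs like ("Buch", "book") and ("das", "the").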

Saturday, May 21, 2016

Lecture 11: semantic parsing (2), AMR, research in Rome

Unsupervised semantic parsing, semi-supervised semantic parsing, Abstract Meaning Representation (AMR). NLP research in Rome.
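
As a concrete illustration of AMR, the canonical example from the AMR literature (Banarescu et al., 2013) for "The boy wants to go", in PENMAN notation:

(w / want-01
   :ARG0 (b / boy)
   :ARG1 (g / go-01
            :ARG0 b))

The variable b fills the ARG0 role of both want-01 and go-01, which is why the representation is a graph rather than a tree.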

Friday, May 13, 2016

Lecture 10: semantic role labeling and semantic parsing

PropBank, FrameNet, semantic role labeling. Introduction to semantic parsing. Presentation of the projects.
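
To make PropBank-style role labeling concrete, a small illustrative example (my own sentence, standard PropBank labels for break.01):

[ARG0 John] [V broke] [ARG1 the window] [ARGM-TMP yesterday]

Here ARG0 is the breaker, ARG1 the thing broken, and ARGM-TMP a temporal modifier.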

Friday, May 6, 2016

Lecture 9: Neural Networks, word embeddings and deep learning

Motivation. The perceptron. Input encoding, sum and activation functions; objective function. Linearity of the perceptron. Neural networks. Training. Backpropagation. Connection to Maximum Entropy. Connection to language. Vector representations. NN for the bigram language model. Word2vec: CBOW and skip-gram. Word embeddings. Deep learning. Language modeling with NN. The big picture.
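
A minimal NumPy sketch of the perceptron and its update rule, as discussed in the lecture; the toy AND task and epoch count are my own assumptions:

import numpy as np

# Toy task: learn AND. The first feature is a constant 1 acting as the bias.
X = np.array([[1, 0, 0], [1, 0, 1], [1, 1, 0], [1, 1, 1]])
y = np.array([0, 0, 0, 1])

w = np.zeros(3)                            # weights (bias weight first)
for _ in range(10):                        # epochs
    for x_i, y_i in zip(X, y):
        y_hat = 1 if w @ x_i > 0 else 0    # step activation
        w += (y_i - y_hat) * x_i           # perceptron update rule

print(w, [1 if w @ x > 0 else 0 for x in X])   # learned weights, predictions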

Tuesday, May 3, 2016

Lecture 8: Entity Linking

Entity Linking. Main approaches. AIDA, TagMe, Wikifier, DBpedia Spotlight, Babelfy. The MASC annotated corpus. Demo of WSD and Entity Linking systems. Introduction to Neural Networks.
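
A hedged sketch of one basic ingredient of several of these systems, e.g. TagMe: the "commonness" prior P(page | mention), estimated from Wikipedia anchor-text counts. The counts below are invented placeholders:

from collections import Counter

# Invented anchor statistics: how often a mention string links to each page.
anchors = {"java": Counter({"Java_(programming_language)": 8200,
                            "Java_(island)": 900,
                            "Java_(coffee)": 150})}

def commonness(mention, page):
    """P(page | mention), estimated from anchor-text counts."""
    counts = anchors[mention.lower()]
    return counts[page] / sum(counts.values())

print(commonness("Java", "Java_(programming_language)"))  # ~0.89

Real systems combine this prior with context similarity and graph-based coherence among candidate entities, which is where approaches such as AIDA and Babelfy differ.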