Saturday, May 27, 2017

Lecture 13: semantic parsing; statistical and neural machine translation

Semantic parsing: supervised, unsupervised, semi-supervised. FrameNet. Abstract Meaning Representations.

Introduction to Machine Translation. Rule-based vs. Statistical MT. Statistical MT: the noisy channel model. The language model and the translation model. The phrase-based translation model. Learning the translation model from training data. Phrase-translation tables. Parallel corpora. Extracting phrases from word alignments. Word alignments. IBM models for word alignment. Many-to-one and many-to-many alignments. IBM model 1 and the HMM alignment model. Training the alignment models: the Expectation Maximization (EM) algorithm. Symmetrizing alignments for phrase-based MT: symmetrizing by intersection; the growing heuristic. Calculating the phrase translation table. Decoding: stack decoding. Evaluation of MT systems. BLEU. Neural MT: the encoder-decoder architecture; advantages; results.
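As a concrete illustration of EM training for word alignment, here is a minimal sketch of IBM Model 1 in Python (toy code for the lecture's idea, not a production aligner; the function name and data layout are our own):

```python
from collections import defaultdict

def ibm_model1(corpus, iterations=10):
    """EM training of IBM Model 1 translation probabilities t(f|e).

    corpus: list of (foreign_tokens, english_tokens) sentence pairs.
    """
    e_vocab = {e for _, es in corpus for e in es}
    # uniform initialization of t(f|e)
    t = defaultdict(lambda: 1.0 / len(e_vocab))
    for _ in range(iterations):
        count = defaultdict(float)   # expected counts c(f, e)
        total = defaultdict(float)   # expected counts c(e)
        # E-step: collect expected counts from the current model
        for fs, es in corpus:
            for f in fs:
                z = sum(t[(f, e)] for e in es)  # normalization over the English sentence
                for e in es:
                    c = t[(f, e)] / z
                    count[(f, e)] += c
                    total[e] += c
        # M-step: re-estimate t(f|e) from the expected counts
        for (f, e) in count:
            t[(f, e)] = count[(f, e)] / total[e]
    return t
```

On the classic toy corpus {das haus / the house, das buch / the book, ein buch / a book}, a few EM iterations already concentrate probability mass on the correct pairs (das/the, buch/book).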


Friday, May 19, 2017

Lecture 12: Entity linking; semantic similarity; sense embeddings; semantic parsing; project presentation

Entity Linking. Main approaches. AIDA, TagMe, Wikifier, DBpedia Spotlight, Babelfy. The MASC annotated corpus. Semantic similarity. Sense embeddings. Project presentation.

Friday, May 12, 2017

Lecture 11: Word Sense Disambiguation

Introduction to Word Sense Disambiguation (WSD). Motivation. The typical WSD framework. Lexical sample vs. all-words. WSD viewed as lexical substitution and cross-lingual lexical substitution. Knowledge resources. Representation of context: flat and structured representations. Main approaches to WSD: Supervised, unsupervised and knowledge-based WSD. Two important dimensions: supervision and knowledge. Supervised Word Sense Disambiguation: pros and cons. Vector representation of context. Main supervised disambiguation paradigms: decision trees, neural networks, instance-based learning, Support Vector Machines, IMS with embeddings, neural approaches to WSD. Unsupervised Word Sense Disambiguation: Word Sense Induction. Context-based clustering. Co-occurrence graphs: curvature clustering, HyperLex. Knowledge-based Word Sense Disambiguation. The Lesk and Extended Lesk algorithm. Structural approaches: similarity measures and graph algorithms. Conceptual density. Structural Semantic Interconnections. Evaluation: precision, recall, F1, accuracy. Baselines. Entity Linking.
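To make the knowledge-based approach concrete, the simplified Lesk algorithm from the lecture can be sketched as follows (a toy version: the sense inventory is a plain dictionary of glosses, and overlap is raw word intersection with no stopword removal):

```python
def simplified_lesk(context, sense_glosses):
    """Pick the sense whose gloss overlaps most with the context.

    context: a string containing the target word's sentence.
    sense_glosses: dict mapping sense id -> gloss string (hypothetical inventory).
    """
    context_words = set(context.lower().split())
    best_sense, best_overlap = None, -1
    for sense, gloss in sense_glosses.items():
        overlap = len(context_words & set(gloss.lower().split()))
        if overlap > best_overlap:
            best_sense, best_overlap = sense, overlap
    return best_sense
```

For instance, given glosses for the two senses of "bank", the sentence "the river flowed past the bank" shares more words with the riverside gloss than with the financial one.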

Sunday, May 7, 2017

Lecture 10: computational semantics (2/2)

Encoding word senses: paper dictionaries, thesauri, machine-readable dictionaries, computational lexicons. WordNet. Wordnets in other languages. Problems of wordnets. BabelNet. Presentation of the third homework: question-answer pair extraction.



Friday, April 28, 2017

Lecture 9: syntactic parsing (2/2); intro to computational semantics (1/2)

The Earley algorithm. Probabilistic CFGs (PCFGs). PCFGs for disambiguation: the probabilistic CKY algorithm. PCFGs for language modeling. Introduction to computational semantics. Syntax-driven semantic analysis. Semantic attachments. First-Order Logic. Lambda notation and lambda calculus for semantic representation. Lexicon, lemmas and word forms. Word senses: monosemy vs. polysemy. Special kinds of polysemy. Computational sense representations: enumeration vs. generation. Graded word sense assignment.
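The probabilistic CKY algorithm discussed in the lecture can be sketched in a few lines of Python (a minimal recognizer for grammars in Chomsky Normal Form; the data layout for rules is our own):

```python
from collections import defaultdict

def pcky(words, grammar, lexicon, start="S"):
    """Probability of the best parse of `words` under a PCFG in CNF.

    grammar: dict (B, C) -> list of (A, prob) for binary rules A -> B C.
    lexicon: dict word -> list of (A, prob) for lexical rules A -> word.
    """
    n = len(words)
    # table[i][j] maps a nonterminal to the best probability over span i..j
    table = [[defaultdict(float) for _ in range(n + 1)] for _ in range(n + 1)]
    for j in range(1, n + 1):
        # fill the diagonal with lexical rules
        for A, p in lexicon.get(words[j - 1], []):
            table[j - 1][j][A] = p
        # combine smaller spans bottom-up
        for i in range(j - 2, -1, -1):
            for k in range(i + 1, j):
                for B, pb in table[i][k].items():
                    for C, pc in table[k][j].items():
                        for A, p in grammar.get((B, C), []):
                            cand = p * pb * pc
                            if cand > table[i][j][A]:
                                table[i][j][A] = cand
    return table[0][n].get(start, 0.0)
```

With a tiny grammar (S -> NP VP, VP -> V NP) the function returns the probability of the most likely parse, or 0.0 if the sentence is not in the language of the grammar.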

Friday, April 21, 2017

Lecture 8: syntactic parsing

Introduction to syntax. Context-free grammars and languages. Treebanks. Normal forms. Dependency grammars. Syntactic parsing: top-down and bottom-up. Structural ambiguity. Backtracking vs. dynamic programming for parsing. The CKY algorithm. Neural transition-based dependency parsing.
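The transition-based dependency parsing mentioned at the end of the lecture builds a tree with shift/reduce actions over a stack and a buffer. A minimal arc-standard sketch (here driven by a given action sequence rather than a learned neural classifier, which is what the lecture's models predict):

```python
def arc_standard(words, actions):
    """Run arc-standard transitions and return dependency arcs (head, dependent).

    actions: sequence of 'SH' (shift), 'LA' (left-arc), 'RA' (right-arc);
    in a real parser these would be predicted by a classifier.
    """
    stack, buffer, arcs = [], list(range(len(words))), []
    for a in actions:
        if a == "SH":
            stack.append(buffer.pop(0))        # move next word onto the stack
        elif a == "LA":
            dep = stack.pop(-2)                # second-top becomes dependent of top
            arcs.append((stack[-1], dep))
        elif a == "RA":
            dep = stack.pop()                  # top becomes dependent of second-top
            arcs.append((stack[-1], dep))
    return arcs
```

For "she ate fish", the sequence SH SH LA SH RA yields the arcs ate->she and ate->fish.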

Friday, April 7, 2017

Lecture 7: part-of-speech tagging

Introduction to part-of-speech (POS) tagging. POS tagsets: the Penn Treebank tagset and the Google Universal Tagset. Rule-based POS tagging. Stochastic part-of-speech tagging. Hidden Markov Models. Deleted interpolation. Linear and logistic regression: Maximum Entropy models. Transformation-based POS tagging. Handling out-of-vocabulary words. The Stanford POS tagger. Neural POS tagging with bidirectional LSTMs. Presentation of homework 2.
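Decoding an HMM POS tagger boils down to the Viterbi algorithm. A minimal sketch (toy probabilities, no log-space or smoothing, unlike a real tagger):

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely state (tag) sequence for the observations (words)."""
    # V[t][s] = (best probability of reaching s at time t, best previous state)
    V = [{s: (start_p[s] * emit_p[s].get(obs[0], 0.0), None) for s in states}]
    for t in range(1, len(obs)):
        V.append({})
        for s in states:
            prev = max(states, key=lambda p: V[t - 1][p][0] * trans_p[p][s])
            V[t][s] = (V[t - 1][prev][0] * trans_p[prev][s] * emit_p[s].get(obs[t], 0.0), prev)
    # follow back-pointers from the best final state
    last = max(states, key=lambda s: V[-1][s][0])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(V[t][path[-1]][1])
    return list(reversed(path))
```

With a two-tag toy model, "the dog" comes out as DT NN, as expected.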


Friday, March 31, 2017

Lecture 6: deep learning; intro to part of speech tagging

Recurrent Neural Networks and Long Short-Term Memory (LSTM) networks. Practical session on character-based LSTMs with Keras. Introduction to part-of-speech tagging.
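The core recurrence behind the RNNs covered in the lecture fits in a few lines of NumPy (a vanilla RNN forward pass, much simpler than the LSTM gates used in the Keras session; all names are our own):

```python
import numpy as np

def rnn_forward(xs, Wxh, Whh, Why, bh, by):
    """Forward pass of a vanilla RNN over a sequence of input vectors xs."""
    h = np.zeros(Whh.shape[0])
    ys = []
    for x in xs:
        h = np.tanh(Wxh @ x + Whh @ h + bh)  # new hidden state from input and previous state
        ys.append(Why @ h + by)              # output at this time step
    return ys, h
```

An LSTM replaces the single tanh update with gated updates of a separate cell state, which mitigates vanishing gradients over long sequences.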


Monday, March 27, 2017

Lecture 5: practical session on Keras; more on NNs for NLP; word embeddings

Practical session on Keras. More on NNs for NLP: hierarchical softmax; negative sampling. Vector representations. Word2vec. Word embeddings and their properties.
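One SGD step of skip-gram with negative sampling, as discussed for word2vec, can be sketched directly in NumPy (toy code with our own names; real implementations batch this and sample negatives from a unigram distribution):

```python
import numpy as np

def sgns_step(center, context, negatives, W_in, W_out, lr=0.025):
    """One skip-gram negative-sampling update for a (center, context) pair.

    center, context, negatives: integer word indices into the embedding matrices.
    """
    v = W_in[center].copy()
    grad_v = np.zeros_like(v)
    # the true context word gets label 1, sampled negatives get label 0
    for word, label in [(context, 1.0)] + [(n, 0.0) for n in negatives]:
        u = W_out[word]
        score = 1.0 / (1.0 + np.exp(-np.dot(u, v)))  # sigmoid of the dot product
        g = score - label                            # gradient of the logistic loss
        grad_v += g * u
        W_out[word] -= lr * g * v
    W_in[center] -= lr * grad_v
    return W_in, W_out
```

Each update pulls the center vector toward the true context vector and pushes it away from the negative samples, which is what gives word2vec embeddings their similarity structure.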


Friday, March 17, 2017

Lecture 4: language modeling (2); neural networks and NLP

We discussed perplexity and its close relationship with entropy, and introduced smoothing and interpolation techniques to deal with the issue of data sparsity. Practical session on language modeling with Python and the Berkeley LM toolkit.
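Smoothing and perplexity fit together in a few lines: here is a toy add-one (Laplace) smoothed bigram model with perplexity evaluation (a sketch in plain Python, far simpler than the Berkeley LM toolkit used in the session):

```python
import math
from collections import Counter

def bigram_perplexity(train_tokens, test_tokens):
    """Perplexity of an add-one smoothed bigram model on test data."""
    unigrams = Counter(train_tokens)
    bigrams = Counter(zip(train_tokens, train_tokens[1:]))
    vocab_size = len(unigrams)
    log_prob, n = 0.0, 0
    for w1, w2 in zip(test_tokens, test_tokens[1:]):
        # add-one smoothing: every bigram gets a pseudo-count of 1
        p = (bigrams[(w1, w2)] + 1) / (unigrams[w1] + vocab_size)
        log_prob += math.log2(p)
        n += 1
    return 2 ** (-log_prob / n)  # perplexity = 2^(cross-entropy)
```

Lower perplexity means the model finds the test text less surprising; the smoothing guarantees every bigram, seen or unseen, gets nonzero probability.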

Friday, March 10, 2017

Lecture 3: morphological analysis: practical session; homework 1; language modeling (1)

We had a practical session on morphological analysis in Python and Java. We reviewed basic probability concepts and introduced N-gram models (unigrams, bigrams, trigrams), together with their probability modeling and issues.
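The N-gram extraction underlying these models is a one-liner in Python (our own helper, shown only to fix the idea):

```python
def ngrams(tokens, n):
    """All n-grams of a token list, as tuples (e.g. bigrams for n=2)."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]
```

Counting these tuples over a corpus gives the raw frequencies from which N-gram probabilities are estimated.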

We also discussed homework 1 (see post on the class group).

Friday, March 3, 2017

Lecture 2: intro (2); morphological analysis

We introduced words and morphemes. Before delving into morphology and morphological analysis, we introduced regular expressions as a powerful tool to deal with different forms of a word. We then introduced recent work on morphological analysis based on machine learning: unsupervised (Morfessor) and supervised (based on CRFs).
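As a small illustration of regular expressions applied to word forms, a single pattern can match several inflected forms of a verb (a deliberately simple example pattern of our own; real morphological analyzers go far beyond this):

```python
import re

# matches "walk", "walks", "walked" and "walking" as whole words
walk_re = re.compile(r"\bwalk(s|ed|ing)?\b")

print(bool(walk_re.search("She was walking the dog")))
```

The `\b` word boundaries keep the pattern from matching inside longer words such as "walkway".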


Saturday, February 25, 2017

Lecture 1: Introduction to NLP

We gave an introduction to the course and the field it is focused on, i.e., Natural Language Processing, with a focus on the Turing Test as a tool to understand whether "machines can think". We also discussed the pitfalls of the test, including Searle's Chinese Room argument.


Thursday, January 19, 2017

Ready, steady, go!

Welcome to the Sapienza NLP course blog! This year there will be important changes: first, projects will be lightweight for attending students; second, homework assignments will be part of the final project (in this respect, attending students will complete more than 50% of their project before the end of the course); third, the class will be updated on the newest trends in neural networks; fourth, this year the (class) project will be... the development of an intelligent chatbot working on Telegram!
IMPORTANT: The 2017 classes will be held on Fridays, 2:30pm-5:45pm. Please sign up for the NLP class!