Course Description

This course provides an introduction to foundational methods and models in computational linguistics across a range of domains and applications. We will work with several probabilistic symbolic models, such as N-grams, Maximum Entropy grammars, and (Probabilistic) Context-Free Grammars, and apply these models to problems in the syntactic and phonological domains. We will see how the models are designed and trained and how they can be used to model various linguistic and psycholinguistic tasks, including parsing, gradient acceptability, and language acquisition. Prior experience with the Python programming language will be helpful but not required.
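To give a flavor of the simplest model family mentioned above, here is a minimal sketch of a bigram (N = 2) language model with add-one smoothing in Python. The toy corpus and all function names are illustrative assumptions for this sketch, not course materials.

```python
# Illustrative sketch of a bigram (N=2) language model with add-one
# (Laplace) smoothing. The toy corpus and names here are hypothetical
# examples, not materials from the course.
from collections import Counter
import math

corpus = [
    ["the", "cat", "sat"],
    ["the", "dog", "sat"],
    ["a", "cat", "ran"],
]

BOS, EOS = "<s>", "</s>"

# Count unigrams and bigrams over sentences padded with boundary symbols.
unigrams, bigrams = Counter(), Counter()
for sent in corpus:
    tokens = [BOS] + sent + [EOS]
    unigrams.update(tokens)
    bigrams.update(zip(tokens, tokens[1:]))

vocab = set(unigrams)

def bigram_prob(prev, word):
    """P(word | prev) with add-one smoothing."""
    return (bigrams[(prev, word)] + 1) / (unigrams[prev] + len(vocab))

def sentence_logprob(sent):
    """Log probability of a sentence under the bigram model."""
    tokens = [BOS] + sent + [EOS]
    return sum(math.log(bigram_prob(p, w)) for p, w in zip(tokens, tokens[1:]))

if __name__ == "__main__":
    # An attested sentence scores higher than a novel recombination.
    print(sentence_logprob(["the", "cat", "sat"]))
    print(sentence_logprob(["a", "dog", "ran"]))
```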

Area Tags: Computational Linguistics, Learning Theory, Acquisition, Cognitive Science, Statistics, Phonology, Syntax, Psycholinguistics

Time: (Sessions 1 & 2) Monday/Thursday 9:00-10:20

Location: ILC S331

Instructor: Gaja Jarosz

Gaja Jarosz is a professor of Linguistics at the University of Massachusetts Amherst. She received her PhD in Cognitive Science from the Johns Hopkins University in 2006 and her BA in Mathematics and Social Thought & Analysis from Washington University in St. Louis in 2001. She works in the areas of phonological theory, computational linguistics, and language learning and development. Using a combination of computational, corpus, and experimental methods, her research seeks to understand how natural language sound systems and their acquisition can be formally and computationally characterized, and what representations and constraints underlie the language acquisition process and the linguistic systems we ultimately acquire.