Tools

OT-Help 2 provides tools for studying language typology in parallel and serial versions of Optimality Theory and Harmonic Grammar. The parallel component provides solvers for OT and HG, as well as typology calculators that make it easy to compare the two theories. The serial component allows users to define their own operations in Gen and constraints in Con; these operations and constraints are then used to compute the typology for a list of inputs, so that new hypotheses about Gen and Con can be evaluated quickly and easily. These typologies can be computed for both serial OT (Harmonic Serialism) and serial HG (serial Harmonic Grammar).
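
As a quick illustration of the kind of evaluation an HG solver performs, the following minimal Python sketch computes each candidate's harmony as the negative weighted sum of its constraint violations and picks the winner. The constraint names, weights, and violation counts are all hypothetical, not OT-Help's actual data format.

```python
# Minimal sketch (not OT-Help's code): Harmonic Grammar evaluation.
# Harmony is the negative weighted sum of constraint violations;
# the candidate with the highest harmony wins.

weights = {"Onset": 2.0, "NoCoda": 1.0, "Max": 3.0}  # hypothetical Con

# Each candidate maps constraint names to violation counts (hypothetical).
candidates = {
    "pat": {"Onset": 0, "NoCoda": 1, "Max": 0},
    "pa":  {"Onset": 0, "NoCoda": 0, "Max": 1},
    "at":  {"Onset": 1, "NoCoda": 1, "Max": 1},
}

def harmony(violations):
    """Negative weighted sum of violations."""
    return -sum(weights[c] * v for c, v in violations.items())

winner = max(candidates, key=lambda cand: harmony(candidates[cand]))
print(winner, {cand: harmony(v) for cand, v in candidates.items()})
```

In OT, by contrast, candidates are compared lexicographically over a strict ranking rather than by a weighted sum, which is why the typology calculators can yield different predictions for the two theories.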


The Hidden Structure Suite consists of constraint-based hidden structure learning algorithms: the Gradual Learning Algorithm (GLA) with Robust Interpretive Parsing (RIP), Resampling RIP (RRIP), or Expected Interpretive Parsing (EIP). All three GLA parsing strategies can be used with either Stochastic OT or Noisy HG. Additionally, Expectation Driven Learning (EDL) is available for pairwise ranking grammars, in both batch and online versions. All learners can handle structural ambiguity (e.g. hidden prosodic structure), and the EDL learners can additionally learn underlying representations or Harmonic Serialism grammars (currently only in console mode).
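
To make the RIP idea concrete, here is a minimal sketch, under many simplifying assumptions, of one error-driven Stochastic OT + GLA step: the learner samples a noisy ranking, produces its own parse, uses the same ranking to impute hidden structure for the overt form it heard (the interpretive parse), and on a mismatch promotes the constraints favoring the imputed parse while demoting those favoring its own output. The toy parses, constraints, and plasticity value are invented for illustration and are not the Suite's actual code.

```python
import random

NOISE, PLASTICITY = 2.0, 0.1
ranking_values = {"Trochee": 100.0, "Iamb": 100.0}  # hypothetical Con

# Hypothetical Gen output for a two-syllable input:
# parse -> (overt stress pattern, constraint violations).
parses = {
    "(PAta)": ("PAta", {"Trochee": 0, "Iamb": 1}),
    "(paTA)": ("paTA", {"Trochee": 1, "Iamb": 0}),
}

def sample_ranking():
    """Stochastic OT: perturb ranking values with Gaussian noise, then sort."""
    return sorted(ranking_values,
                  key=lambda c: ranking_values[c] + random.gauss(0, NOISE),
                  reverse=True)

def optimal(cands, ranking):
    """OT evaluation: lexicographic comparison on the ranked constraints."""
    return min(cands, key=lambda p: tuple(parses[p][1][c] for c in ranking))

def gla_rip_step(heard_overt):
    ranking = sample_ranking()
    learner = optimal(list(parses), ranking)        # learner's own winner
    compatible = [p for p in parses if parses[p][0] == heard_overt]
    target = optimal(compatible, ranking)           # RIP: impute hidden structure
    if learner != target:                           # error-driven update
        for c in ranking_values:
            diff = parses[learner][1][c] - parses[target][1][c]
            if diff > 0:    # constraint favors the imputed parse: promote
                ranking_values[c] += PLASTICITY
            elif diff < 0:  # constraint favors the learner's winner: demote
                ranking_values[c] -= PLASTICITY

gla_rip_step("paTA")
print(ranking_values)
```

Roughly speaking, RRIP and EIP replace the single imputed parse with resampled or probability-weighted parses of the overt form, respectively, but the error-driven update has the same shape.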


The MaxEnt Scales Learner implements a Maximum Entropy grammar with lexical scales, which enable the model to capture exceptional phonological patterns. The implementation (described more here) lets the user choose the optimization algorithm (gradient descent, gradient descent with “clipping”, or L-BFGS-B), whether negative constraint weights are allowed, the type of prior (L1 or L2), and other basic hyperparameters, such as weight initialization, learning rate, and number of training updates.
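
For intuition, here is a minimal sketch, under assumptions of my own, of how a lexically scaled MaxEnt grammar can treat one word as exceptional: a per-lexeme multiplier rescales a constraint's weight inside the usual MaxEnt probability P(cand) = exp(-H) / Z. The weights, scales, and candidates are hypothetical, and the actual learner's parameterization may differ.

```python
import math

# Minimal sketch (my assumptions, not this learner's code) of a MaxEnt
# grammar with a lexical scale: a per-lexeme multiplier on a constraint's
# weight lets one word behave exceptionally under the shared grammar.

weights = {"NoCoda": 2.0, "Max": 1.0}                          # shared weights
scales = {"word1": {"NoCoda": 1.0}, "word2": {"NoCoda": 0.2}}  # per-lexeme

def probs(lexeme, candidates):
    """P(cand) = exp(-H) / Z, with H the sum of scaled weight * violations."""
    def harmony(viols):
        return sum(weights[c] * scales[lexeme].get(c, 1.0) * v
                   for c, v in viols.items())
    scores = {cand: math.exp(-harmony(v)) for cand, v in candidates.items()}
    z = sum(scores.values())
    return {cand: s / z for cand, s in scores.items()}

cands = {"pat": {"NoCoda": 1, "Max": 0}, "pa": {"NoCoda": 0, "Max": 1}}
print(probs("word1", cands))  # NoCoda at full weight: deletion favored
print(probs("word2", cands))  # NoCoda scaled down: the coda survives
```

Training then amounts to fitting the weights (and scales) to observed frequencies, e.g. by gradient descent on the log-likelihood with an L1 or L2 prior, which is where the hyperparameters listed above come in.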


Brandon Prickett’s Colab notebook contains the software for his dissertation, Learning Phonology With Sequence-To-Sequence Neural Networks.


Brandon Prickett’s hidden structure MaxEnt learner was used in Pater and Prickett (2022, AMP) and in ongoing research with Seung Suk Lee.