Perceptron. This bit of novice programming implements several varieties of on-line learning, as well as agent-based and iterative learning for simulations of language change. It's known to be a little buggy; a better version will hopefully be available soon. A fuller description and instructions for use are included as comments in the script, which can be found here. Sample files can be downloaded as a zipped folder by clicking here – see the Examples page to the right for explanations. A generic sketch of the basic perceptron update appears after the references below. Used for, amongst other things, the simulations in these papers:
Pater, Joe. 2012. Emergent systemic simplicity (and complexity). In J. Loughran and A. McKillen (eds.) Proceedings from Phonology in the 21st Century: In Honour of Glyne Piggott. McGill Working Papers in Linguistics 22(1). http://www.mcgill.ca/mcgwpl/archives/volume-221-2012
Pater, Joe and Elliott Moreton. 2012. Structurally biased phonology: Complexity in learning and typology. In a special issue of the EFL Journal on phonology, edited by K.G. Vijayakrishnan (The Journal of the English and Foreign Languages University, Hyderabad), 1-44.
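For readers curious about the basic mechanism, here is a minimal, generic sketch of a perceptron-style on-line update in R. It is not the script described above: the function name perceptron_train, the +1/-1 label coding, and the learning rate are assumptions made purely for illustration.

```r
# Minimal perceptron-style online learner (illustrative sketch only).
# Assumes features in a numeric matrix X and labels y coded as +1 / -1;
# the learning rate and number of passes are arbitrary choices.
perceptron_train <- function(X, y, rate = 0.1, epochs = 10) {
  w <- rep(0, ncol(X))                      # one weight per feature
  b <- 0                                    # bias term
  for (pass in seq_len(epochs)) {
    for (i in sample(nrow(X))) {            # visit examples in random order
      pred <- if (sum(w * X[i, ]) + b >= 0) 1 else -1
      if (pred != y[i]) {                   # update only on an error
        w <- w + rate * y[i] * X[i, ]
        b <- b + rate * y[i]
      }
    }
  }
  list(weights = w, bias = b)
}

# Toy usage: four examples with two features, linearly separable labels
X <- matrix(c(1, 1,  2, 2,  -1, -1,  -2, -2), ncol = 2, byrow = TRUE)
y <- c(1, 1, -1, -1)
perceptron_train(X, y)
```

The defining property of the on-line setting is visible in the inner loop: the weights change immediately after each mismatching example, rather than after a full pass over the data.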
Gradient descent. A fully batch gradual learner implemented in a simple R script. You'll need the data.table package. Includes an input file to replicate Jäger's 2007 Dutch simulation (itself a replication of Boersma and Levelt 2000). Download the script and input file as a zip folder here. A generic sketch of batch gradient descent appears after the references below. Discussed in this paper:
Pater, Joe and Robert Staubs. 2013. Modeling learning trajectories with batch gradient descent. Paper presented October 27th to the Northeast Computational Phonology Circle, MIT.
Also used in some of the exploratory work for this paper, though I think all of the final simulations presented in the paper were done by Elliott Moreton in his own software:
Moreton, Elliott, Joe Pater and Katya Pertsova. 2015. Phonological concept learning. Cognitive Science, 1-66. DOI: 10.1111/cogs.12319.
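For contrast with the on-line update sketched above, here is a minimal, generic illustration of batch gradient descent in R, where the weights are updated once per pass over the entire data set. It is not the downloadable script: the least-squares objective, the function name batch_gd, the learning rate, and the toy data are assumptions made purely for illustration.

```r
# Generic batch gradient descent sketch (illustrative only; not the script above).
# Minimizes mean squared error for a linear model y ~ X %*% w, updating the
# weights once per pass over the whole data set rather than per example.
batch_gd <- function(X, y, rate = 0.01, iters = 1000) {
  w <- rep(0, ncol(X))
  n <- nrow(X)
  for (step in seq_len(iters)) {
    preds <- as.vector(X %*% w)             # predictions for the whole batch
    grad  <- t(X) %*% (preds - y) / n       # gradient of the mean squared error
    w <- w - rate * as.vector(grad)         # one update per full pass
  }
  w
}

# Toy usage: recover the weights of a known linear relationship
set.seed(1)
X <- cbind(1, matrix(rnorm(200), ncol = 2)) # intercept column plus two predictors
true_w <- c(0.5, 2, -1)
y <- as.vector(X %*% true_w)
batch_gd(X, y)                              # should approach c(0.5, 2, -1)
```

The contrast with the perceptron sketch is the point: here every example contributes to a single aggregated gradient before any weight changes, which is what "fully batch" means above.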