
The Perceptron — A Perceiving and Recognizing Automaton

Rosenblatt’s 1957 tech report introducing the perceptron.

It is citation 23 in LeCun et al. (2015, Nature) “Deep Learning” (it’s surprising they don’t cite the much more relevant 1961/1962 book). That paper, in any case, gives it as the original source for artificial neural networks. A (2016) Deep Learning text seems to make the mistake of claiming that Rosenblatt worked only with single-layer nets (pp. 14-15, 27). But they do give him credit for creating the first trained nets (p. 15):

In the 1950s, the perceptron (Rosenblatt, 1958, 1962) became the first model that could learn the weights defining the categories given examples of inputs from each category.

Bengio, one of the authors, has written about MLPs, so this is puzzling. It’s probably shorthand for “they existed, but backprop made them useful”.
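
To make the quoted claim concrete, here is a minimal sketch of the textbook single-layer perceptron learning rule: given labeled examples from two categories, the weights are nudged toward each misclassified input until a linear threshold unit separates them. This is the standard modern formulation in NumPy (my choice), not Rosenblatt’s original 1957 hardware description.

```python
import numpy as np

def perceptron_train(X, y, epochs=10, lr=1.0):
    """Classic perceptron rule: update weights only on misclassified
    examples. Labels y are expected in {-1, +1}."""
    n_samples, n_features = X.shape
    w = np.zeros(n_features)
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            # If the current linear threshold unit gets this example
            # wrong (or sits exactly on the boundary), move toward it.
            if yi * (np.dot(w, xi) + b) <= 0:
                w += lr * yi * xi
                b += lr * yi
    return w, b

# Toy usage: a linearly separable AND-like rule.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])
w, b = perceptron_train(X, y, epochs=20)
print(np.sign(X @ w + b))  # matches y once the data are separated
```

For linearly separable data the rule provably converges (the perceptron convergence theorem); the limits on what a single such unit can represent are exactly what the multi-layer discussion above is about.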

A very useful overview of the modern use of perceptrons for language: Goldberg (2015) A Primer on Neural Network Models for Natural Language Processing. Socher (2015) on “Recursive Deep Learning” is a great example of Chomsky / Rosenblatt integration. This 2007 paper by Frank and Mathis has an overview of connectionist syntax research in psycholinguistics, as does this 2008 paper by Levy (p. 1143).

Can recurrent neural networks learn natural language grammars?

Assessing the Ability of LSTMs to Learn Syntax-Sensitive Dependencies

On speech recognition, see the Minsky/Rosenblatt page for old and new history – it also collects lots of further Rosenblatt materials.