Towards Understandable Neural Networks for High Level AI Tasks – Part 7

Relating tensor product representations to lambda-calculus, tree-adjoining grammars, other ‘vector symbolic architectures’, and the brain

Topics that will be discussed in this final lecture of the series are:

– programming tensor-product-representation-manipulating Gradient Symbolic Computation (GSC) networks to perform function application in lambda-calculus and tree adjunction (as in Tree-Adjoining Grammar), thereby demonstrating that GSC networks truly have complete symbol-processing (or ‘algebraic’) capabilities, which Gary Marcus and others (at MSR and elsewhere) have argued are required for neural networks, artificial or biological, to achieve genuine human intelligence (a minimal sketch of the underlying binding/unbinding operation follows this list)
– comparison of the size of tensor product representations to that of other schemes for encoding symbol structures in actual neural network models: contrary to many claims, tensor product representations are not larger
– preliminary neural evidence for tensor product representations (in particular, for distributed role vectors)
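The lecture itself comes with no code, but the core operation behind these demonstrations, binding fillers to roles and unbinding them again, can be sketched in a few lines of NumPy. The example below is an illustrative assumption, not material from the talk: the dimensions, variable names, and the choice of orthonormal roles are all ours. It encodes a one-level function-application tree (f x) as a tensor product representation by binding each filler vector to a positional role vector with an outer product and summing the bindings; with orthonormal roles, each filler is recovered exactly by multiplying the resulting matrix with the corresponding role vector.

```python
import numpy as np

rng = np.random.default_rng(0)
d_f, d_r = 8, 8  # filler and role dimensions (arbitrary illustrative choices)

# Two orthonormal role vectors for the positions of a binary node
# (function position / argument position).
Q = np.linalg.qr(rng.standard_normal((d_r, d_r)))[0]
r_fun, r_arg = Q[0], Q[1]

# Arbitrary filler vectors standing for the symbols f and x.
f = rng.standard_normal(d_f)
x = rng.standard_normal(d_f)

# Binding: the TPR of the application (f x) is a sum of outer products,
# one filler (x) role tensor product per constituent.
T = np.outer(f, r_fun) + np.outer(x, r_arg)  # shape (d_f, d_r)

# Unbinding: with orthonormal roles, the unbinding vector is the role
# vector itself, and a matrix-vector product recovers each filler exactly.
assert np.allclose(T @ r_fun, f)
assert np.allclose(T @ r_arg, x)
```

Note that this one level of structure occupies d_f × d_r units, growing with the product of the filler and role dimensions rather than exponentially; this kind of size accounting is what the second topic above compares against other encoding schemes.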

Speaker Details

Paul Smolensky is Krieger-Eisenhower Professor of Cognitive Science at Johns Hopkins University. His research addresses the mathematical unification of the continuous and the discrete facets of cognition: principally, the development of grammar formalisms that are grounded in cognitive and neural computation. As a member of the Parallel Distributed Processing (PDP) Research Group at UCSD, he developed Harmony Theory (1986), proposing what is now known as the ‘Restricted Boltzmann Machine’ architecture. He then developed Tensor Product Representations (1990), a compositional, recursive technique for encoding symbol structures as real-valued activation vectors. Combining these two theories, he co-developed Harmonic Grammar (1990) and Optimality Theory (1993), general grammatical formalisms now widely used in phonological theory. His publications include the books Mathematical perspectives on neural networks (1996, with M. Mozer and D. Rumelhart), Optimality Theory: Constraint interaction in generative grammar (1993/2004, with A. Prince), Learnability in Optimality Theory (2000, with B. Tesar), and The harmonic mind: From neural computation to optimality-theoretic grammar (2006, with G. Legendre). He was awarded the 2005 David E. Rumelhart Prize for Outstanding Contributions to the Formal Analysis of Human Cognition, a Blaise Pascal Chair in Paris (2008–9), and the 2015 Sapir Professorship of the Linguistic Society of America.

Webpage: http://cogsci.jhu.edu/people/smolensky.html

Date:
Speakers: Paul Smolensky
Affiliation: Microsoft / Johns Hopkins University

Series: Microsoft Research Talks