NIPS: Oral Session 3 – Matthew Lawlor

Feedforward Learning of Mixture Models

We develop a biologically plausible learning rule that provably converges to the class means of general mixture models. This rule generalizes the classical BCM neural rule within a tensor framework, substantially increasing the generality of the learning problem it solves. It achieves this by incorporating triplets of samples from the mixtures, which provides a novel information-processing interpretation of spike-timing-dependent plasticity (STDP). We provide both proofs of convergence and a close fit to experimental data on STDP.
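
To make the triplet idea concrete, below is a minimal numerical sketch in Python. It is an illustration under stated assumptions, not the exact rule from the talk: the triplet objective, the pairwise sliding threshold theta, and the update w += eta * y2 * (y3 - theta) * x1 are chosen to mirror the structure of the classical BCM rule (where the update is y * (y - theta) * x with theta tracking E[y^2]), extended so that three samples from the same mixture component interact multiplicatively.

```python
# Illustrative sketch only. The mixture, learning rates, threshold estimator,
# and update rule are assumptions chosen to mimic a triplet generalization of
# the classical BCM rule; the paper's exact formulation may differ.
import numpy as np

rng = np.random.default_rng(0)

# Toy mixture: K component means in R^d. A "triplet" is three i.i.d. draws
# conditioned on the same latent component, as the abstract describes.
d, K = 10, 3
means = rng.normal(size=(K, d))

def sample_triplet(noise=0.1):
    k = rng.integers(K)
    return means[k] + noise * rng.normal(size=(3, d))

# BCM-style selectivity dynamics generalized to triplets:
#   y_i   = w . x_i
#   theta tracks E[y1 * y2]           (sliding threshold, a pairwise moment)
#   dw   ~ y2 * (y3 - theta) * x1     (stochastic gradient of a triplet
#                                      analogue of the BCM objective)
# With x1 = x2 = x3 this collapses back toward the classical BCM update.
w = 0.1 * rng.normal(size=d)
theta, eta, tau = 0.0, 5e-3, 5e-2   # threshold adapts faster than weights
for _ in range(50_000):
    x1, x2, x3 = sample_triplet()
    y1, y2, y3 = x1 @ w, x2 @ w, x3 @ w
    theta += tau * (y1 * y2 - theta)     # slow moving-average threshold
    w += eta * y2 * (y3 - theta) * x1    # Hebbian-like triplet update

# At convergence w should align with one component mean, i.e. the unit
# becomes selective for a single mixture class.
cosines = means @ w / (np.linalg.norm(means, axis=1) * np.linalg.norm(w))
print(np.round(cosines, 2))
```

One design point worth noting: in BCM-style dynamics the threshold must adapt on a faster timescale than the weights (here tau > eta), which is what keeps the selective fixed points stable rather than letting the weight norm run away.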

Date:
Speakers: Matthew Lawlor
Affiliation: Yale University