Representation Power of Neural Networks

This talk will survey a variety of classical results on the representation power of neural networks, and then close with a new result separating shallow and deep networks: namely, there exist classification problems where any shallow network needs exponentially many more nodes to match the accuracy of certain deep or recurrent networks.
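
A minimal sketch of the flavor of such a separation (an illustrative construction, not necessarily the one presented in the talk): composing a two-node ReLU "tent" map with itself k times gives a depth-k network with only O(k) nodes whose output oscillates roughly 2^k times on [0, 1], and reproducing that many oscillations with a single hidden layer would require exponentially many nodes.

```python
import numpy as np

def tent(x):
    """Piecewise-linear 'tent' map on [0, 1], computable by a
    two-hidden-node ReLU layer: 2*relu(x) - 4*relu(x - 0.5)."""
    return 2 * np.maximum(x, 0) - 4 * np.maximum(x - 0.5, 0)

def deep_tent(x, depth):
    """Compose the tent map `depth` times: a depth-`depth` ReLU
    network with O(depth) nodes whose number of linear pieces
    (and hence oscillations) doubles with each added layer."""
    for _ in range(depth):
        x = tent(x)
    return x

# Count how often the composed function crosses the level 1/2;
# the count roughly doubles with each additional layer.
xs = np.linspace(0, 1, 1000)
for k in (1, 2, 3, 4):
    ys = deep_tent(xs, k)
    crossings = np.count_nonzero(np.diff((ys > 0.5).astype(int)))
    print(f"depth {k}: ~{crossings} crossings of 1/2")
```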

Speaker Details

Matus Telgarsky is a postdoctoral researcher in EECS at the University of Michigan.

Date:
Speakers: Matus Telgarsky
Affiliation: University of Michigan
Jeff Running

Series: Microsoft Research Talks