The first part of the course will be a short introduction to machine learning and the challenges that the deep learning revolution of recent years has posed to classical learning theory. The second and main part of the course will focus solely on the approximation aspect of deep learning. Possible topics include: universality of fully connected networks; universality of convolutional and invariant networks; approximation rates for smooth functions; the relationship between neural networks and more classical function bases (finite elements, splines, wavelets); and "power of depth" results, which point to functions that can be approximated more efficiently by deep neural networks than by other models.

Learning Outcomes

Students who successfully complete this course will be able to:
A. Understand the basic concepts of machine learning and learning theory.
B. Read state-of-the-art research on approximation properties of deep neural networks and conduct independent research in this field.

Faculty: Mathematics
Undergraduate Studies | Graduate Studies

Prerequisite courses

(94411 - Probability (ie) and 104022 - Differential and Integral Calculus 2m) or (94412 - Probability (advanced) and 104032 - Calculus 2m) or (104013 - Differential and Integral Calculus 2t and 104034 - Introduction to Probability H) or (104222 - Probability Theory and 104295 - Infinitesimal Calculus 3)


Course with no additional credit

236763 - Deep Learning and Approximation Theory


Semester Information