DEEP LEARNING
DS-GA 1008 · FALL 2022 · NYU CENTER FOR DATA SCIENCE
INSTRUCTORS | Alfredo Canziani, Yann LeCun
LECTURES | Wednesdays 16:55 – 18:55, Zoom
PRACTICA | Tuesdays 16:55 – 17:55, Zoom
FORUM | r/NYU_DeepLearning
DISCORD | NYU DL
MATERIAL | 2022 repo
2022 edition disclaimer
Check the repo’s README.md to learn about:
- New content and presentation
- This semester’s repository
- Previous releases
Lectures
Only the new lessons (either new material or a new presentation) will come online. Content similar to the SP21 edition, shown semi-transparent and in italics, will not be edited or pushed online.
Legend: 🖥 slides, 📝 notes, 📓 Jupyter notebook, 🎥 YouTube video.
Theme 1: Introduction
- 00 – Introduction to NYU-DLFL22 🎥
- 01 – History (see 🎥)
- 02 – Gradient descent and the backpropagation algorithm (see 🎥; a minimal sketch follows this list)
- 03 – Resources and neural nets inference 🎥
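As a pointer to what lesson 02 covers, here is a minimal sketch of gradient descent driven by backpropagation in PyTorch. The quadratic objective, data, and learning rate below are placeholders chosen for illustration, not code from the course.

```python
import torch

# Minimal gradient-descent sketch (placeholder problem, not course code).
# We minimise the mean squared error of a linear model X w ≈ y.
torch.manual_seed(0)
X = torch.randn(100, 3)                  # placeholder inputs
y = torch.randn(100)                     # placeholder targets
w = torch.zeros(3, requires_grad=True)   # parameters to learn
lr = 0.1                                 # arbitrary learning rate

for step in range(100):
    loss = ((X @ w - y) ** 2).mean()     # forward pass: evaluate the objective
    loss.backward()                      # backpropagation: fills w.grad
    with torch.no_grad():
        w -= lr * w.grad                 # gradient-descent update
    w.grad.zero_()                       # reset gradients for the next step
```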
Theme 2: Classification, an energy perspective
- 05 – Notation and introduction 🎥 🖥
- 06 – Backprop and contrastive learning 🎥 🖥
- 07 – PyTorch 5-step training code 🎥 🖥 (see the sketch after this list)
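Lesson 07 refers to the five-step training recipe used throughout the course. Below is a minimal sketch of that pattern; the model, data, and optimiser are placeholders assumed for illustration, not the lesson’s actual code.

```python
import torch
from torch import nn

# Placeholder model, data, and optimiser; the five numbered steps are the point.
model = nn.Linear(10, 2)                                  # assumed toy classifier
criterion = nn.CrossEntropyLoss()
optimiser = torch.optim.SGD(model.parameters(), lr=0.01)
x = torch.randn(32, 10)                                   # assumed input batch
y = torch.randint(0, 2, (32,))                            # assumed label batch

for epoch in range(5):
    logits = model(x)            # 1. feed the input through the network
    loss = criterion(logits, y)  # 2. compute the loss
    optimiser.zero_grad()        # 3. zero out the cached gradients
    loss.backward()              # 4. backpropagate, accumulating new gradients
    optimiser.step()             # 5. update the parameters
```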
Theme 3: Parameter sharing
- 04 – Recurrent and convolutional nets (see 🎥 🖥 📝)
- 08 – Natural signals, ConvNet kernels and sizes, comparison with fully-connected architectures (see 🎥 🖥 📓 and 🎥)
- 09 – Recurrent neural nets, vanilla and gated (LSTM) 🎥 🖥 📓 📓 ①
Theme 4: Energy-based models, a compendium
- 11 – Inference for latent variable energy-based models (LV-EBMs) 🎥 🖥
- 13 – Training LV-EBMs 🎥 🖥
- 14 – From latent-variable EBMs (K-means, sparse coding) to target propagation to autoencoders 🎥 🖥
① I did create some new RNN diagrams (see tweet and the quoted one), so this lesson may get published at some point. For now I’m focusing on the energy lessons.