DEEP LEARNING
DS-GA 1008 · SPRING 2021 · NYU CENTER FOR DATA SCIENCE
INSTRUCTORS | Yann LeCun & Alfredo Canziani |
LECTURES | Wednesdays 9:30 – 11:30, Zoom |
PRACTICA | Tuesdays 9:30 – 10:30, Zoom |
FORUM | r/NYU_DeepLearning |
DISCORD | NYU DL |
MATERIAL | 2021 repo |
2021 edition disclaimer
Check the repo’s README.md to learn about:
- The new content organisation
- The intellectual dilemma of the semester’s second half
- This semester’s repository
- Previous releases
Lectures
Most of the lectures, labs, and notebooks are similar to those of the previous edition; nevertheless, some are brand new. I will try to make it clear which is which.
Legend: 🖥 slides, 📝 notes, 📓 Jupyter notebook, 🎥 YouTube video.
Theme 1: Introduction
- History and resources 🎥 🖥
- Gradient descent and the backpropagation algorithm 🎥 🖥
- Neural nets inference 🎥 📓
- Modules and architectures 🎥 🖥
- Neural nets training 🎥 🖥 📓 📓
- Homework 1: backprop
Theme 2: Parameter sharing
- Recurrent and convolutional nets 🎥 🖥 📝
- ConvNets in practice 🎥 🖥 📝
- Natural signals’ properties and the convolution 🎥 🖥 📓
- Recurrent neural networks, vanilla and gated (LSTM) 🎥 🖥 📓 📓
- Homework 2: RNN & CNN
Theme 3: Energy-based models, foundations
- Energy-based models (I) 🎥 🖥
- Inference for LV-EBMs 🎥 🖥
- What are EBMs good for? 🎥
- Energy-based models (II) 🎥 🖥 📝
- Training LV-EBMs 🎥 🖥
- Homework 3: structured prediction
Theme 4: Energy-based models, advanced
- Energy-based models (III) 🎥 🖥
- Unsupervised learning and autoencoders 🎥 🖥
- Energy-based models (IV) 🎥 🖥
- From LV-EBM to target prop to (any) autoencoder 🎥 🖥
- Energy-based models (V) 🎥 🖥
- AEs with PyTorch and GANs 🎥 🖥 📓 📓
- Joint Embedding Methods (I) 🎥 🖥 🖥
- Joint Embedding Methods (II) 🎥 🖥
Theme 5: Associative memories
Theme 6: Graphs
- Graph transformer nets [A][B] 🎥 🖥
- Graph convolutional nets (I) [from last year] 🎥 🖥
- Graph convolutional nets (II) 🎥 🖥 📓