Introduction to Deep Learning Research
CSCI-UA 480 075 · FALL 2025 · NYU COURANT INSTITUTE OF MATHEMATICAL SCIENCES
| INSTRUCTOR | Alfredo Canziani |
| LECTURES | Tue/Thu 12:00 – 13:45 |
| CODE | 2025 repo |
| BLACKBOARDS | Google Drive |
| READINGS | Google Drive |
| SLIDES | Google Drive |
This second offering of my new course is an introduction to Deep Learning research for undergraduate (or advanced high-school) students.
The aim of this course is to get the students fluent in reasoning, using:
- maths (linear algebra, calculus, logic),
- diagrams and schematics (abstract graphical language),
- graphs (function plotting and asymptotic behaviour),
- physics (reducing systems to their base parts to identify emergent collective behaviours), and
- coding (empirical verification of proposed hypotheses).
To assess the students’ knowledge, the course uses 6 quizzes throughout the semester, 4 homework assignments, 2 projects, and a final oral exam, where students are evaluated on the final project’s significance and originality, its presentation and defence, course-content knowledge, and communication effectiveness.
Selected final projects and code written in class can be found in the GitHub repo; slides, blackboards, and suggested readings can be found on Google Drive. All links are provided at the top of this web page.
Lectures
Legend: 🖥 slides, 📝 notes, 📓 Jupyter notebook, 🎥 YouTube video.
Lesson 01: Course intro + McCulloch & Pitts binary neuron 🎥
Using maths & coding as languages of research 📐💻
Suggested readings
- Whitehead & Russell (1910) Principia mathematica
- McCulloch & Pitts (1943) A logical calculus of the ideas immanent in nervous activity
- Iverson bracket
Suggested videos
- Choi (2011) Sound of neurons
- Mahdid (2025) Exploring “Logical Calculus of Nervous Activity” by McCulloch & Pitts
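A McCulloch & Pitts binary neuron can be written in a few lines: it fires if and only if enough excitatory inputs are active, which is exactly an Iverson bracket. The sketch below is illustrative, not course code; the function names and thresholds are my own choices.

```python
def mp_neuron(inputs, threshold):
    """McCulloch & Pitts binary neuron: outputs 1 iff the number of
    active inputs reaches the threshold — the Iverson bracket
    [sum(inputs) >= threshold]."""
    return int(sum(inputs) >= threshold)

# Basic logic gates fall out of the choice of threshold (two binary inputs):
AND = lambda a, b: mp_neuron([a, b], threshold=2)  # both must fire
OR  = lambda a, b: mp_neuron([a, b], threshold=1)  # at least one fires
```

Note that only the threshold changes between AND and OR; the neuron itself is the same unit.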
Lesson 02: Programming a neural network 🎥
Behaviour by design using weights computed with maths 📐🧠
Suggested readings
- Summerfield (2025) These strange new minds
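“Behaviour by design” means the weights are worked out on paper rather than learnt. As a hedged illustration (the weights and layer layout below are hand-derived for this example, not taken from the lecture), a two-layer network of threshold units can compute XOR by combining an OR unit and a NAND unit:

```python
import numpy as np

def step(z):
    """Hard threshold: 1 where z >= 0, else 0."""
    return (np.asarray(z) >= 0).astype(int)

# Hand-derived weights: the two hidden units compute OR and NAND,
# and the output unit ANDs them together — giving XOR.
W1 = np.array([[ 1.0,  1.0],    # OR unit
               [-1.0, -1.0]])   # NAND unit
b1 = np.array([-0.5,  1.5])
W2 = np.array([1.0, 1.0])       # AND of the two hidden units
b2 = -1.5

def xor(x):
    h = step(W1 @ np.asarray(x, dtype=float) + b1)
    return int(W2 @ h + b2 >= 0)
```

No training loop is involved: the behaviour is entirely designed through the choice of weights and biases.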
Lesson 03: Wiener’s cybernetics, Hebbian plasticity, and Rosenblatt’s perceptron 🎥
When physical machines start learning 🔁
Suggested readings
- Wiener (1948) Cybernetics
- Whitehead & Russell (1910) Principia mathematica
- Gertner (2012) The idea factory
- Mauchly & Eckert (1945) ENIAC
- Computer terminal
- Intel (1974) Intel 8080
- Monty Python (1969) Monty Python’s Flying Circus
- Monty Python (1970) Spam (Monty Python sketch)
- van Rossum (1991) Python
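The step from designed weights to a machine that learns is Rosenblatt’s error-driven update: when a prediction is wrong, nudge the weights toward (or away from) the input. A minimal sketch, assuming binary targets and a hard-threshold unit (the data and hyperparameters are my own illustrative choices):

```python
import numpy as np

def perceptron_train(X, y, epochs=10, lr=1.0):
    """Rosenblatt's perceptron rule: for each misclassified example,
    move the weights by lr * (target - prediction) * input."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for x, target in zip(X, y):
            pred = int(w @ x + b >= 0)
            err = target - pred          # -1, 0, or +1
            w += lr * err * x
            b += lr * err
    return w, b

# Learn AND, a linearly separable problem:
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1])
w, b = perceptron_train(X, y)
```

For linearly separable data the perceptron convergence theorem guarantees this loop eventually stops making mistakes.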
Lesson 04: Bias, perceptron properties, and multi-class classification 🎥
Bias shifts the boundary; more neurons slice the world 📐🧠
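The tagline can be made concrete in a few lines. In this hedged sketch (the weights, biases, and test points are made up for illustration), each row of `W` with its bias defines a hyperplane; the bias shifts that hyperplane away from the origin, and taking the argmax over several neurons slices the plane into class regions:

```python
import numpy as np

# One linear neuron per class; each row of W is one neuron's weights.
W = np.array([[ 1.0,  0.0],    # class 0: scores high to the right
              [-1.0,  0.0],    # class 1: scores high to the left
              [ 0.0,  1.0]])   # class 2: scores high upward
b = np.array([-0.5, -0.5, 0.0])  # biases shift the decision boundaries

def classify(x):
    """Multi-class decision: the neuron with the largest score wins."""
    return int(np.argmax(W @ np.asarray(x) + b))
```

With the negative biases on the first two neurons, points near the origin fall to class 2; far to the right or left, classes 0 and 1 take over.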
Lesson 05: A softer perceptron, part I: probabilities 🎥
Replacing certainty 🌗 with a degree of belonging 📊
Suggested readings