Build a Neural Net
Neurons, layers, and backprop — wired by hand
6 lessons
Lessons
- 01 Single Neuron (Easy)
  A weighted sum, a nonlinearity, a prediction.
- 02 Backpropagation (Medium)
  The chain rule, made mechanical.
- 03 Multi-Layer Backpropagation (Hard)
  Chain rule across arbitrarily deep networks.
- 04 Backprop Ninja (Hard)
  Derive backward for a 2-layer MLP by hand — checked live against finite differences.
- 05 MLP from Scratch (Medium)
  A full multi-layer perceptron in pure NumPy.
- 06 Weight Initialization (Medium)
  Xavier, He, and the math of exploding gradients.
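The idea behind lesson 01 fits in a few lines of NumPy: a neuron is a weighted sum of its inputs plus a bias, passed through a nonlinearity. A minimal sketch, assuming a sigmoid nonlinearity (the lessons may use a different one):

```python
import numpy as np

def neuron(x, w, b):
    """One neuron: weighted sum of inputs, then a sigmoid squashing to (0, 1)."""
    z = np.dot(w, x) + b              # weighted sum plus bias
    return 1.0 / (1.0 + np.exp(-z))   # sigmoid nonlinearity

x = np.array([1.0, 2.0])      # inputs
w = np.array([0.5, -0.25])    # weights
b = 0.1                       # bias
y = neuron(x, w, b)           # the prediction, a value in (0, 1)
```

The same three steps (weighted sum, nonlinearity, output) are what a whole layer does, just with a weight matrix instead of a vector.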
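"The chain rule, made mechanical" (lessons 02 and 03) means walking backward through the forward computation, multiplying one local derivative at a time. A hand-worked sketch for a single tanh neuron with squared-error loss (the network, loss, and variable names here are illustrative, not the course's exact exercise):

```python
import numpy as np

# Forward pass: x -> z = w*x + b -> a = tanh(z) -> L = (a - t)^2
x, w, b, t = 0.5, 1.2, -0.3, 0.8
z = w * x + b
a = np.tanh(z)
L = (a - t) ** 2

# Backward pass: chain rule, one local derivative per step
dL_da = 2 * (a - t)       # d/da of (a - t)^2
da_dz = 1 - a ** 2        # derivative of tanh at z
dL_dz = dL_da * da_dz     # chain: dL/dz = dL/da * da/dz
dL_dw = dL_dz * x         # z = w*x + b  =>  dz/dw = x
dL_db = dL_dz * 1.0       # dz/db = 1
```

Deeper networks repeat exactly this pattern: each layer multiplies the incoming gradient by its own local derivative and passes the product back.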
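Lesson 04's "checked live against finite differences" refers to the standard way of validating a hand-derived backward pass: perturb each parameter a little, re-run the forward pass, and compare the resulting slope to the analytic gradient. A minimal sketch with a linear neuron and squared error (the loss and tolerances are illustrative assumptions):

```python
import numpy as np

def loss(w, x, target):
    # Squared error of a single linear neuron
    return (np.dot(w, x) - target) ** 2

def analytic_grad(w, x, target):
    # d/dw (w.x - t)^2 = 2 * (w.x - t) * x
    return 2.0 * (np.dot(w, x) - target) * x

def numeric_grad(f, w, eps=1e-5):
    # Central finite differences, one coordinate at a time
    g = np.zeros_like(w)
    for i in range(w.size):
        wp, wm = w.copy(), w.copy()
        wp[i] += eps
        wm[i] -= eps
        g[i] = (f(wp) - f(wm)) / (2 * eps)
    return g

x = np.array([1.0, -2.0, 0.5])
t = 0.7
w = np.array([0.3, 0.1, -0.4])
ga = analytic_grad(w, x, t)
gn = numeric_grad(lambda w_: loss(w_, x, t), w)
```

If the two gradients disagree beyond floating-point noise, the hand derivation has a bug; this check scales to every parameter of an MLP.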
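The two schemes named in lesson 06 differ only in which fan counts set the variance of the initial weights: Xavier (Glorot) uses 2/(fan_in + fan_out) and suits tanh/sigmoid layers, He uses 2/fan_in and suits ReLU layers. A sketch of both (the layer sizes are arbitrary examples):

```python
import numpy as np

rng = np.random.default_rng(0)
fan_in, fan_out = 256, 128

# Xavier/Glorot: variance 2 / (fan_in + fan_out), balances forward
# activations and backward gradients for tanh/sigmoid layers
w_xavier = rng.normal(0.0, np.sqrt(2.0 / (fan_in + fan_out)),
                      size=(fan_out, fan_in))

# He: variance 2 / fan_in, compensates for ReLU zeroing
# roughly half the activations
w_he = rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(fan_out, fan_in))
```

Too large a variance and activations (hence gradients) grow layer by layer; too small and they shrink toward zero — the "math of exploding gradients" is just tracking that variance through the depth of the network.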