NOUS
Prof. Ramesh Kumar, PES University
Build a Neural Net

Neurons, layers, and backprop — wired by hand

6 lessons

Lessons

  1. Single Neuron

    A weighted sum, a nonlinearity, a prediction.

    Easy
  2. Backpropagation

    The chain rule, made mechanical.

    Medium
  3. Multi-Layer Backpropagation

    The chain rule across arbitrarily deep networks.

    Hard
  4. Backprop Ninja

    Derive the backward pass for a 2-layer MLP by hand, checked live against finite differences.

    Hard
  5. MLP from Scratch

    A full multi-layer perceptron in pure NumPy.

    Medium
  6. Weight Initialization

    Xavier, He, and the math of exploding gradients.

    Medium
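Lesson 01's one-liner — a weighted sum, a nonlinearity, a prediction — fits in a few lines of NumPy. A minimal sketch with a sigmoid nonlinearity (the function name and the sample values are illustrative, not the course's code):

```python
import numpy as np

def neuron(x, w, b):
    # weighted sum of inputs plus bias, squashed by a sigmoid
    z = np.dot(w, x) + b
    return 1.0 / (1.0 + np.exp(-z))

x = np.array([0.5, -1.0, 2.0])   # inputs
w = np.array([0.1, 0.2, -0.3])   # weights
b = 0.05                          # bias
print(neuron(x, w, b))            # a single prediction in (0, 1)
```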
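Lessons 02, 03, and 05 all turn on running the chain rule backwards, layer by layer. A compact hand-derived backward pass for a 2-layer tanh MLP in pure NumPy (the dimensions, seed, and variable names are arbitrary, not the course's):

```python
import numpy as np

rng = np.random.default_rng(1)

# tiny 2-layer MLP: x -> tanh(W1 x + b1) -> W2 h + b2
W1 = rng.normal(0, 0.5, (4, 3)); b1 = np.zeros(4)
W2 = rng.normal(0, 0.5, (1, 4)); b2 = np.zeros(1)

x = rng.normal(0, 1, 3)
y = np.array([0.5])

# forward pass
z1 = W1 @ x + b1
h = np.tanh(z1)
yhat = W2 @ h + b2
L = 0.5 * np.sum((yhat - y) ** 2)   # squared-error loss

# backward pass: the chain rule, one layer at a time
dyhat = yhat - y                 # dL/dyhat
dW2 = np.outer(dyhat, h)         # dL/dW2
db2 = dyhat                      # dL/db2
dh = W2.T @ dyhat                # gradient pushed back through W2
dz1 = dh * (1 - h ** 2)          # tanh'(z) = 1 - tanh(z)^2
dW1 = np.outer(dz1, x)           # dL/dW1
db1 = dz1                        # dL/db1
```

Each `d…` line is one application of the chain rule; adding more layers just repeats the last four lines per layer.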
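The "checked live against finite differences" in lesson 04 refers to the standard gradient check: nudge each weight by a small epsilon, measure the change in loss, and compare against the analytic gradient. A sketch for a single linear neuron (the toy loss and function names are my own, not the course's):

```python
import numpy as np

def loss(w, x, y):
    # squared error of a single linear neuron
    return 0.5 * (np.dot(w, x) - y) ** 2

def analytic_grad(w, x, y):
    # dL/dw = (w.x - y) * x, by the chain rule
    return (np.dot(w, x) - y) * x

def numeric_grad(w, x, y, eps=1e-5):
    # central finite differences, one coordinate at a time
    g = np.zeros_like(w)
    for i in range(len(w)):
        wp, wm = w.copy(), w.copy()
        wp[i] += eps
        wm[i] -= eps
        g[i] = (loss(wp, x, y) - loss(wm, x, y)) / (2 * eps)
    return g

w = np.array([0.3, -0.2])
x = np.array([1.0, 2.0])
y = 0.7
print(np.max(np.abs(analytic_grad(w, x, y) - numeric_grad(w, x, y))))
```

If the derivation is right, the printed maximum discrepancy is tiny (on the order of the epsilon-squared truncation error).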
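The initializers in lesson 06 reduce to choosing a per-layer standard deviation: Xavier scales variance by fan-in plus fan-out (suited to tanh), He by fan-in alone (suited to ReLU). A hedged sketch of both (shapes and seeding are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def xavier(fan_in, fan_out):
    # Glorot/Xavier: variance 2 / (fan_in + fan_out)
    std = np.sqrt(2.0 / (fan_in + fan_out))
    return rng.normal(0.0, std, size=(fan_out, fan_in))

def he(fan_in, fan_out):
    # He: variance 2 / fan_in
    std = np.sqrt(2.0 / fan_in)
    return rng.normal(0.0, std, size=(fan_out, fan_in))

W = he(512, 256)
print(W.std())  # should land near sqrt(2/512) ~ 0.0625
```

Too large a standard deviation and activations (and hence gradients) grow multiplicatively with depth; these scalings keep the variance roughly constant layer to layer.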