Backpropagation from Scratch

Backpropagation is the algorithm for computing the gradients of a neural network's loss with respect to its weights and biases by applying the chain rule of calculus.

Forward Pass

Given input $x$, weights $W$, biases $b$:

  • Pre-activation: $z = Wx + b$
  • Activation: $a = \sigma(z)$
  • Loss: $L = \frac{1}{n}\sum_i (\hat{y}_i - y_i)^2$
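The three steps above can be sketched in NumPy. The shapes and data here are made up purely for illustration: a single layer with 3 input features, 2 outputs, and a batch of 4 samples, with $\sigma$ taken to be the logistic sigmoid.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))   # inputs: one column per sample (demo data)
y = rng.normal(size=(2, 4))   # targets (demo data)
W = rng.normal(size=(2, 3))   # weights
b = np.zeros((2, 1))          # biases

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

z = W @ x + b                 # pre-activation: z = Wx + b
a = sigmoid(z)                # activation:     a = sigma(z)
L = np.mean((a - y) ** 2)     # loss: mean squared error over all entries
```

Here the $\frac{1}{n}$ in the loss is taken over every entry of the output, which is what `np.mean` computes.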

Backward Pass (Chain Rule)

$$\frac{\partial L}{\partial W} = \frac{\partial L}{\partial a} \cdot \frac{\partial a}{\partial z} \cdot \frac{\partial z}{\partial W}$$
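Continuing the same made-up setup, each factor of the chain rule can be computed explicitly. For the sigmoid, $\sigma'(z) = \sigma(z)\,(1 - \sigma(z))$, so $\frac{\partial a}{\partial z}$ is available directly from the forward-pass activations:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))   # same demo shapes as before
y = rng.normal(size=(2, 4))
W = rng.normal(size=(2, 3))
b = np.zeros((2, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Forward pass
z = W @ x + b
a = sigmoid(z)
n = a.size                                # the loss averages over all n entries

# Backward pass: one factor of the chain rule at a time
dL_da = 2.0 * (a - y) / n                 # dL/da for the mean-squared loss
da_dz = a * (1.0 - a)                     # sigma'(z) = sigma(z)(1 - sigma(z))
dL_dz = dL_da * da_dz                     # dL/dz, elementwise
dL_dW = dL_dz @ x.T                       # dL/dW: outer products summed over the batch
dL_db = dL_dz.sum(axis=1, keepdims=True)  # dL/db: sum over the batch
```

The matrix forms `dL_dz @ x.T` and the batch sum for `dL_db` come from $z = Wx + b$, where each column of `x` contributes one sample's gradient.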

Gradient Descent Update

$$W \leftarrow W - \eta \cdot \frac{\partial L}{\partial W}$$
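Putting forward pass, backward pass, and the update rule together gives a complete training loop. This is a minimal sketch on the same made-up data as above; the learning rate $\eta = 0.5$ and step count are hypothetical choices, not tuned values:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))   # demo inputs
y = rng.normal(size=(2, 4))   # demo targets
W = rng.normal(size=(2, 3))
b = np.zeros((2, 1))
eta = 0.5                     # learning rate (hypothetical choice)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

losses = []
for step in range(500):
    # Forward pass
    z = W @ x + b
    a = sigmoid(z)
    losses.append(np.mean((a - y) ** 2))
    # Backward pass (chain rule, as derived above)
    dL_dz = (2.0 * (a - y) / a.size) * a * (1.0 - a)
    dL_dW = dL_dz @ x.T
    dL_db = dL_dz.sum(axis=1, keepdims=True)
    # Gradient descent update: W <- W - eta * dL/dW, and likewise for b
    W -= eta * dL_dW
    b -= eta * dL_db
```

With a small enough $\eta$ the recorded loss should trend downward, which is a quick sanity check that the gradients are correct.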