Neural Networks from Scratch · deep-learning · beginner · Lesson 1 of 10 (10% complete)

Backpropagation from Scratch

Backpropagation is the algorithm for computing the gradient of the loss with respect to every weight and bias in a neural network, by applying the chain rule of calculus backward through the layers.

Forward Pass

Given input $x$, weights $W$, biases $b$:

  • Pre-activation: $z = Wx + b$
  • Activation: $a = \sigma(z)$
  • Loss (mean squared error): $L = \frac{1}{n}\sum_i (\hat{y}_i - y_i)^2$, where the prediction $\hat{y}$ is the activation $a$ of the output layer
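The three steps above can be sketched in NumPy. The shapes (3 inputs, 2 outputs, a batch of 4 samples stored as columns) and the random data are illustrative assumptions, not part of the lesson:

```python
import numpy as np

def sigmoid(z):
    # Elementwise logistic activation: sigma(z) = 1 / (1 + e^{-z})
    return 1.0 / (1.0 + np.exp(-z))

# Assumed toy shapes: 3 inputs, 2 outputs, batch of 4 samples (columns).
rng = np.random.default_rng(0)
W = rng.normal(size=(2, 3))   # weights
b = rng.normal(size=(2, 1))   # biases
x = rng.normal(size=(3, 4))   # inputs, one sample per column
y = rng.normal(size=(2, 4))   # targets

z = W @ x + b                 # pre-activation
a = sigmoid(z)                # activation; at the output layer this is y_hat
L = np.mean((a - y) ** 2)     # mean squared error loss
```

Because `b` has shape `(2, 1)`, NumPy broadcasting adds the same bias to every sample in the batch.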

Backward Pass (Chain Rule)

$$\frac{\partial L}{\partial W} = \frac{\partial L}{\partial a} \cdot \frac{\partial a}{\partial z} \cdot \frac{\partial z}{\partial W}$$
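Each factor of the chain rule maps onto one line of code. A minimal sketch for the sigmoid/MSE setup of the forward pass (shapes and random data are again assumed for illustration):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Same assumed toy setup as the forward pass.
rng = np.random.default_rng(0)
W = rng.normal(size=(2, 3))
b = rng.normal(size=(2, 1))
x = rng.normal(size=(3, 4))
y = rng.normal(size=(2, 4))

z = W @ x + b
a = sigmoid(z)

# Chain rule, factor by factor:
dL_da = 2.0 * (a - y) / a.size           # dL/da for the mean squared error
da_dz = a * (1.0 - a)                    # sigma'(z), written in terms of a
delta = dL_da * da_dz                    # dL/dz (elementwise product)
dL_dW = delta @ x.T                      # dL/dz · dz/dW
dL_db = delta.sum(axis=1, keepdims=True) # bias gradient, summed over the batch
```

Note that `dL_dW` has the same shape as `W`, which is exactly what the update rule below requires.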

Gradient Descent Update

$$W \leftarrow W - \eta \cdot \frac{\partial L}{\partial W}$$

where $\eta$ is the learning rate, which controls the step size.
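Putting the three sections together, a complete training step repeats forward pass, backward pass, and update. A sketch under the same assumptions as before (the learning rate, step count, and data are illustrative; targets are drawn in $(0, 1)$ so a sigmoid output can reach them):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
W = rng.normal(size=(2, 3))
b = rng.normal(size=(2, 1))
x = rng.normal(size=(3, 4))
y = rng.uniform(size=(2, 4))   # targets in (0, 1), reachable by a sigmoid

eta = 0.5                      # learning rate (illustrative choice)
losses = []
for step in range(200):
    # Forward pass
    z = W @ x + b
    a = sigmoid(z)
    losses.append(float(np.mean((a - y) ** 2)))
    # Backward pass (chain rule from the previous section)
    delta = (2.0 * (a - y) / a.size) * a * (1.0 - a)
    dW = delta @ x.T
    db = delta.sum(axis=1, keepdims=True)
    # Gradient descent update
    W -= eta * dW
    b -= eta * db
```

Tracking `losses` over the iterations is the simplest sanity check that the gradients are correct: with a working backward pass, the loss should trend downward.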

