Backpropagation from Scratch
Backpropagation is the algorithm for computing gradients in neural networks using the chain rule of calculus.
Forward Pass
Given input $x$, weights $W$, biases $b$:
- Pre-activation: $z = Wx + b$
- Activation: $a = \sigma(z)$
- Loss (mean squared error): $L = \frac{1}{n}\sum_i (\hat{y}_i - y_i)^2$, where the prediction is $\hat{y} = a$
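The forward pass above can be sketched in NumPy. The dimensions, seed, and target values here are illustrative choices, not part of the lesson:

```python
import numpy as np

def sigmoid(z):
    # Elementwise logistic function: 1 / (1 + exp(-z))
    return 1.0 / (1.0 + np.exp(-z))

# Illustrative toy setup: 3 input features, 2 output units
rng = np.random.default_rng(0)
W = rng.normal(size=(2, 3))   # weights
b = np.zeros(2)               # biases
x = rng.normal(size=3)        # a single input vector
y = np.array([0.0, 1.0])      # target

z = W @ x + b                 # pre-activation: z = Wx + b
a = sigmoid(z)                # activation: a = sigma(z), i.e. y_hat
L = np.mean((a - y) ** 2)     # mean squared error over the outputs
```

Each line maps one-to-one onto the three forward-pass equations above.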
Backward Pass (Chain Rule)
$$\frac{\partial L}{\partial W} = \frac{\partial L}{\partial a} \cdot \frac{\partial a}{\partial z} \cdot \frac{\partial z}{\partial W}$$
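With the MSE loss and sigmoid activation defined above, each factor in this product can be written explicitly:

$$\frac{\partial L}{\partial a} = \frac{2}{n}(a - y), \qquad \frac{\partial a}{\partial z} = \sigma(z)\,(1 - \sigma(z)) = a\,(1 - a), \qquad \frac{\partial z}{\partial W} = x^\top$$

so, writing $\delta = \frac{2}{n}(a - y) \odot a \odot (1 - a)$ for the error at the pre-activation,

$$\frac{\partial L}{\partial W} = \delta\, x^\top, \qquad \frac{\partial L}{\partial b} = \delta$$

where $\odot$ denotes elementwise multiplication.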
Gradient Descent Update
$$W \leftarrow W - \eta \cdot \frac{\partial L}{\partial W}$$