Neural Networks Demystified [Part 4: Backpropagation] (7:56)

2014-12-05

[public] 455K views, 5.45K likes, 275 dislikes

Backpropagation, as simple as possible, but no simpler. Perhaps the most misunderstood part of neural networks, backpropagation of errors is the key step that allows ANNs to learn. In this video, I give the derivation and thought process behind backpropagation using high-school-level calculus.

Supporting Code and Equations:

https://github.com/stephencwelch/Neural-Networks-Demystified
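To give a sense of the computation the video derives, here is a minimal sketch of the backward pass, assuming a single-hidden-layer network with sigmoid activations and a sum-of-squares cost. Variable names such as W1, W2, and yHat, and the toy data values, are illustrative assumptions in the spirit of the linked repository, not copied from it.

```python
# Minimal backpropagation sketch: one hidden layer, sigmoid activations,
# cost J = 0.5 * sum((y - yHat)^2). Illustrative only.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    # Derivative of the sigmoid, needed in the backward pass.
    return sigmoid(z) * (1.0 - sigmoid(z))

def cost_gradients(X, y, W1, W2):
    # Forward pass: input -> hidden -> output.
    z2 = X @ W1           # hidden-layer pre-activation
    a2 = sigmoid(z2)      # hidden-layer activation
    z3 = a2 @ W2          # output-layer pre-activation
    yHat = sigmoid(z3)    # prediction

    # Backward pass: propagate the error and form the gradients.
    delta3 = -(y - yHat) * sigmoid_prime(z3)      # error at the output layer
    dJdW2 = a2.T @ delta3                         # gradient for hidden->output weights
    delta2 = (delta3 @ W2.T) * sigmoid_prime(z2)  # error pushed back to the hidden layer
    dJdW1 = X.T @ delta2                          # gradient for input->hidden weights
    return dJdW1, dJdW2

# Usage with toy dimensions (2 inputs, 3 hidden units, 1 output).
X = np.array([[3.0, 5.0], [5.0, 1.0], [10.0, 2.0]]) / 10.0
y = np.array([[0.75], [0.82], [0.93]])
W1 = np.random.randn(2, 3)
W2 = np.random.randn(3, 1)
dJdW1, dJdW2 = cost_gradients(X, y, W1, W2)
print(dJdW1.shape, dJdW2.shape)  # (2, 3) (3, 1)
```

These gradients are what gradient descent (Part 3) uses to update W1 and W2, and Part 5 shows how to sanity-check them numerically.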

In this series, we will build and train a complete Artificial Neural Network in Python. New videos every other Friday.

Part 1: Data + Architecture

Part 2: Forward Propagation

Part 3: Gradient Descent

Part 4: Backpropagation

Part 5: Numerical Gradient Checking

Part 6: Training

Part 7: Overfitting, Testing, and Regularization

@stephencwelch