2015-01-02
After all that work, it's finally time to train our Neural Network. We'll use the BFGS numerical optimization algorithm and have a look at the results. A rough sketch of the training call appears below the links.
Supporting Code:
https://github.com/stephencwelch/Neural-Networks-Demystified
Yann LeCun's Efficient BackProp paper: http://yann.lecun.com/exdb/publis/pdf/lecun-98b.pdf
More on BFGS:
http://en.wikipedia.org/wiki/Broyden%E2%80%93Fletcher%E2%80%93Goldfarb%E2%80%93Shanno_algorithm
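For the general idea, here is a minimal sketch (not the repo's exact code) of training a tiny two-layer network with SciPy's BFGS optimizer. The sleep/study data, weight shapes, and the use of a numerically estimated gradient are illustrative assumptions; the video derives the gradient analytically via backpropagation and passes it to the optimizer instead.

```python
# Minimal sketch: train a 2-3-1 network with scipy's BFGS optimizer.
# Data values and shapes are illustrative, not the repo's exact code.
import numpy as np
from scipy import optimize

X = np.array([[3, 5], [5, 1], [10, 2]], dtype=float)  # hours sleep, hours study
y = np.array([[75], [82], [93]], dtype=float)          # test scores
X = X / np.amax(X, axis=0)                             # normalize inputs to [0, 1]
y = y / 100.0                                          # normalize outputs to [0, 1]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def unpack(params):
    # Reshape the flat parameter vector into the two weight matrices (2x3 and 3x1).
    W1 = params[:6].reshape(2, 3)
    W2 = params[6:].reshape(3, 1)
    return W1, W2

def cost(params):
    # Forward propagation followed by a squared-error cost.
    W1, W2 = unpack(params)
    yHat = sigmoid(sigmoid(X @ W1) @ W2)
    return 0.5 * np.sum((y - yHat) ** 2)

# BFGS builds up an approximation to the inverse Hessian from successive
# gradient evaluations; with no jac supplied, SciPy estimates the gradient
# numerically (the series supplies the backprop gradient instead).
res = optimize.minimize(cost, np.random.randn(9), method='BFGS',
                        options={'maxiter': 200})
print('final cost:', res.fun)
```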
In this series, we will build and train a complete Artificial Neural Network in Python. New videos every other Friday.
Part 1: Data + Architecture
Part 2: Forward Propagation
Part 3: Gradient Descent
Part 4: Backpropagation
Part 5: Numerical Gradient Checking
Part 6: Training
Part 7: Overfitting, Testing, and Regularization
Follow me on Twitter for updates:
@stephencwelch