But what is a neural network? | Chapter 1, Deep learning

2017-10-05 · 19:13

[public] 9.49M views, 417K likes, 2.16K dislikes

3Blue1Brown

What are the neurons, why are there layers, and what is the math underlying it?

Help fund future projects: https://www.patreon.com/3blue1brown

Written/interactive form of this series: https://www.3blue1brown.com/topics/neural-networks

Additional funding for this project provided by Amplify Partners

Typo correction: At 14:45, the last index on the bias vector is written as n, when in fact it should be a k. Thanks for the sharp eyes that caught that!
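
For context, the expression at that timestamp packages one layer of the network into matrix form. Roughly, with activations a_0^{(0)}, ..., a_n^{(0)} in one layer feeding k+1 neurons in the next, the corrected version reads:

a^{(1)} = \sigma\!\left( W a^{(0)} + b \right), \qquad W \in \mathbb{R}^{(k+1)\times(n+1)}, \qquad b = (b_0, b_1, \ldots, b_k)^{\mathsf{T}}

so the weight matrix has k+1 rows, and the bias vector's final entry is b_k, not b_n.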

For those who want to learn more, I highly recommend the book by Michael Nielsen introducing neural networks and deep learning: https://goo.gl/Zmczdy

There are two neat things about this book. First, it's available for free, so consider joining me in sending a donation Nielsen's way if you get something out of it. And second, it's centered around walking through code and data that you can download yourself, and that cover the same example I introduce in this video (a small sketch of that forward pass follows the repository link below). Yay for active learning!

https://github.com/mnielsen/neural-networks-and-deep-learning
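
For a taste of what that forward pass looks like, here is a minimal sketch in the spirit of the book's network.py, but with made-up random weights rather than anything trained, using the layer sizes from the video (784 input pixels, two hidden layers of 16 sigmoid neurons, 10 outputs):

import numpy as np

def sigmoid(z):
    # Squash each weighted sum into the range (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def feedforward(a, weights, biases):
    # Apply each layer in turn: a <- sigmoid(W a + b).
    for W, b in zip(weights, biases):
        a = sigmoid(W @ a + b)
    return a

# Placeholder network with the video's layer sizes: 784 -> 16 -> 16 -> 10.
sizes = [784, 16, 16, 10]
rng = np.random.default_rng(0)
weights = [rng.standard_normal((m, n)) for n, m in zip(sizes[:-1], sizes[1:])]
biases = [rng.standard_normal((m, 1)) for m in sizes[1:]]

pixels = rng.random((784, 1))  # stand-in for a flattened 28x28 grayscale digit
print(feedforward(pixels, weights, biases).ravel())  # ten activations, one per digit

Nielsen's repository has the real version of this, including the training loop (gradient descent and backpropagation) that the later videos in this series cover.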

I also highly recommend Chris Olah's blog: http://colah.github.io/

For more videos, Welch Labs also has some great series on machine learning:

/youtube/video/i8D90DkCLhI

/youtube/video/bxe2T-V8XRs

For those of you looking to go *even* deeper, check out the text "Deep Learning" by Goodfellow, Bengio, and Courville.

Also, the publication Distill is just utterly beautiful: https://distill.pub/

Lion photo by Kevin Pluck

Thanks to these viewers for their contributions to translations:

German: @fpgro

Hebrew: Omer Tuchfeld

Hungarian: Máté Kaszap

Italian: Teo Bucci (@teobucci)

-----------------

Timeline:

0:00 - Introduction example

1:07 - Series preview

2:42 - What are neurons?

3:35 - Introducing layers

5:31 - Why layers?

8:38 - Edge detection example

11:34 - Counting weights and biases

12:30 - How learning relates

13:26 - Notation and linear algebra

15:17 - Recap

16:27 - Some final words

17:03 - ReLU vs Sigmoid

Correction 14:45 - The final index on the bias vector should be "k"

------------------

Animations largely made using manim, a scrappy open-source Python library. https://github.com/3b1b/manim

If you want to check it out, I feel compelled to warn you that it's not the most well-documented tool, and has many other quirks you might expect in a library someone wrote with only their own use in mind.
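
If you'd like a sense of what a scene file looks like before diving in, here is a minimal sketch using the community fork's documented API; the repository linked above uses different import paths and some different animation names, so treat this as a flavor of the workflow rather than code guaranteed to run against that exact repo.

from manim import *  # community edition ("pip install manim"); 3b1b's own repo differs

class SquareToCircle(Scene):
    def construct(self):
        # Draw a square, then animate it morphing into a filled circle.
        square = Square()
        circle = Circle().set_fill(BLUE, opacity=0.5)
        self.play(Create(square))
        self.play(Transform(square, circle))
        self.wait()

# Rendered from the command line, e.g.:  manim -pql scene.py SquareToCircle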

Music by Vincent Rubinetti.

Download the music on Bandcamp:

https://vincerubinetti.bandcamp.com/album/the-music-of-3blue1brown

Stream the music on Spotify:

https://open.spotify.com/album/1dVyjwS8FBqXhRunaG5W5u

If you want to contribute translated subtitles or to help review those that have already been made by others and need approval, you can click the gear icon in the video and go to subtitles/cc, then "add subtitles/cc". I really appreciate those who do this, as it helps make the lessons accessible to more people.

------------------

3blue1brown is a channel about animating math, in all senses of the word animate. And you know the drill with YouTube: if you want to stay posted on new videos, subscribe, and click the bell to receive notifications (if you're into that).

If you are new to this channel and want to see more, a good place to start is this playlist: http://3b1b.co/recommended

Various social media stuffs:

Website: https://www.3blue1brown.com

Twitter: https://twitter.com/3Blue1Brown

Patreon: https://patreon.com/3blue1brown

Facebook: https://www.facebook.com/3blue1brown

Reddit: https://www.reddit.com/r/3Blue1Brown


------------------

Videos referenced:

Linear transformations and matrices | Chapter 3, Essence of linear algebra by 3Blue1Brown
/youtube/video/kYB8IZa5AuE
Essence of linear algebra by 3Blue1Brown
/youtube/video/fNk_zzaMoSs
Gradient descent, how neural networks learn | Chapter 2, Deep learning
/youtube/video/IHZwWFHWa-w