
Backpropagation Explained - Siraj Raval - Deep Learning Open Course - Cupoy


The most popular optimization strategy in machine learning is called gradient descent. When gradient descent is applied to neural networks, it's called backpropagation. In this video, I'll use analogies, animations, equations, and code to give you an in-depth understanding of this technique. Once you feel comfortable with backpropagation, everything else becomes easier. It uses calculus to help us update our machine learning models. Enjoy!
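The idea described above — using calculus (the chain rule) to compute gradients and then updating weights by gradient descent — can be sketched in a few lines. This is a minimal illustration, not the code from the video: it trains a one-hidden-unit network on a single example, and all names and values are made up for the sketch.

```python
import math

def sigmoid(z):
    """Logistic activation; its derivative is sigmoid(z) * (1 - sigmoid(z))."""
    return 1.0 / (1.0 + math.exp(-z))

x, y = 1.0, 0.5          # one training example (illustrative values)
w1, w2 = 0.8, -0.4       # arbitrary initial weights
lr = 0.5                 # learning rate

for step in range(200):
    # Forward pass: compute the prediction and the loss
    h = sigmoid(w1 * x)
    y_hat = w2 * h
    loss = 0.5 * (y_hat - y) ** 2

    # Backward pass (backpropagation): apply the chain rule layer by layer
    dloss_dyhat = y_hat - y
    dloss_dw2 = dloss_dyhat * h
    dloss_dh = dloss_dyhat * w2
    dloss_dw1 = dloss_dh * h * (1 - h) * x   # sigmoid'(w1*x) = h * (1 - h)

    # Gradient-descent update: step each weight against its gradient
    w1 -= lr * dloss_dw1
    w2 -= lr * dloss_dw2

print(loss)  # the loss shrinks toward zero as training proceeds
```

The backward pass is just the chain rule applied in reverse order of the forward pass; deep-learning libraries automate exactly this bookkeeping for networks with millions of weights.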