
Lecture 5: Backpropagation and Project Advice - Stanford University School of Engineering - Deep Learning Open Course - Cupoy

Lecture 5 discusses how neural networks are trained with gradient descent, using the backpropagation algorithm to compute gradients of the loss with respect to every parameter via the chain rule. Key phrases: neural networks, forward computation, backward propagation, neuron units, max-margin loss, gradient checks, Xavier parameter initialization, learning rates, Adagrad.
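
The topics above fit together in one small worked example. Below is a minimal NumPy sketch (not code from the lecture) of a one-hidden-layer network that combines Xavier parameter initialization, forward computation, backpropagation, a numerical gradient check, and Adagrad updates. It uses a simple squared-error loss rather than the lecture's max-margin loss for brevity, and every function and variable name here is illustrative.

# Minimal sketch: Xavier init, forward pass, backprop, gradient check, Adagrad.
import numpy as np

rng = np.random.default_rng(0)

def xavier(fan_in, fan_out):
    """Xavier/Glorot initialization: uniform in +/- sqrt(6 / (fan_in + fan_out))."""
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

# Toy data: 4 examples, 3 input features, scalar regression target.
X = rng.normal(size=(4, 3))
y = rng.normal(size=(4, 1))

# Parameters of a 3 -> 5 -> 1 network with tanh hidden units.
params = {"W1": xavier(3, 5), "b1": np.zeros((1, 5)),
          "W2": xavier(5, 1), "b2": np.zeros((1, 1))}

def forward(params, X):
    """Forward computation; returns prediction and the cached hidden activation."""
    h = np.tanh(X @ params["W1"] + params["b1"])
    yhat = h @ params["W2"] + params["b2"]
    return yhat, h

def loss(params, X, y):
    yhat, _ = forward(params, X)
    return 0.5 * np.mean((yhat - y) ** 2)

def backward(params, X, y):
    """Backpropagation: apply the chain rule layer by layer, from the loss down."""
    yhat, h = forward(params, X)
    n = X.shape[0]
    d_yhat = (yhat - y) / n                        # dL/dyhat for the mean squared loss
    grads = {"W2": h.T @ d_yhat,
             "b2": d_yhat.sum(axis=0, keepdims=True)}
    d_h = d_yhat @ params["W2"].T * (1 - h ** 2)   # tanh'(z) = 1 - tanh(z)^2
    grads["W1"] = X.T @ d_h
    grads["b1"] = d_h.sum(axis=0, keepdims=True)
    return grads

def gradient_check(params, X, y, eps=1e-5):
    """Compare analytic gradients against centered finite differences."""
    grads = backward(params, X, y)
    for name, p in params.items():
        numeric = np.zeros_like(p)
        for idx in np.ndindex(p.shape):
            old = p[idx]
            p[idx] = old + eps; plus = loss(params, X, y)
            p[idx] = old - eps; minus = loss(params, X, y)
            p[idx] = old
            numeric[idx] = (plus - minus) / (2 * eps)
        print(f"{name}: max |numeric - analytic| = {np.max(np.abs(numeric - grads[name])):.2e}")

# Adagrad: per-parameter learning rates that shrink as squared gradients accumulate.
cache = {k: np.zeros_like(v) for k, v in params.items()}
lr = 0.1
for step in range(200):
    grads = backward(params, X, y)
    for k in params:
        cache[k] += grads[k] ** 2
        params[k] -= lr * grads[k] / (np.sqrt(cache[k]) + 1e-8)

gradient_check(params, X, y)
print("final loss:", loss(params, X, y))

The gradient check should report differences on the order of 1e-7 or smaller; a large discrepancy for some parameter is the usual sign of a bug in the corresponding backward-pass term.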