Lecture 11: Gated Recurrent Units and Further Topics in NMT - Stanford University School of Engineering
Lecture 11 provides a final look at gated recurrent units (GRUs/LSTMs), followed by machine translation evaluation, handling large-vocabulary output, and sub-word and character-based models. It also includes the research highlight "Lip reading sentences in the wild."
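To make the "gated" idea concrete, here is a minimal sketch of a single GRU step in NumPy. The parameter names (`Wz`, `Uz`, etc.) and toy dimensions are illustrative assumptions, not from the lecture itself; the gate equations follow the standard GRU formulation, where an update gate z interpolates between the old hidden state and a candidate state.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_cell(x, h, params):
    """One GRU step: gates decide how much of the previous
    hidden state h to keep versus overwrite."""
    Wz, Uz, bz, Wr, Ur, br, Wh, Uh, bh = params
    z = sigmoid(Wz @ x + Uz @ h + bz)               # update gate
    r = sigmoid(Wr @ x + Ur @ h + br)               # reset gate
    h_tilde = np.tanh(Wh @ x + Uh @ (r * h) + bh)   # candidate state
    return (1 - z) * h + z * h_tilde                # interpolate old/new

# Toy dimensions (assumed for illustration): 4-dim input, 3-dim hidden.
rng = np.random.default_rng(0)
d_in, d_h = 4, 3
shapes = [(d_h, d_in), (d_h, d_h), (d_h,)] * 3      # (Wz,Uz,bz, Wr,Ur,br, Wh,Uh,bh)
params = [rng.standard_normal(s) * 0.1 for s in shapes]
h = np.zeros(d_h)
for x in rng.standard_normal((5, d_in)):            # run 5 timesteps
    h = gru_cell(x, h, params)
```

Because the update gate interpolates rather than overwrites, gradients can flow through the `(1 - z) * h` shortcut, which is the mechanism that lets GRUs (and, analogously, LSTMs) retain information over long sequences.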
Key phrases: Seq2Seq and Attention Mechanisms, Neural Machine Translation, Speech Processing