Winter 2021 CS291A Syllabus

 
*2/3 Attention mechanisms
 
 
** [https://www.aclweb.org/anthology/D15-1044.pdf A Neural Attention Model for Sentence Summarization]
 
** [https://arxiv.org/abs/1409.0473 Neural Machine Translation by Jointly Learning to Align and Translate]
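The readings above introduce attention for summarization and translation. As a companion to them, here is a minimal NumPy sketch of (scaled) dot-product attention, the scoring variant later adopted by the Transformer; the shapes and random inputs are illustrative only, not from any of the listed papers.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the row max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # scores[i, j] measures how much query i attends to key j.
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    weights = softmax(scores, axis=-1)  # each row is a distribution over keys
    return weights @ V, weights

# Toy example: 2 queries attending over 5 key/value pairs of dimension 4.
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(5, 4))
V = rng.normal(size=(5, 4))
out, weights = scaled_dot_product_attention(Q, K, V)
```

Each output row is a convex combination of the value vectors, with mixing weights given by the softmax over query-key similarities.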
 
*2/8 Transformer and BERT (Mid-term report due [https://forms.gle/3mLA46FANTDZ5s5FA submission link])
 
 
** [https://arxiv.org/abs/1906.08237 XLNet: Generalized Autoregressive Pretraining for Language Understanding]
 

Revision as of 20:15, 1 January 2021