Winter 2021 CS291A Syllabus

 
** [https://arxiv.org/pdf/1410.3916.pdf Memory Networks]
 
*1/27 Sequence-to-sequence models
 
** []
 
** [https://www.aclweb.org/anthology/N19-4009/ fairseq: A Fast, Extensible Toolkit for Sequence Modeling]
 
*2/1 Convolutional Neural Networks (HW1 due and HW2 out)
 
** [https://www.nature.com/articles/s41586-019-1923-7 Improved protein structure prediction using potentials from deep learning]
 
*2/3 Attention mechanisms
 
** [https://www.aclweb.org/anthology/D15-1044.pdf A Neural Attention Model for Sentence Summarization]
 
*2/8 Transformer and BERT (Mid-term report due [https://forms.gle/3mLA46FANTDZ5s5FA submission link])
 
** [https://arxiv.org/abs/1906.08237 XLNet: Generalized Autoregressive Pretraining for Language Understanding]
 
Revision as of 20:13, 1 January 2021