Winter 2021 CS291A Syllabus

 
*1/25 LSTMs/GRUs
** [https://arxiv.org/pdf/1802.05365.pdf Deep contextualized word representations]
** [https://arxiv.org/pdf/1801.06146.pdf Universal Language Model Fine-tuning for Text Classification]
 
*1/27 Sequence-to-sequence models
*2/1 Convolutional Neural Networks (HW1 due and HW2 out)
*2/3 Attention mechanisms
*2/8 Transformer and BERT (Mid-term report due [https://forms.gle/3mLA46FANTDZ5s5FA submission link])
*2/10 BERT (continued)
** [https://arxiv.org/abs/1906.08237 XLNet: Generalized Autoregressive Pretraining for Language Understanding]
** [https://arxiv.org/abs/1907.11692 RoBERTa: A Robustly Optimized BERT Pretraining Approach]
** [https://arxiv.org/abs/1909.11942 ALBERT: A Lite BERT for Self-supervised Learning of Language Representations]
*2/10 Mid-term project updates
 
*2/15 University Holiday: Presidents' Day
*2/17 Language and vision

Revision as of 02:01, 1 January 2021