Winter 2021 CS291A Syllabus
- 1/4 Introduction, logistics, and deep learning
- 1/6 Tips for a successful class project
- 1/11 Neural network basics & backpropagation
- 1/13 Word embeddings (Project proposal due: submission link; HW1 out)
- 1/18 University Holiday: Martin Luther King Jr. Day
- 1/20 RNNs
- 1/25 LSTMs/GRUs
  - Deep contextualized word representations (https://arxiv.org/pdf/1802.05365.pdf)
  - Universal Language Model Fine-tuning for Text Classification (https://arxiv.org/pdf/1801.06146.pdf)
- 1/27 Sequence-to-sequence models
- 2/1 Convolutional Neural Networks (HW1 due and HW2 out)
- 2/3 Attention mechanisms
- 2/8 Transformer and BERT (Mid-term report due: https://forms.gle/3mLA46FANTDZ5s5FA)
  - XLNet: Generalized Autoregressive Pretraining for Language Understanding (https://arxiv.org/abs/1906.08237)
  - RoBERTa: A Robustly Optimized BERT Pretraining Approach (https://arxiv.org/abs/1907.11692)
  - ALBERT: A Lite BERT for Self-supervised Learning of Language Representations
- 2/10 Mid-term project updates
- 2/15 University Holiday: Presidents' Day
- 2/17 Language and vision
- 2/22 Deep Reinforcement Learning 1 (HW2 due: 2/26 Friday 11:59pm)
- 2/24 Deep Reinforcement Learning 2
- 3/1 Generative Adversarial Networks
- 3/3 Project: final presentation (1)
- 3/8 Project: final presentation (2)
- 3/10 Project: final presentation (3)
- 3/19 11:59pm PT: Project final report due (submission link)