Winter 2021 CS291A Syllabus
*1/4 Introduction, logistics, and deep learning
*1/6 Tips for a successful class project
*1/11 Neural network basics and backpropagation
*1/13 Word embeddings (Project proposal due [https://forms.gle/TjYSjc5iE1Zm24ED8 submission link]; HW1 out)
** [https://www.aclweb.org/anthology/Q17-1010/ Enriching Word Vectors with Subword Information]
** [https://www.aclweb.org/anthology/C18-1139/ Contextual String Embeddings for Sequence Labeling]
** [https://www.aclweb.org/anthology/D14-1162/ GloVe: Global Vectors for Word Representation]
*1/18 University Holiday: Martin Luther King Jr. Day
*1/20 RNNs
** [http://www.fit.vutbr.cz/research/groups/speech/publi/2010/mikolov_interspeech2010_IS100722.pdf Recurrent neural network based language model]
*1/25 LSTMs/GRUs
** [https://arxiv.org/pdf/1802.05365.pdf Deep contextualized word representations]
*1/27 Sequence-to-sequence models
*2/1 Convolutional Neural Networks (HW1 due and HW2 out)
** [https://www.nature.com/articles/s41586-019-1923-7 Improved protein structure prediction using potentials from deep learning]
*2/3 Attention mechanisms
*2/8 Transformer and BERT (Mid-term report due [https://forms.gle/3mLA46FANTDZ5s5FA submission link])
*2/10 BERT (continued)
*2/15 University Holiday: Presidents' Day
*2/17 Language and vision
** [https://openreview.net/pdf?id=YicbFdNTTy An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale]
*2/22 Deep Reinforcement Learning 1 (HW2 due 2/26 at 11:59pm)
*2/24 Deep Reinforcement Learning 2
** [https://openreview.net/pdf?id=S1g2skStPB Causal Discovery with Reinforcement Learning]
** [https://www.nature.com/articles/s41586-019-1724-z Grandmaster level in StarCraft II using multi-agent reinforcement learning]
** [https://www.nature.com/articles/s41586-020-03051-4 Mastering Atari, Go, chess and shogi by planning with a learned model]
*3/1 Generative Adversarial Networks
** [https://arxiv.org/abs/1701.04862 Towards Principled Methods for Training Generative Adversarial Networks]
** [https://arxiv.org/abs/1701.07875 Wasserstein GAN]
** [https://arxiv.org/abs/1703.10717 Boundary Equilibrium GAN]
*3/3 Project: final presentation (1)
*3/8 Project: final presentation (2)
*3/10 Project: final presentation (3)
*3/19 Project final report due by 11:59pm PT ([https://forms.gle/kgN8n8XDz83NWdxo9 submission link])