Winter 2018 CS291A Syllabus
- 04/04 Introduction, logistics, NLP, and deep learning.
- 04/06 Tips for a successful class project
- 04/11 NLP Tasks
- 04/13 Word embeddings
  - Efficient Non-parametric Estimation of Multiple Embeddings per Word in Vector Space, Neelakantan et al., EMNLP 2014
  - GloVe: Global Vectors for Word Representation, Pennington et al., EMNLP 2014
  - AutoExtend: Extending Word Embeddings to Embeddings for Synsets and Lexemes, Rothe and Schütze, ACL 2015
- 04/18 Neural network basics (Project proposal due, HW1 out)
- 04/20 Recursive Neural Networks
  - Parsing with Compositional Vector Grammars, Socher et al., ACL 2013 (https://nlp.stanford.edu/pubs/SocherBauerManningNg_ACL2013.pdf)
  - Recursive Deep Models for Semantic Compositionality Over a Sentiment Treebank, Socher et al., EMNLP 2013 (https://nlp.stanford.edu/~socherr/EMNLP2013_RNTN.pdf)
- 04/25 Recurrent Neural Networks (RNNs)
  - Recurrent neural network based language model, Mikolov et al., Interspeech 2010 (http://www.fit.vutbr.cz/research/groups/speech/publi/2010/mikolov_interspeech2010_IS100722.pdf)
  - Generating Sequences With Recurrent Neural Networks, Alex Graves, arXiv 2013 (https://arxiv.org/pdf/1308.0850.pdf)
- 04/27 LSTMs/GRUs
- 05/02 Sequence-to-sequence models and neural machine translation (HW1 due and HW2 out)
- 05/04 Attention mechanisms
- 05/09 Project: mid-term presentation (1)
- 05/11 Project: mid-term presentation (2)
- 05/16 Convolutional Neural Networks (HW2 due)
- 05/18 Language and vision
- 05/23 Deep Reinforcement Learning 1
- 05/25 Deep Reinforcement Learning 2
- 05/30 Unsupervised Learning
- 06/01 Project: final presentation (1)
- 06/06 Project: final presentation (2)
- 06/08 Project: final presentation (3)
- 06/10 Project final report due by 11:59 PM PT.