Spring 2017 CS292F Syllabus
*04/27 LSTMs/GRUs
** [http://www.bioinf.jku.at/publications/older/2604.pdf Long short-term memory, S. Hochreiter and J. Schmidhuber, Neural Computation, 1997]
** [https://arxiv.org/pdf/1409.1259.pdf On the Properties of Neural Machine Translation: Encoder–Decoder Approaches, Cho et al., 2014]
** [https://arxiv.org/pdf/1502.02367v3.pdf Gated Feedback Recurrent Neural Networks, Chung et al., ICML 2015]
*05/02 Sequence-to-sequence models and neural machine translation (HW1 due and HW2 out)
** [https://arxiv.org/pdf/1406.1078.pdf Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation, Cho et al., EMNLP 2014]
Revision as of 12:59, 5 April 2017
- 04/04 Introduction, logistics, NLP, and deep learning.
- 04/06 Tips for a successful class project
- 04/11 Word embeddings
- 04/13 Knowledge base embeddings
- A three-way model for collective learning on multi-relational data, M Nickel, V Tresp, HP Kriegel, ICML 2011
- Translating embeddings for modeling multi-relational data, A Bordes, N Usunier, A Garcia-Duran, NIPS 2013
- A Review of Relational Machine Learning for Knowledge Graphs, Nickel et al., Proceedings of the IEEE, 2016
- 04/18 Neural network basics (Project proposal due, HW1 out)
- 04/20 Recursive Neural Networks
- 04/25 RNNs (NLP seminar: Stanford NLP's Jiwei Li 04/26)
- 04/27 LSTMs/GRUs
- 05/02 Sequence-to-sequence models and neural machine translation (HW1 due and HW2 out)
- 05/04 Attention mechanisms
- 05/09 Project: mid-term presentation (1)
- 05/11 Project: mid-term presentation (2)
- 05/16 Convolutional Neural Networks (HW2 due)
- 05/18 Language and vision
- 05/23 Deep Reinforcement Learning 1
- 05/25 Deep Reinforcement Learning 2
- 05/30 Unsupervised Learning
- 06/01 Project: final presentation (1)
- 06/06 Project: final presentation (2)
- 06/08 Project: final presentation (3)
- 06/10 Project final report due (11:59 PM PT)