Spring 2017 CS292F Syllabus

*04/04 Introduction, logistics, NLP, and deep learning.
 
*04/06 Tips for a successful class project
 
*04/11 Word embeddings  
 
** [https://ronan.collobert.com/pub/matos/2008_nlp_icml.pdf A Unified Architecture for Natural Language Processing: Deep Neural Networks with Multitask Learning,  Collobert and Weston, ICML 2008]
 
** [http://papers.nips.cc/paper/5021-distributed-representations-of-words-and-phrases-and-their-compositionality.pdf Distributed Representations of Words and Phrases and their Compositionality, T Mikolov, I Sutskever, K Chen, GS Corrado, J Dean, NIPS 2013]
 
** [http://papers.nips.cc/paper/5071-translating-embeddings-for-modeling-multi-relational-data.pdf Translating embeddings for modeling multi-relational data, A Bordes, N Usunier, A Garcia-Duran, NIPS 2013]
 
** [https://arxiv.org/pdf/1503.00759.pdf A Review of Relational Machine Learning for Knowledge Graphs, Nickel et al., Proceedings of the IEEE]
 
*04/18 Neural network basics (Project proposal due, HW1 out)
 
*04/20 Neural network language models
 
** [http://www.jmlr.org/papers/volume3/bengio03a/bengio03a.pdf A Neural Probabilistic Language Model, Bengio, JMLR 2003]
 
*04/25 RNNs  
 
** [http://www.fit.vutbr.cz/research/groups/speech/publi/2010/mikolov_interspeech2010_IS100722.pdf Recurrent neural network based language model, Mikolov et al., Interspeech 2010]
 
** [https://arxiv.org/pdf/1308.0850.pdf Generating Sequences With Recurrent Neural Networks, Alex Graves, arXiv 2013]
 
** [http://www.bioinf.jku.at/publications/older/2604.pdf Long Short-Term Memory, S. Hochreiter and J. Schmidhuber, Neural Computation, 1997]
 
** [https://arxiv.org/pdf/1409.1259.pdf On the Properties of Neural Machine Translation: Encoder–Decoder Approaches, Cho et al., 2014]
 
*05/02 Sequence-to-sequence models and neural machine translation (HW1 due and HW2 out)
 
** [https://arxiv.org/pdf/1406.1078.pdf Learning Phrase Representations using RNN Encoder–Decoder for Statistical Machine Translation, Cho et al., EMNLP 2014]
 
** [https://papers.nips.cc/paper/5346-sequence-to-sequence-learning-with-neural-networks.pdf Sequence to Sequence Learning with Neural Networks, Sutskever et al., NIPS 2014]
 
** [https://arxiv.org/abs/1506.03340 Teaching Machines to Read and Comprehend, Hermann et al., NIPS 2015]
 
*05/09 Project: mid-term presentation (1)
 
*05/11 Project: mid-term presentation (2)
*05/16 Convolutional Neural Networks (HW2 due)
 
** [http://ronan.collobert.com/pub/matos/2011_nlp_jmlr.pdf Natural Language Processing (Almost) from Scratch, Collobert et al., JMLR 2011]
 
** [http://emnlp2014.org/papers/pdf/EMNLP2014181.pdf Convolutional Neural Networks for Sentence Classification, Yoon Kim, EMNLP 2014]
 

Revision as of 17:39, 3 April 2017