Winter 2018 CS291A Syllabus

** : [http://www.iro.umontreal.ca/~vincentp/ift3395/lectures/backprop_old.pdf Learning representations by back-propagating errors, Nature, 1986]
** : [https://arxiv.org/abs/1609.04747 An overview of gradient descent optimization algorithms, Sebastian Ruder, arXiv 2016]
** : [http://jmlr.org/papers/volume15/srivastava14a/srivastava14a.pdf Dropout: A simple way to prevent neural networks from overfitting, N. Srivastava et al., JMLR 2014]
 
*02/01 Recursive Neural Networks
** : [http://www.robotics.stanford.edu/~ang/papers/emnlp12-SemanticCompositionalityRecursiveMatrixVectorSpaces.pdf Semantic Compositionality through Recursive Matrix-Vector Spaces, Socher et al., EMNLP 2012]

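For the 02/01 readings on Recursive Neural Networks, the sketch below illustrates the basic recursive composition idea: a parent phrase vector is built bottom-up over a binary parse tree as p = tanh(W[c1; c2] + b). It is only an illustrative NumPy sketch, not the full Matrix-Vector model of Socher et al.; the dimension, toy vocabulary, random parameters, and the helper names compose/encode are all placeholders chosen for this example.

<syntaxhighlight lang="python">
import numpy as np

# Illustrative sketch: plain recursive neural network composition over a
# binary parse tree (p = tanh(W [c1; c2] + b)). Not the Matrix-Vector RNN
# of Socher et al. (EMNLP 2012); all sizes and values here are placeholders.

rng = np.random.default_rng(0)
d = 4                                        # word/phrase vector dimension (illustrative)
W = rng.normal(scale=0.1, size=(d, 2 * d))   # composition matrix
b = np.zeros(d)                              # composition bias

# Toy word vectors; in practice these would be learned embeddings.
vocab = {w: rng.normal(size=d) for w in ["very", "good", "movie"]}

def compose(left, right):
    """Combine two child vectors into a parent phrase vector."""
    return np.tanh(W @ np.concatenate([left, right]) + b)

def encode(tree):
    """Recursively encode a binary tree given as nested tuples of words,
    e.g. (("very", "good"), "movie")."""
    if isinstance(tree, str):
        return vocab[tree]
    left, right = tree
    return compose(encode(left), encode(right))

phrase_vec = encode((("very", "good"), "movie"))
print(phrase_vec)   # d-dimensional vector representing the whole phrase
</syntaxhighlight>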