Optimization algorithms for recurrent neural networks
Recurrent neural networks (RNNs) are known to be extremely powerful, yet effectively impossible to train with standard first-order methods when the training sequences exhibit long-term dependencies. The aim of this thesis is to develop a novel optimization algorithm that employs adapted first-order information to train such networks.
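The difficulty alluded to above is the well-known vanishing-gradient effect: back-propagating through a recurrence multiplies Jacobians across time steps, so gradient contributions from distant inputs shrink exponentially. A minimal numerical sketch (not taken from the thesis; the linear recurrence and the choice of a 0.9-scaled orthogonal matrix are illustrative assumptions):

```python
import numpy as np

# For a linear recurrence h_t = W @ h_{t-1}, the Jacobian of h_T with
# respect to h_0 is W**T (the T-fold matrix product). Its norm decays
# like rho(W)**T when the spectral radius rho(W) < 1, which starves
# first-order methods of long-range gradient signal.

rng = np.random.default_rng(0)
n = 20
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))  # random orthogonal matrix
W = 0.9 * Q                                        # spectral norm exactly 0.9

jac = np.eye(n)
norms = []
for t in range(1, 51):
    jac = W @ jac                       # accumulate the product W**t
    norms.append(np.linalg.norm(jac, 2))

# norms[t-1] equals 0.9**t here, so the 50-step gradient factor is
# roughly 0.9**50, about 0.005 of its one-step magnitude.
print(norms[0], norms[-1])
```

With a spectral norm above 1 the same product instead explodes, which is the mirror-image pathology; both motivate optimization methods that go beyond plain gradient descent.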
Candidate: Giulio Galvan
Graduated: April 2016