
[2012.14905] Meta Learning Backpropagation And Improving It

source link: https://arxiv.org/abs/2012.14905

[Submitted on 29 Dec 2020 (v1), last revised 13 Mar 2022 (this version, v4)]

Meta Learning Backpropagation And Improving It


Many concepts have been proposed for meta learning with neural networks (NNs), e.g., NNs that learn to reprogram fast weights, Hebbian plasticity, learned learning rules, and meta recurrent NNs. Our Variable Shared Meta Learning (VSML) unifies the above and demonstrates that simple weight-sharing and sparsity in an NN are sufficient to express powerful learning algorithms (LAs) in a reusable fashion. A simple implementation of VSML where the weights of a neural network are replaced by tiny LSTMs allows for implementing the backpropagation LA solely by running in forward mode. It can even meta learn new LAs that differ from online backpropagation and generalize to datasets outside of the meta training distribution without explicit gradient calculation. Introspection reveals that our meta learned LAs learn through fast association in a way that is qualitatively different from gradient descent.
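To make the core idea in the abstract concrete, here is a minimal sketch, not the authors' implementation, of a layer in which every scalar weight is replaced by a tiny LSTM whose parameters are shared across all connections, so that inference and (meta-learned) learning both run purely in forward mode. The class name `VSMLLayer`, the 2-dimensional message format, the hidden size, and the linear readout are all illustrative assumptions rather than details taken from the paper.

```python
# Sketch only: one feed-forward layer where each connection (i, j) is a tiny,
# parameter-shared LSTM, as described in the abstract above.

import torch
import torch.nn as nn


class VSMLLayer(nn.Module):
    """Layer whose "weights" are tiny LSTMs with shared parameters (assumed structure)."""

    def __init__(self, n_in, n_out, hidden=8):
        super().__init__()
        self.n_in, self.n_out, self.hidden = n_in, n_out, hidden
        # A single LSTMCell reused at every connection: this is the weight sharing.
        self.cell = nn.LSTMCell(input_size=2, hidden_size=hidden)
        self.readout = nn.Linear(hidden, 1)

    def reset_state(self, batch_size):
        # Independent LSTM state per connection and per batch element.
        shape = (batch_size * self.n_in * self.n_out, self.hidden)
        self.h = torch.zeros(shape)
        self.c = torch.zeros(shape)

    def forward(self, x, feedback=None):
        # x: (B, n_in) forward activations.
        # feedback: (B, n_out) error-like signal from a later step (zeros when
        # absent); the shared cell can meta-learn to use it for credit
        # assignment without any explicit gradient computation.
        B = x.shape[0]
        if feedback is None:
            feedback = torch.zeros(B, self.n_out)
        fwd = x[:, :, None].expand(B, self.n_in, self.n_out)
        bwd = feedback[:, None, :].expand(B, self.n_in, self.n_out)
        msg = torch.stack([fwd, bwd], dim=-1).reshape(-1, 2)
        self.h, self.c = self.cell(msg, (self.h, self.c))
        contrib = self.readout(self.h).reshape(B, self.n_in, self.n_out)
        return contrib.sum(dim=1)  # (B, n_out): messages summed per output unit


# Example rollout: a few online steps on one layer.
layer = VSMLLayer(n_in=4, n_out=3)
layer.reset_state(batch_size=2)
y = layer(torch.randn(2, 4))                              # pure inference step
y = layer(torch.randn(2, 4), feedback=torch.randn(2, 3))  # step with feedback signal
```

In the paper, the shared LSTM parameters are the meta-learned quantity: with suitable messages they can reproduce the backpropagation learning algorithm or discover a different one. The sketch above only illustrates the weight-sharing and forward-mode structure, not the meta-training loop.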

Comments: Updated to the NeurIPS 2021 camera-ready version; fixed typo in Eq. 4
Subjects: Machine Learning (cs.LG); Artificial Intelligence (cs.AI); Neural and Evolutionary Computing (cs.NE); Machine Learning (stat.ML)
Cite as: arXiv:2012.14905 [cs.LG]
  (or arXiv:2012.14905v4 [cs.LG] for this version)
  https://doi.org/10.48550/arXiv.2012.14905
Journal reference: 35th Conference on Neural Information Processing Systems (NeurIPS 2021)
