https://arxiv.org/abs/1705.09847 Lifelong Generative Modeling

In this work we focus on a lifelong learning approach to generative modeling where we continuously incorporate newly observed distributions into our learnt model. We do so through a student-teacher Variational Autoencoder architecture which allows us to learn and preserve all the distributions seen so far without the need to retain the past data nor the past models. Through the introduction of a novel cross-model regularizer, inspired by a Bayesian update rule, the student model leverages the information learnt by the teacher, which acts as a summary of everything seen till now. The regularizer has the additional benefit of reducing the effect of catastrophic interference that appears when we learn over sequences of distributions. We demonstrate its efficacy in learning sequentially observed distributions as well as its ability to learn a common latent representation across a complex transfer learning scenario.
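The cross-model regularizer described above pulls the student's approximate posterior toward the frozen teacher's. A minimal numpy sketch of such a term, assuming diagonal-Gaussian posteriors parameterised by mean and log-variance (the function name and interface here are illustrative, not the paper's code):

```python
import numpy as np

def gaussian_kl(mu_s, logvar_s, mu_t, logvar_t):
    """KL( N(mu_s, var_s) || N(mu_t, var_t) ) for diagonal Gaussians,
    summed over latent dimensions. Used as a distillation penalty that
    anchors the student posterior to the frozen teacher posterior."""
    var_s, var_t = np.exp(logvar_s), np.exp(logvar_t)
    return 0.5 * np.sum(
        logvar_t - logvar_s + (var_s + (mu_s - mu_t) ** 2) / var_t - 1.0
    )

# Identical student and teacher posteriors incur zero penalty:
mu, logvar = np.zeros(4), np.zeros(4)
print(gaussian_kl(mu, logvar, mu, logvar))  # -> 0.0
```

In training, this term would be added to the student's ELBO on data replayed from (or summarised by) the teacher, so old distributions keep constraining the student.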
  
https://arxiv.org/abs/1804.00218v1 Synthesis of Differentiable Functional Programs for Lifelong Learning

https://arxiv.org/abs/1809.02058v1 Memory Replay GANs: learning to generate images from new categories without forgetting

We study two methods to prevent forgetting by leveraging these replays, namely joint training with replay and replay alignment. Qualitative and quantitative experimental results on the MNIST, SVHN and LSUN datasets show that our memory replay approach can generate competitive images while significantly mitigating the forgetting of previous categories.
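The "joint training with replay" idea above amounts to building each training batch from a mix of new-category data and samples replayed from a frozen copy of the previous generator. A toy sketch, where `old_generator` is a stand-in (not the paper's model) that emits fake replay samples for the old categories:

```python
import numpy as np

rng = np.random.default_rng(0)

def old_generator(n):
    # Stand-in for a frozen copy of the previously trained generator:
    # emits replay vectors tagged with the old category labels 0-4.
    z = rng.normal(size=(n, 2))
    labels = rng.integers(0, 5, size=n)
    return z, labels

def joint_replay_batch(new_x, new_y, batch_size):
    """Half the batch from the new category, half replayed from the
    frozen generator, so old categories keep receiving training signal."""
    k = batch_size // 2
    idx = rng.integers(0, len(new_x), size=k)
    rep_x, rep_y = old_generator(batch_size - k)
    x = np.concatenate([new_x[idx], rep_x])
    y = np.concatenate([new_y[idx], rep_y])
    return x, y

new_x = rng.normal(size=(100, 2))
new_y = np.full(100, 5)            # the newly observed category
x, y = joint_replay_batch(new_x, new_y, 32)
print(x.shape)                     # -> (32, 2); y mixes label 5 with 0-4
```

The alternative, replay alignment, instead constrains the new generator's outputs to match the frozen generator's outputs on the old categories; the batch-mixing sketch covers only the joint-training variant.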

https://arxiv.org/abs/1711.09601 Memory Aware Synapses: Learning what (not) to forget

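Memory Aware Synapses estimates each parameter's importance from the gradient of the squared L2 norm of the network output, then penalises drift of important parameters. A minimal sketch for a linear model f(x) = Wx, where d‖Wx‖²/dW = 2(Wx)xᵀ (the function names are ours, not the authors' code):

```python
import numpy as np

def mas_importance(W, X):
    """Per-parameter importance: average absolute gradient of the squared
    output norm ||Wx||^2 with respect to W, over unlabeled data X."""
    omega = np.zeros_like(W)
    for x in X:
        omega += np.abs(2.0 * np.outer(W @ x, x))
    return omega / len(X)

def mas_penalty(W, W_old, omega, lam=1.0):
    # Quadratic penalty anchoring important weights to their old values.
    return lam * np.sum(omega * (W - W_old) ** 2)

rng = np.random.default_rng(1)
W_old = rng.normal(size=(3, 4))
omega = mas_importance(W_old, rng.normal(size=(50, 4)))
print(mas_penalty(W_old, W_old, omega))  # no drift -> 0.0
```

Because the importance estimate needs only the output norm, it can be computed from unlabeled data, which is one of the method's selling points.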
https://openreview.net/forum?id=H1lIzhC9FX Learning to remember: Dynamic Generative Memory for Continual Learning

https://openreview.net/forum?id=rJgz8sA5F7 HC-Net: Memory-based Incremental Dual-Network System for Continual learning

https://openreview.net/forum?id=BkloRs0qK7 A comprehensive, application-oriented study of catastrophic forgetting in DNNs

https://openreview.net/forum?id=ryGvcoA5YX Overcoming Catastrophic Forgetting via Model Adaptation

http://proceedings.mlr.press/v80/miconi18a.html Differentiable plasticity: training plastic neural networks with backpropagation
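Differentiable plasticity augments each fixed weight with a fast Hebbian trace whose gain is itself trained by backpropagation. A minimal numpy sketch of one forward step in that style (variable names and the toy setup are ours):

```python
import numpy as np

def plastic_step(x, W, alpha, hebb, eta=0.1):
    """One step of a plastic layer: the effective weight is a slow,
    backprop-trained part W plus a trained gain alpha times a fast
    Hebbian trace hebb, which is updated online from pre/post activity."""
    y = np.tanh((W + alpha * hebb) @ x)
    hebb = (1.0 - eta) * hebb + eta * np.outer(y, x)  # Hebbian trace update
    return y, hebb

rng = np.random.default_rng(2)
W = rng.normal(scale=0.1, size=(5, 5))
alpha = rng.normal(scale=0.1, size=(5, 5))
hebb = np.zeros((5, 5))
x = rng.normal(size=5)
for _ in range(3):              # the trace accumulates across steps
    y, hebb = plastic_step(x, W, alpha, hebb)
print(y.shape)                  # -> (5,), with a nonzero Hebbian trace
```

In the full method, gradients flow through the trace updates, so W and alpha are optimised to exploit this within-episode plasticity.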