Lifelong learning is challenging for deep neural networks due to their susceptibility to catastrophic forgetting: a trained network loses its ability to perform previously learned tasks when it is trained on new ones. We study lifelong learning for generative models, extending a trained network to new conditional generation tasks without forgetting previous tasks, while assuming access to the training data of the current task only. In contrast to state-of-the-art memory-replay-based approaches, which are limited to label-conditioned image generation, we propose a more generic framework for continual learning of generative models under different conditional image generation settings. Lifelong GAN employs knowledge distillation to transfer learned knowledge from previous networks to the new network, which makes it possible to perform image-conditioned generation tasks in a lifelong learning setting. We validate Lifelong GAN on both image-conditioned and label-conditioned generation tasks, providing qualitative and quantitative results that demonstrate the generality and effectiveness of our method.
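To make the distillation idea concrete, below is a minimal PyTorch sketch (not the authors' released code) of how a knowledge-distillation term can be added when extending a conditional generator to a new task: a frozen copy of the previously trained generator provides targets that the new generator is encouraged to match, preserving earlier tasks while the adversarial loss drives learning on the current one. The names new_G, old_G, aux_inputs, aux_conditions, and the choice of an L1 distillation loss are illustrative assumptions, not details from the paper.

    import torch
    import torch.nn.functional as F

    def distillation_loss(new_G, old_G, aux_inputs, aux_conditions):
        """Knowledge-distillation term for continual GAN training (sketch).

        old_G is a frozen copy of the generator trained on previous tasks;
        new_G is the generator being trained on the current task. The loss
        encourages new_G to reproduce old_G's outputs on auxiliary inputs,
        so previously learned conditional generation behaviour is retained.
        """
        with torch.no_grad():  # the old generator stays fixed
            targets = old_G(aux_inputs, aux_conditions)
        outputs = new_G(aux_inputs, aux_conditions)
        return F.l1_loss(outputs, targets)

    # Sketch of the total generator objective: the usual adversarial loss
    # on the current task plus the weighted distillation term, e.g.
    # loss_G = adv_loss + lambda_distill * distillation_loss(new_G, old_G,
    #                                                        aux_inputs,
    #                                                        aux_conditions)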
BibTeX
@inproceedings{ZhaiCTHNM19,
  title     = {Lifelong GAN: Continual Learning for Conditional Image Generation},
  author    = {Mengyao Zhai and Lei Chen and Fred Tung and Jiawei He and Megha Nawhal and Greg Mori},
  booktitle = {International Conference on Computer Vision (ICCV)},
  year      = {2019},
}