Updated papers files
AndreaCossu committed Jul 11, 2022
1 parent 9ed0f5a commit fda8b03
Showing 4 changed files with 1,183 additions and 899 deletions.
24 changes: 14 additions & 10 deletions README.md
@@ -4,7 +4,7 @@
<img src="https://github.com/ContinualAI/continual-learning-papers/blob/main/logo.png" alt="ContinualAI logo" width="300px" align="center">
</p>

Continual Learning papers list, curated by ContinualAI. **Search among 332 papers!**
Continual Learning papers list, curated by ContinualAI. **Search among 338 papers!**

You can browse the list in this file or interactively on the [ContinualAI website](https://www.continualai.org/papers/).

@@ -123,14 +123,16 @@ In this section we maintain a list of all applicative papers produced on continu

### Architectural Methods

**33 papers**
**35 papers**

In this section we collect all the papers introducing a continual learning strategy employing some architectural methods.

- [Provable and Efficient Continual Representation Learning](http://arxiv.org/abs/2203.02026) by Yingcong Li, Mingchen Li, M. Salman Asif and Samet Oymak. *arXiv*, 2022.
- [Architecture Matters in Continual Learning](http://arxiv.org/abs/2202.00275) by Seyed Iman Mirzadeh, Arslan Chaudhry, Dong Yin, Timothy Nguyen, Razvan Pascanu, Dilan Gorur and Mehrdad Farajtabar. *arXiv*, 2022.
- [Continual Learning with Node-Importance Based Adaptive Group Sparse Regularization](http://arxiv.org/abs/2003.13726) by Sangwon Jung, Hongjoon Ahn, Sungmin Cha and Taesup Moon, 2021.
- [Structured Ensembles: An Approach to Reduce the Memory Footprint of Ensemble Methods](https://linkinghub.elsevier.com/retrieve/pii/S0893608021003579) by Jary Pomponi, Simone Scardapane and Aurelio Uncini. *Neural Networks*, 407--418, 2021.
- [Continual Learning via Bit-Level Information Preserving](https://openaccess.thecvf.com/content/CVPR2021/html/Shi_Continual_Learning_via_Bit-Level_Information_Preserving_CVPR_2021_paper.html) by Yujun Shi, Li Yuan, Yunpeng Chen and Jiashi Feng. *Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition*, 16674--16683, 2021.
- [Modular Dynamic Neural Network: A Continual Learning Architecture](https://www.mdpi.com/2076-3417/11/24/12078) by Daniel Turner, Pedro J. S. Cardoso and João M. F. Rodrigues. *Applied Sciences*, 12078, 2021.
- [Continual Learning with Adaptive Weights (CLAW)](https://openreview.net/forum?id=Hklso24Kwr) by Tameem Adel, Han Zhao and Richard E Turner. *International Conference on Learning Representations*, 2020. [cifar] [mnist] [omniglot]
- [Continual Learning with Gated Incremental Memories for Sequential Data Processing](http://arxiv.org/abs/2004.04077) by Andrea Cossu, Antonio Carta and Davide Bacciu. *Proceedings of the 2020 International Joint Conference on Neural Networks (IJCNN 2020)*, 2020. [mnist] [rnn]
- [Continual Learning in Recurrent Neural Networks](https://openreview.net/forum?id=8xeBUgD8u9) by Benjamin Ehret, Christian Henning, Maria Cervera, Alexander Meulemans, Johannes Von Oswald and Benjamin F. Grewe. *International Conference on Learning Representations*, 2020. [audio] [rnn]
@@ -188,8 +190,8 @@ In this section we list all the papers related to bioinspired continual learning
- [A Biologically Plausible Audio-Visual Integration Model for Continual Learning](http://arxiv.org/abs/2007.08855) by Wenjie Chen, Fengtong Du, Ye Wang and Lihong Cao. *IJCNN*, 2021.
- [Synaptic Metaplasticity in Binarized Neural Networks](https://www.nature.com/articles/s41467-021-22768-y) by Axel Laborieux, Maxence Ernoult, Tifenn Hirtzlin and Damien Querlioz. *Nature Communications*, 2549, 2021.
- [Controlled Forgetting: Targeted Stimulation and Dopaminergic Plasticity Modulation for Unsupervised Lifelong Learning in Spiking Neural Networks](https://www.frontiersin.org/article/10.3389/fnins.2020.00007/full) by Jason M. Allred and Kaushik Roy. *Frontiers in Neuroscience*, 7, 2020. [spiking]
- [Storing Encoded Episodes as Concepts for Continual Learning](http://arxiv.org/abs/2007.06637) by Ali Ayub and Alan R. Wagner. *arXiv*, 2020. [generative] [imagenet] [mnist]
- [Cognitively-Inspired Model for Incremental Learning Using a Few Examples](https://openaccess.thecvf.com/content_CVPRW_2020/html/w15/Ayub_Cognitively-Inspired_Model_for_Incremental_Learning_Using_a_Few_Examples_CVPRW_2020_paper.html) by A. Ayub and A. R. Wagner. *Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops*, 2020. [cifar] [cubs] [dual]
- [Spiking Neural Predictive Coding for Continual Learning from Data Streams](http://arxiv.org/abs/1908.08655) by Alexander Ororbia. *arXiv*, 2020. [spiking]
- [Brain-like Replay for Continual Learning with Artificial Neural Networks](https://baicsworkshop.github.io/pdf/BAICS_8.pdf) by Gido M. van de Ven, Hava T. Siegelmann and Andreas S. Tolias. *International Conference on Learning Representations (Workshop on Bridging AI and Cognitive Science)*, 2020. [cifar]
- [Selfless Sequential Learning](https://openreview.net/forum?id=Bkxbrn0cYX) by Rahaf Aljundi, Marcus Rohrbach and Tinne Tuytelaars. *ICLR*, 2019. [cifar] [mnist] [sparsity]
@@ -218,8 +220,8 @@ In this section we list all the major contributions trying to understand catastr

- [Architecture Matters in Continual Learning](http://arxiv.org/abs/2202.00275) by Seyed Iman Mirzadeh, Arslan Chaudhry, Dong Yin, Timothy Nguyen, Razvan Pascanu, Dilan Gorur and Mehrdad Farajtabar. *arXiv*, 2022.
- [Continual Learning in the Teacher-Student Setup: Impact of Task Similarity](http://proceedings.mlr.press/v139/lee21e.html) by Sebastian Lee, Sebastian Goldt and Andrew Saxe. *International Conference on Machine Learning*, 6109--6119, 2021.
- [Continual Learning in Deep Networks: An Analysis of the Last Layer](http://arxiv.org/abs/2106.01834) by Timothée Lesort, Thomas George and Irina Rish. *arXiv*, 2021.
- [Understanding Continual Learning Settings with Data Distribution Drift Analysis](http://arxiv.org/abs/2104.01678) by Timothée Lesort, Massimo Caccia and Irina Rish. *arXiv*, 2021.
- [Wide Neural Networks Forget Less Catastrophically](http://arxiv.org/abs/2110.11526) by Seyed Iman Mirzadeh, Arslan Chaudhry, Huiyi Hu, Razvan Pascanu, Dilan Gorur and Mehrdad Farajtabar. *arXiv*, 2021.
- [Anatomy of Catastrophic Forgetting: Hidden Representations and Task Semantics](https://openreview.net/forum?id=LhY8QdUGSuw) by Vinay Venkatesh Ramasesh, Ethan Dyer and Maithra Raghu. *International Conference on Learning Representations*, 2021.
- [Does Continual Learning = Catastrophic Forgetting?](http://arxiv.org/abs/2101.07295) by Anh Thai, Stefan Stojanov, Isaac Rehg and James M. Rehg. *arXiv*, 2021.
@@ -445,12 +447,12 @@ In this section we list all the other papers not appearing in at least one of th
- [Long Live the Lottery: The Existence of Winning Tickets in Lifelong Learning](https://openreview.net/forum?id=LXMSvPmsm0g) by Tianlong Chen, Zhenyu Zhang, Sijia Liu, Shiyu Chang and Zhangyang Wang. *International Conference on Learning Representations*, 2020.
- [Lifelong Machine Learning with Deep Streaming Linear Discriminant Analysis](http://arxiv.org/abs/1909.01520) by Tyler L Hayes and Christopher Kanan. *CLVision Workshop at CVPR 2020*, 1--15, 2020. [core50] [imagenet]
- [Continual Learning with Bayesian Neural Networks for Non-Stationary Data](https://iclr.cc/virtual_2020/poster_SJlsFpVtDB.html) by Richard Kurle, Botond Cseke, Alexej Klushyn, Patrick van der Smagt and Stephan Günnemann. *Eighth International Conference on Learning Representations*, 2020. [bayes]
- [Energy-Based Models for Continual Learning](http://arxiv.org/abs/2011.12216) by Shuang Li, Yilun Du, Gido M. van de Ven, Antonio Torralba and Igor Mordatch. *arXiv*, 2020. [cifar] [experimental] [mnist]
- [Continual Learning Using Task Conditional Neural Networks](http://arxiv.org/abs/2005.05080) by Honglin Li, Payam Barnaghi, Shirin Enshaeifar and Frieder Ganz. *arXiv*, 2020. [cifar] [mnist]
- [Mnemonics Training: Multi-Class Incremental Learning without Forgetting](http://arxiv.org/abs/2002.10211) by Yaoyao Liu, An-An Liu, Yuting Su, Bernt Schiele and Qianru Sun. *arXiv*, 2020. [cifar] [imagenet]
- [Continual Universal Object Detection](http://arxiv.org/abs/2002.05347) by Xialei Liu, Hao Yang, Avinash Ravichandran, Rahul Bhotika and Stefano Soatto. *arXiv*, 2020.
- [Gradient Projection Memory for Continual Learning](https://openreview.net/forum?id=3AOj0RCNC2) by Gobinda Saha and Kaushik Roy. *International Conference on Learning Representations*, 2020.
- [Structured Compression and Sharing of Representational Space for Continual Learning](http://arxiv.org/abs/2001.08650) by Gobinda Saha, Isha Garg, Aayush Ankit and Kaushik Roy. *arXiv*, 2020. [cifar] [mnist]
- [Gated Linear Networks](http://arxiv.org/abs/1910.01526) by Joel Veness, Tor Lattimore, David Budden, Avishkar Bhoopchand, Christopher Mattern, Agnieszka Grabska-Barwinska, Eren Sezener, Jianan Wang, Peter Toth, Simon Schmitt and Marcus Hutter. *arXiv*, 2020.
- [Lifelong Graph Learning](http://arxiv.org/abs/2009.00647) by Chen Wang, Yuheng Qiu and Sebastian Scherer. *arXiv*, 2020. [graph]
- [Superposition of Many Models into One](http://arxiv.org/abs/1902.05522) by Brian Cheung, Alex Terekhov, Yubei Chen, Pulkit Agrawal and Bruno Olshausen. *arXiv*, 2019. [cifar] [mnist]
@@ -480,13 +482,14 @@ In this section we list all the other papers not appearing in at least one of th

### Regularization Methods

**28 papers**
**29 papers**

In this section we collect all the papers introducing a continual learning strategy employing some regularization methods.

- [Using Hindsight to Anchor Past Knowledge in Continual Learning](http://arxiv.org/abs/2002.08165) by Arslan Chaudhry, Albert Gordo, Puneet K. Dokania, Philip Torr and David Lopez-Paz. *arXiv*, 2021.
- [Contrastive Continual Learning with Feature Propagation](http://arxiv.org/abs/2112.01713) by Xuejun Han and Yuhong Guo. *arXiv:2112.01713 [cs]*, 2021.
- [Gradient Projection Memory for Continual Learning](http://arxiv.org/abs/2103.09762) by Gobinda Saha, Isha Garg and Kaushik Roy. *arXiv:2103.09762 [cs]*, 2021.
- [Gradient Projection Memory for Continual Learning](http://arxiv.org/abs/2103.09762) by Gobinda Saha, Isha Garg and Kaushik Roy, 2021.
- [Modeling the Background for Incremental Learning in Semantic Segmentation](http://arxiv.org/abs/2002.00718) by Fabio Cermelli, Massimiliano Mancini, Samuel Rota Bulò, Elisa Ricci and Barbara Caputo. *CVPR*, 9233--9242, 2020.
- [PLOP: Learning without Forgetting for Continual Semantic Segmentation](https://arxiv.org/abs/2011.11390) by Arthur Douillard, Yifu Chen, Arnaud Dapogny and Matthieu Cord. *arXiv*, 2020.
- [Insights from the Future for Continual Learning](https://arxiv.org/abs/2006.13748) by Arthur Douillard, Eduardo Valle, Charles Ollion, Thomas Robert and Matthieu Cord. *arXiv*, 2020.
@@ -515,10 +518,11 @@ In this section we collect all the papers introducing a continual learning strat

### Rehearsal Methods

**28 papers**
**29 papers**

In this section we collect all the papers introducing a continual learning strategy employing some rehearsal methods.

- [It's All About Consistency: A Study on Memory Composition for Replay-Based Methods in Continual Learning](http://arxiv.org/abs/2207.01145) by Julio Hurtado, Alain Raymond-Saez, Vladimir Araujo, Vincenzo Lomonaco and Davide Bacciu, 2022.
- [Foundational Models for Continual Learning: An Empirical Study of Latent Replay](http://arxiv.org/abs/2205.00329) by Oleksiy Ostapenko, Timothee Lesort, Pau Rodríguez, Md Rifat Arefin, Arthur Douillard, Irina Rish and Laurent Charlin. *arXiv*, 2022.
- [Using Hindsight to Anchor Past Knowledge in Continual Learning](http://arxiv.org/abs/2002.08165) by Arslan Chaudhry, Albert Gordo, Puneet K. Dokania, Philip Torr and David Lopez-Paz. *arXiv*, 2021.
- [Continual Prototype Evolution: Learning Online from Non-Stationary Data Streams](https://openaccess.thecvf.com/content/ICCV2021/html/De_Lange_Continual_Prototype_Evolution_Learning_Online_From_Non-Stationary_Data_Streams_ICCV_2021_paper.html) by Matthias De Lange and Tinne Tuytelaars. *Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)*, 8250--8259, 2021. [cifar] [framework] [mnist] [vision]
@@ -536,8 +540,8 @@ In this section we collect all the papers introducing a continual learning strat
- [Graph-Based Continual Learning](https://openreview.net/forum?id=HHSEKOnPvaO) by Binh Tang and David S. Matteson. *International Conference on Learning Representations*, 2020.
- [Brain-Inspired Replay for Continual Learning with Artificial Neural Networks](https://www.nature.com/articles/s41467-020-17866-2) by Gido M. van de Ven, Hava T. Siegelmann and Andreas S. Tolias. *Nature Communications*, 2020. [cifar] [framework] [generative] [mnist]
- [Continual Learning with Hypernetworks](https://openreview.net/forum?id=SJgwNerKvB) by Johannes von Oswald, Christian Henning, João Sacramento and Benjamin F Grewe. *International Conference on Learning Representations*, 2020. [cifar] [mnist]
- [Online Continual Learning with Maximal Interfered Retrieval](http://papers.nips.cc/paper/9357-online-continual-learning-with-maximal-interfered-retrieval.pdf) by Rahaf Aljundi, Eugene Belilovsky, Tinne Tuytelaars, Laurent Charlin, Massimo Caccia, Min Lin and Lucas Page-Caccia. *Advances in Neural Information Processing Systems 32*, 11849--11860, 2019. [cifar] [mnist]
- [Gradient Based Sample Selection for Online Continual Learning](http://papers.nips.cc/paper/9354-gradient-based-sample-selection-for-online-continual-learning.pdf) by Rahaf Aljundi, Min Lin, Baptiste Goujaud and Yoshua Bengio. *Advances in Neural Information Processing Systems 32*, 11816--11825, 2019. [cifar] [mnist]
- [IL2M: Class Incremental Learning With Dual Memory](https://doi.org/10.1109/ICCV.2019.00067) by Eden Belouadah and Adrian Popescu. *2019 IEEE/CVF International Conference on Computer Vision, ICCV 2019, Seoul, Korea (South), October 27 - November 2, 2019*, 583--592, 2019.
- [On Tiny Episodic Memories in Continual Learning](http://arxiv.org/abs/1902.10486) by Arslan Chaudhry, Marcus Rohrbach, Mohamed Elhoseiny, Thalaiyasingam Ajanthan, Puneet K Dokania, Philip H S Torr and Marc'Aurelio Ranzato. *arXiv*, 2019. [cifar] [imagenet] [mnist] (code: https://github.com/facebookresearch/agem)
- [Facilitating Bayesian Continual Learning by Natural Gradients and Stein Gradients](http://arxiv.org/abs/1904.10644) by Yu Chen, Tom Diethe and Neil Lawrence. *arXiv*, 2019. [bayes]
2 changes: 1 addition & 1 deletion embedding-plot.html

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion embedding.json

Large diffs are not rendered by default.
