This is the official PyTorch implementation of WSN from the paper "Forget-free Continual Learning with Winning Subnetworks" (ICML 2022).
- PyTorch > 1.5
- Permuted MNIST (available in the current version)
- 5 Datasets (available in the current version)
- Omniglot Rotation (available in the current version)
- CIFAR-100 Split (available in the current version)
- CIFAR-100 Superclass (available in the current version)
- TinyImageNet (available in the current version)
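WSN learns a task-specific binary mask over a shared set of weights by scoring every weight and keeping only the top-scoring fraction in each layer. The snippet below is a minimal, illustrative sketch of that masking idea, not the code used in this repository: the layer name `SubnetLinear`, the straight-through top-k mask, and the interpretation of sparsity as the fraction of weights kept per layer are simplifying assumptions.

```python
# Minimal sketch of score-based subnetwork masking (illustrative only).
# Assumptions: `SubnetLinear` is a hypothetical layer name, `sparsity` is read
# as the fraction of weights kept per layer, and a single task is considered
# (the actual WSN method additionally freezes weights used by previous tasks).
import torch
import torch.nn as nn
import torch.nn.functional as F


class TopKMask(torch.autograd.Function):
    """Binarize weight scores by keeping the top-scoring fraction; use a
    straight-through estimator so gradients flow back to the real-valued scores."""

    @staticmethod
    def forward(ctx, scores, sparsity):
        k = max(int(sparsity * scores.numel()), 1)          # number of weights to keep
        threshold = torch.topk(scores.flatten(), k).values[-1]
        return (scores >= threshold).float()                 # binary mask

    @staticmethod
    def backward(ctx, grad_output):
        # Straight-through: pass the gradient to the scores unchanged;
        # `sparsity` is a plain float, so it receives no gradient.
        return grad_output, None


class SubnetLinear(nn.Module):
    """Linear layer whose effective weight is weight * binary mask."""

    def __init__(self, in_features, out_features, sparsity=0.5):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(out_features, in_features))
        self.scores = nn.Parameter(torch.empty(out_features, in_features))
        nn.init.kaiming_uniform_(self.weight)
        nn.init.kaiming_uniform_(self.scores)
        self.sparsity = sparsity

    def forward(self, x):
        mask = TopKMask.apply(self.scores.abs(), self.sparsity)
        return F.linear(x, self.weight * mask)


# Example usage: apply a masked layer to a random batch.
layer = SubnetLinear(784, 10, sparsity=0.5)
out = layer(torch.randn(32, 784))
```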
To install the dependencies needed to run the experiments, execute the following:
pip install -r requirements.txt
We provide several training examples in this repository:
- To train WSN on Permuted MNIST on GPU [GPU_ID] with seed [SEED] and sparsity [SPARSITY], simply run the following (an example invocation with concrete values is shown after this list):
>> ./scripts/wsn/wsn_pmnist.sh [GPU_ID] [SEED] [SPARSITY]
- To train WSN on CIFAR100-100 on GPU [GPU_ID] with seed [SEED] and sparsity [SPARSITY], simply run the following:
>> ./scripts/wsn/wsn_cifar100_100.sh [GPU_ID] [SEED] [SPARSITY]
- To train WSN + FSO on CIFAR100-100 on GPU [GPU_ID] with seed [SEED] and sparsity [SPARSITY]: the corresponding script will be updated soon.
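For example, the Permuted MNIST script above could be launched as follows; the GPU id, seed, and sparsity values here are purely illustrative and are not recommended settings from the paper:
>> ./scripts/wsn/wsn_pmnist.sh 0 1 0.5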
Haeyong Kang, Rusty John Lloyd Mina, Sultan Rizky Hikmawan Madjid, Jaehong Yoon, Mark Hasegawa-Johnson, Sung Ju Hwang, and Chang D. Yoo.
"Forget-free Continual Learning with Winning Subnetworks." ICML 2022.