This repository contains a PyTorch implementation of the paper *The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks* by Jonathan Frankle and Michael Carbin, and it can easily be adapted to any model or dataset.
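The `lt` procedure the paper describes (train, prune the smallest-magnitude weights, rewind the survivors to their original initialization, repeat) can be sketched roughly as below. This is a minimal illustration assuming a generic `torch.nn` model; `lottery_ticket_prune` and `train_fn` are hypothetical names for this sketch, not this repository's API.

```python
import copy
import torch
import torch.nn as nn

def lottery_ticket_prune(model, train_fn, prune_percent=10, prune_iterations=35):
    """Iteratively prune `prune_percent`% of the surviving weights per cycle,
    rewinding the remaining weights to their initial values (prune_type=lt)."""
    init_state = copy.deepcopy(model.state_dict())  # saved initialization
    masks = {n: torch.ones_like(p) for n, p in model.named_parameters()
             if "weight" in n}
    for _ in range(prune_iterations):
        train_fn(model, masks)  # train while keeping pruned weights at zero
        # prune the smallest-magnitude surviving weights, layer by layer
        for name, param in model.named_parameters():
            if name not in masks:
                continue
            alive = param.data[masks[name].bool()].abs()
            threshold = torch.quantile(alive, prune_percent / 100.0)
            masks[name] *= (param.data.abs() > threshold).float()
        # rewind surviving weights to the saved initialization
        model.load_state_dict(init_state)
        for name, param in model.named_parameters():
            if name in masks:
                param.data *= masks[name]
    return masks
```

With `reinit`, the rewind step would instead draw fresh random weights, which is the baseline the paper compares against.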
```shell
pip3 install -r requirements.txt
python3 main.py --prune_type=lt --arch_type=fc1 --dataset=mnist --prune_percent=10 --prune_iterations=35
```
- `--prune_type` : Type of pruning [Options: `lt` - Lottery Ticket Hypothesis, `reinit` - random reinitialization] [Default: `lt`]
- `--arch_type` : Type of architecture [Options: `fc1` - simple fully connected network, `lenet5` - LeNet5, `alexnet` - AlexNet, `resnet18` - ResNet18, `vgg16` - VGG16] [Default: `fc1`]
- `--dataset` : Choice of dataset [Options: `mnist`, `fashionmnist`, `cifar10`, `cifar100`] [Default: `mnist`]
- `--prune_percent` : Percentage of weights to be pruned in each cycle [Default: `10`]
- `--prune_iterations` : Number of pruning cycles [Default: `35`]
- `--lr` : Learning rate [Default: `1.2e-3`]
- `--batch_size` : Batch size [Default: `60`]
- `--end_iter` : Number of epochs [Default: `100`]
- `--print_freq` : Frequency (in epochs) for printing accuracy and loss [Default: `1`]
- `--valid_freq` : Frequency (in epochs) for validation [Default: `1`]
- `--gpu` : Which GPU the program should use [Default: `0`]
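As a sanity check on `--prune_percent` and `--prune_iterations`: the surviving fraction of weights compounds multiplicatively across cycles. The snippet below (illustrative only, not part of `main.py`) computes it for the defaults.

```python
def remaining_density(prune_percent: float, prune_iterations: int) -> float:
    """Fraction of weights left after pruning prune_percent% of the
    survivors in each of prune_iterations cycles."""
    return (1 - prune_percent / 100.0) ** prune_iterations

# Defaults above: 10% per cycle for 35 cycles leaves about 2.5% of the weights.
print(round(remaining_density(10, 35) * 100, 1))  # prints 2.5
```

So the default run sweeps network density from 100% down to roughly 2.5%, which matches the sparsity range explored in the paper.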