# Benchmarks (Beta version)

## ResNet-56 / CIFAR-10 / 2.00x

| Method | Base Acc. (%) | Pruned Acc. (%) | $\Delta$ Acc. (%) | Speed-up |
|---|---|---|---|---|
| NISP [1] | - | - | -0.03 | 1.76x |
| Geometric [2] | 93.59 | 93.26 | -0.33 | 1.70x |
| Polar [3] | 93.80 | 93.83 | +0.03 | 1.88x |
| CP [4] | 92.80 | 91.80 | -1.00 | 2.00x |
| AMC [5] | 92.80 | 91.90 | -0.90 | 2.00x |
| HRank [6] | 93.26 | 93.17 | -0.09 | 2.00x |
| SFP [7] | 93.59 | 93.36 | -0.23 | 2.11x |
| ResRep [8] | 93.71 | 93.71 | +0.00 | 2.12x |
| Ours-L1 | 93.53 | 92.93 | -0.60 | 2.12x |
| Ours-BN | 93.53 | 93.29 | -0.24 | 2.12x |
| Ours-Group | 93.53 | 93.91 | +0.38 | 2.13x |

Note 1: $\text{speed up} = \frac{\text{Base MACs}}{\text{Pruned MACs}}$

Note 2: Baseline methods are not implemented in this repo because they require additional modifications to the standard models and training scripts.

Please refer to `run/prune` for training logs.
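
As a sketch of the metric in Note 1, the snippet below counts MACs before and after pruning with Torch-Pruning's op counter. It assumes the `torch_pruning` package; torchvision's `resnet18` stands in for this repo's CIFAR-10 ResNet-56, which torchvision does not ship.

```python
# A minimal sketch of Note 1: speed up = base MACs / pruned MACs.
import torch
import torch_pruning as tp
from torchvision.models import resnet18

model = resnet18()  # stand-in model; this repo benchmarks ResNet-56
example_inputs = torch.randn(1, 3, 224, 224)

base_macs, _ = tp.utils.count_ops_and_params(model, example_inputs)

# ... structurally prune the model here (see the commands below) ...

pruned_macs, _ = tp.utils.count_ops_and_params(model, example_inputs)
print(f"speed up: {base_macs / pruned_macs:.2f}x")
```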

- Pretraining

```bash
python main.py --mode pretrain --dataset cifar10 --model resnet56 --lr 0.1 --total-epochs 200 --lr-decay-milestones 120,150,180
```

- L1-Norm Pruner

*Pruning Filters for Efficient ConvNets*

```bash
# bash scripts/prune/cifar/l1_norm_pruner.sh
python main.py --mode prune --model resnet56 --batch-size 128 --restore run/cifar10/pretrain/cifar10_resnet56.pth --dataset cifar10 --method l1 --speed-up 2.11 --global-pruning
```
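
For intuition, the L1 criterion scores each output filter by the L1 norm of its weights and removes the lowest-scoring ones first. A pure-PyTorch sketch of the scoring, not this repo's exact implementation:

```python
import torch
import torch.nn as nn

def l1_filter_importance(conv: nn.Conv2d) -> torch.Tensor:
    # conv.weight has shape (out_channels, in_channels, kH, kW);
    # sum absolute values over every dim except the output channel.
    return conv.weight.detach().abs().sum(dim=(1, 2, 3))

conv = nn.Conv2d(16, 32, kernel_size=3)
scores = l1_filter_importance(conv)           # shape: (32,)
prune_idx = torch.argsort(scores)[: 32 // 2]  # lowest-scoring half
```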

- BN Pruner

*Learning Efficient Convolutional Networks through Network Slimming*

```bash
# bash scripts/prune/cifar/bn_pruner.sh
python main.py --mode prune --model resnet56 --batch-size 128 --restore run/cifar10/pretrain/cifar10_resnet56.pth --dataset cifar10 --method slim --speed-up 2.11 --global-pruning --reg 1e-5
```
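
Network Slimming ranks channels by the absolute value of the BN scale factor $\gamma$ and, during training, pushes $\gamma$ toward zero with an L1 penalty; `--reg 1e-5` sets that penalty's strength. A common implementation of the idea, shown here as a sketch rather than this repo's exact training loop:

```python
import torch
import torch.nn as nn

def slimming_importance(bn: nn.BatchNorm2d) -> torch.Tensor:
    # Network Slimming scores each channel by |gamma|, the BN scale.
    return bn.weight.detach().abs()

def add_slimming_penalty(model: nn.Module, reg: float = 1e-5) -> None:
    # Subgradient of reg * sum(|gamma|); call after loss.backward() and
    # before optimizer.step() so the L1 penalty sparsifies the BN scales.
    for m in model.modules():
        if isinstance(m, nn.BatchNorm2d) and m.weight.grad is not None:
            m.weight.grad.add_(reg * torch.sign(m.weight.detach()))
```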

- Group Pruner

```bash
# bash scripts/prune/cifar/group_lasso_pruner.sh
python main.py --mode prune --model resnet56 --batch-size 128 --restore run/cifar10/pretrain/cifar10_resnet56.pth --dataset cifar10 --method group_lasso --speed-up 2.11 --global-pruning --reg 5e-4
```
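
Here `--reg 5e-4` adds a group-lasso penalty: each filter's weights form a group whose L2 norm is penalized, so entire filters are driven to zero and can be removed. A sketch under the assumption that groups are conv output filters (the repo may group dependent layers together):

```python
import torch
import torch.nn as nn

def group_lasso_penalty(model: nn.Module, reg: float = 5e-4):
    # Sum of L2 norms over filter groups: reg * sum_f ||W_f||_2.
    # Unlike a plain L2 penalty, the square root makes whole groups
    # hit exactly zero rather than merely shrinking.
    penalty = 0.0
    for m in model.modules():
        if isinstance(m, nn.Conv2d):
            # Each output filter, flattened to (in_ch * kH * kW), is a group.
            penalty = penalty + m.weight.flatten(1).norm(p=2, dim=1).sum()
    return reg * penalty

# Usage in a training step:
#   loss = criterion(model(x), y) + group_lasso_penalty(model, reg=5e-4)
#   loss.backward()
```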

## ResNet-50 / ImageNet / 2.00 GFLOPs

- L1 Pruner

```bash
python -m torch.distributed.launch --nproc_per_node=4 --master_port 18119 --use_env main_imagenet.py --model resnet50 --epochs 90 --batch-size 64 --lr-step-size 30 --lr 0.01 --prune --method l1 --pretrained --target-flops 2.00 --cache-dataset --print-freq 100 --workers 16 --data-path PATH_TO_IMAGENET --output-dir PATH_TO_OUTPUT_DIR # &> output.log
```
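
Internally, pruning to `--target-flops 2.00` amounts to removing channels step by step until the MAC count falls below the budget. A sketch with Torch-Pruning's v1 `MetaPruner`; the pruner arguments and loop are assumptions, not `main_imagenet.py`'s exact logic:

```python
import torch
import torch_pruning as tp
from torchvision.models import resnet50

model = resnet50()  # load pretrained weights in practice (--pretrained)
example_inputs = torch.randn(1, 3, 224, 224)

imp = tp.importance.MagnitudeImportance(p=1)  # L1 criterion (--method l1)
ignored = [m for m in model.modules() if isinstance(m, torch.nn.Linear)]

pruner = tp.pruner.MetaPruner(
    model, example_inputs, importance=imp,
    iterative_steps=10, ch_sparsity=0.5, ignored_layers=ignored,
)

target_macs = 2.00e9  # the 2.00 GFLOPs budget, counted here as MACs
for _ in range(10):
    pruner.step()  # remove one more slice of low-importance channels
    macs, _ = tp.utils.count_ops_and_params(model, example_inputs)
    if macs <= target_macs:
        break
```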

More results will be released soon!

## References

[1] NISP: Pruning Networks Using Neuron Importance Score Propagation.

[2] Filter Pruning via Geometric Median for Deep Convolutional Neural Networks Acceleration.

[3] Neuron-level Structured Pruning using Polarization Regularizer.

[4] Channel Pruning for Accelerating Very Deep Neural Networks.

[5] AMC: AutoML for Model Compression and Acceleration on Mobile Devices.

[6] HRank: Filter Pruning Using High-Rank Feature Map.

[7] Soft Filter Pruning for Accelerating Deep Convolutional Neural Networks.

[8] ResRep: Lossless CNN Pruning via Decoupling Remembering and Forgetting.