# SiPPing Neural Networks: Sensitivity-informed Provable Pruning of Neural Networks

Lucas Liebenwein*, Cenk Baykal*, Igor Gilitschenski, Dan Feldman, Daniela Rus

Implementation of provable pruning using sensitivity, as introduced in SiPPing Neural Networks: Sensitivity-informed Provable Pruning of Neural Networks (weight pruning).

*Equal contribution

## Method

### Sensitivity of a weight

The algorithm relies on a novel notion of weight sensitivity as a saliency score for weight parameters in the network to estimate their relative importance. In the simple case of a linear layer, the sensitivity of a single weight $w_{ij}$ in layer $l$ is defined as the maximum relative contribution of the weight to the corresponding output neuron over a small set of input points $x \in S$:
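Written out, the definition reads as follows (a reconstruction from the surrounding description; the activation notation $a^{l-1}_j(x)$ is our assumption):

$$
s_{ij} = \max_{x \in S} \frac{w^{l}_{ij}\, a^{l-1}_{j}(x)}{\sum_{k} w^{l}_{ik}\, a^{l-1}_{k}(x)},
$$

where $a^{l-1}_j(x)$ denotes the activation of neuron $j$ in layer $l-1$ for input $x$.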

The weight $w_{ij}$ here represents the edge connecting neuron $j$ in layer $l-1$ to neuron $i$ in layer $l$. This notion can then be generalized to convolutional layers, neurons, and filters, among others, as shown in the respective papers; a minimal computational sketch is given below.
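To make the definition concrete, here is a minimal NumPy sketch of how empirical sensitivity could be computed for a single linear layer. It is an illustration under our own assumptions (absolute-value contributions, a small sample set `S`), not the repository's actual implementation:

```python
import numpy as np

def weight_sensitivity(W, S, eps=1e-12):
    """Empirical sensitivity of each weight in a linear layer (sketch).

    W: (out_features, in_features) weight matrix of layer l.
    S: (num_points, in_features) small sample of inputs, i.e.
       activations of layer l-1.
    Returns an (out_features, in_features) array whose entry (i, j) is
    max over x in S of |w_ij * x_j| / sum_k |w_ik * x_k|.
    """
    # Per-point contribution of each edge: shape (num_points, out, in).
    contrib = np.abs(W[None, :, :] * S[:, None, :])
    # Total absolute contribution per output neuron and point.
    total = contrib.sum(axis=2, keepdims=True)  # (num_points, out, 1)
    # Relative contribution, maximized over the sample set S.
    return (contrib / (total + eps)).max(axis=0)

# Illustration only: keep the weights with the largest sensitivity.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 8))
S = np.abs(rng.standard_normal((16, 8)))   # stand-in "activations"
sens = weight_sensitivity(W, S)
keep = sens >= np.quantile(sens, 0.5)      # keep the top-50% of weights
W_pruned = W * keep
```

Note that the paper's algorithm uses these sensitivities to derive sampling probabilities for randomized pruning with provable guarantees; the deterministic thresholding above is only for illustration.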

In the paper, we show how pruning according to (empirical) sensitivity enables us to provably quantify the trade-off between the error and sparsity of the resulting pruned neural network.

## Setup

Check out the main README.md and the respective packages for more information on the codebase.

## Run experiments

The experiment configurations are located here. To reproduce the experiments for a specific configuration, run:

```bash
python -m experiment.main paper/sipp/cifar/cascade/resnet20.yaml
```

## Citations

Please cite our paper when using this codebase.

### Paper link

SiPPing Neural Networks: Sensitivity-informed Provable Pruning of Neural Networks

### Bibtex

```bibtex
@article{baykal2022sensitivity,
  title={Sensitivity-informed provable pruning of neural networks},
  author={Baykal, Cenk and Liebenwein, Lucas and Gilitschenski, Igor and Feldman, Dan and Rus, Daniela},
  journal={SIAM Journal on Mathematics of Data Science},
  volume={4},
  number={1},
  pages={26--45},
  year={2022},
  publisher={SIAM}
}
```