Tensorflow-Network-Pruning-with-Scratch

Implementing Network Pruning (Weight and Unit Pruning) from scratch, applying different percentages of sparsity to the network.

The weights are trained on the MNIST digit dataset. The ReLU activation function is used within the layers. Hidden layers of sizes 1000, 1000, 500, and 200 are used. Weight and unit pruning were applied iteratively with pruning percentages of [0, 25, 50, 60, 70, 80, 90, 95, 97, 99].
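The two pruning strategies can be sketched in NumPy as follows. This is a minimal illustration, not the repository's actual code: weight pruning zeroes the smallest-magnitude individual weights, while unit pruning zeroes the columns (units) of a layer's weight matrix with the smallest L2 norms. The function names and the column-wise unit convention are assumptions for illustration.

```python
import numpy as np

def weight_prune(w, percent):
    """Zero out the `percent`% smallest-magnitude weights in a weight matrix."""
    w = w.copy()
    if percent <= 0:
        return w
    # Magnitude below this percentile gets pruned.
    threshold = np.percentile(np.abs(w), percent)
    w[np.abs(w) < threshold] = 0.0
    return w

def unit_prune(w, percent):
    """Zero out entire columns (units) of w with the smallest L2 norms."""
    w = w.copy()
    k = int(w.shape[1] * percent / 100)
    if k == 0:
        return w
    norms = np.linalg.norm(w, axis=0)
    weakest = np.argsort(norms)[:k]  # indices of the k weakest units
    w[:, weakest] = 0.0
    return w
```

Applied after training, these masks would be evaluated on the test set at each sparsity level in the list above; unit pruning removes whole neurons at once, so its accuracy typically degrades faster than weight pruning at high sparsity.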
