It is recommended to use a virtual environment (e.g. conda).

Using conda:

```shell
conda env create --name envname --file=env.yml
```

Using pip:

```shell
pip install -r requirements.txt
```
Check the `forai.ipynb` notebook for a full example:
```python
db = DataBunch('mnist')          # load the MNIST data bunch
model = get_model('BaseModel')   # build the base model

glogger.info('Training')
trainer = Trainer(model=model, db=db, epochs=1)
trainer.run()                    # standard training loop
glogger.info(trainer.metrics)

glogger.info('Pruning')
trainer.run_pruning()            # prune the trained model
glogger.info(trainer.print_metrics())
```
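The `run_pruning` step applies the project's pruning strategy. Its internals live in the codebase, but the core idea of magnitude pruning (as described in the Lottery Ticket Hypothesis paper listed below) can be sketched in plain NumPy. The `magnitude_prune` function below is an illustrative assumption, not this project's actual API:

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the `sparsity` fraction of weights with the smallest
    absolute value; return (pruned_weights, binary_mask).

    NumPy sketch of magnitude pruning, not the project's implementation.
    """
    flat = np.abs(weights).flatten()
    k = int(sparsity * flat.size)        # number of weights to remove
    if k == 0:
        return weights.copy(), np.ones_like(weights)
    threshold = np.sort(flat)[k - 1]     # magnitude cutoff
    mask = (np.abs(weights) > threshold).astype(weights.dtype)
    return weights * mask, mask

# Example: prune 50% of a small weight matrix.
w = np.array([[0.1, -0.8], [0.5, -0.05]])
pruned, mask = magnitude_prune(w, 0.5)
# mask == [[0., 1.], [1., 0.]] -- the two smallest-magnitude weights are zeroed
```

In iterative pruning (as in the Lottery Ticket paper), this mask-then-retrain cycle is repeated several times rather than pruning to the final sparsity in one shot.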
The configurations and hyper-parameters can be found in the `configs` package, where you can change and adjust them.
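As a rough illustration of what such a config might group together (every name and value below is a hypothetical placeholder; the real ones are in the `configs` package):

```python
# Hypothetical config sketch -- names and defaults are assumptions,
# not the values shipped in this project's configs package.
config = {
    "model": "BaseModel",
    "dataset": "mnist",
    "epochs": 1,
    "batch_size": 32,            # assumed default
    "learning_rate": 1e-3,       # assumed default
    "pruning": {
        "final_sparsity": 0.9,   # fraction of weights to remove (assumed)
        "iterations": 10,        # prune/retrain rounds (assumed)
    },
}
```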
- J. Frankle & M. Carbin: The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks
- Pruning Makes Faster and Smaller Neural Networks | Two Minute Papers
- Pruning deep neural networks to make them fast and small
- Neural Network Pruning
- Awesome-Pruning
- Effective TensorFlow 2.0
- This was my first time learning about pruning neural networks!
- In this project I decided to use TensorFlow 2.0, both to learn it and to build an actual project with tf2.
- I tried to follow TensorFlow best practices and write modular code that can be extended to support more types of pruning and models.
- I followed some of the FOR.ai/rl library design ;)
- I tried to implement everything in a customizable way, and also used Keras APIs to integrate with the project.
- I read and watched some useful resources (listed above) that helped me during the project.
- I really enjoyed working on this project!
Amr M. Kayid