DistPro ECCV 2022

This is the official release of “DistPro: Searching A Fast Knowledge Distillation Process via Meta Optimization” (ECCV 2022).

Our method achieves faster knowledge distillation training on ImageNet-1K.
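For readers unfamiliar with knowledge distillation, below is a minimal NumPy sketch of the classic softened-logit KD loss (Hinton et al.). It is only background illustration, not the DistPro method: the function name `kd_loss` and the fixed `T`/`alpha` hyperparameters are assumptions for this sketch, whereas DistPro searches such weightings over the distillation process via meta optimization.

```python
import numpy as np

def softmax(x, T=1.0):
    # temperature-scaled, numerically stable softmax
    z = x / T
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    # soft-target term: KL(teacher || student) at temperature T,
    # scaled by T^2 as in the original KD formulation
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    soft = np.mean(np.sum(p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12)),
                          axis=1)) * T * T
    # hard-target term: standard cross-entropy on ground-truth labels
    p = softmax(student_logits)
    hard = -np.mean(np.log(p[np.arange(len(labels)), labels] + 1e-12))
    # fixed blend of the two terms; DistPro searches weights like this
    # instead of fixing them by hand
    return alpha * soft + (1.0 - alpha) * hard

# tiny usage example with random logits
rng = np.random.default_rng(0)
s = rng.normal(size=(8, 10))      # student logits
t = rng.normal(size=(8, 10))      # teacher logits
y = rng.integers(0, 10, size=8)   # ground-truth labels
loss = kd_loss(s, t, y)
```

The sketch is framework-agnostic on purpose; the repository itself trains with PyTorch via `train_with_distillation.py`.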
Installation

pip install -r requirements.txt
Usage

1. Search the distillation process on CIFAR datasets with sample epochs:

python train_with_distillation.py --paper_setting SETTING_CHOICE --epochs 40
2. Retrain with full epochs:

python train_with_distillation.py --paper_setting SETTING_CHOICE --epochs 240 --alpha_normalization_style 333
Contributing

Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.

Please make sure to update tests as appropriate.
License

Apache-2.0
Citation

@inproceedings{deng2022distpro,
  title={DistPro: Searching A Fast Knowledge Distillation Process via Meta Optimization},
  author={Xueqing Deng and Dawei Sun and Shawn Newsam and Peng Wang},
  booktitle={ECCV},
  year={2022}
}