DistPro ECCV 2022

This is the official release of “DistPro: Searching A Fast Knowledge Distillation Process via Meta Optimization” (ECCV 2022).

Our method achieves faster distillation training on ImageNet-1K.

Installation

pip install -r requirements.txt

Usage

1. Search for a distillation process with a reduced number of epochs

python train_with_distillation.py --paper_setting SETTING_CHOICE --epochs 40

2. Retrain with the full number of epochs

python train_with_distillation.py --paper_setting SETTING_CHOICE --epochs 240 \
--alpha_normalization_style 333
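The two-stage flow above (a short search run that learns distillation weights, then a full retraining run) is built on a bilevel, meta-gradient idea: per-pathway distillation loss weights are updated by the gradient of a validation loss through an inner training step. The following is a minimal toy sketch of that idea only; the function names, the scalar student parameter, and the two hard-coded teacher targets are illustrative assumptions, not the repository's actual implementation, which operates on feature maps across teacher-student layer pairs.

```python
import numpy as np

# Toy bilevel sketch (hypothetical, not the repo's code): a scalar student
# parameter w is trained on an alpha-weighted sum of two "pathway" losses,
# and alpha is updated by a one-step meta gradient of a validation loss.

def pathway_grads(w, t1=2.0, t2=-1.0):
    # Gradients of two distillation losses (w - t)^2 pulling w toward
    # two different (made-up) teacher signals t1 and t2.
    return np.array([2 * (w - t1), 2 * (w - t2)])

def meta_step(w, alpha, lr_inner=0.1, lr_meta=0.05, val_target=1.0):
    g = pathway_grads(w)
    w_new = w - lr_inner * alpha @ g           # inner SGD step on the student
    # Chain rule through the inner step:
    # dL_val/dalpha_i = dL_val/dw_new * dw_new/dalpha_i
    dval_dw = 2 * (w_new - val_target)
    grad_alpha = dval_dw * (-lr_inner * g)     # analytic one-step meta gradient
    alpha = np.clip(alpha - lr_meta * grad_alpha, 0.0, None)
    alpha = alpha / alpha.sum()                # keep weights on the simplex
    return w_new, alpha

w, alpha = 0.0, np.array([0.5, 0.5])
for _ in range(200):
    w, alpha = meta_step(w, alpha)
print(round(w, 2), np.round(alpha, 2))
```

In this toy, the validation loss prefers w near 1.0, so the meta gradient shifts weight toward the pathway whose target is more useful for validation; the same mechanism, applied to many teacher-student pathways, is what the search stage learns before the full retraining run.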

Contributing

Pull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.

Please make sure to update tests as appropriate.

License

Apache-2.0

Citation

@inproceedings{deng2022distpro,
  title={DistPro: Searching A Fast Knowledge Distillation Process via Meta Optimization},
  author={Xueqing Deng and Dawei Sun and Shawn Newsam and Peng Wang},
  booktitle={ECCV},
  year={2022}
}