FreeKD: Knowledge Distillation via Semantic Frequency Prompt

🔥 Official implementation of the paper "FreeKD: Knowledge Distillation via Semantic Frequency Prompt", CVPR 2024.

By Yuan Zhang, Tao Huang, Jiaming Liu, Tao Jiang, Kuan Cheng, Shanghang Zhang


Installation

Install MMRazor 0.x

git clone -b 0.x https://github.com/open-mmlab/mmrazor.git
cd mmrazor
pip install -v -e .
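
To confirm the editable install is importable (this assumes a compatible PyTorch and mmcv-full are already in the environment, as MMRazor 0.x requires):

python -c "import mmrazor; print(mmrazor.__version__)"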

Install pytorch_wavelets

git clone https://github.com/fbcotter/pytorch_wavelets
cd pytorch_wavelets
pip install .
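
As a quick sanity check, the one-liner below runs a single-level Haar DWT with DWTForward on a random tensor, the low-/high-frequency decomposition that FreeKD's frequency prompt builds on; the input shape is arbitrary and only for illustration:

python -c "import torch; from pytorch_wavelets import DWTForward; xfm = DWTForward(J=1, wave='haar'); yl, yh = xfm(torch.randn(1, 3, 64, 64)); print(yl.shape, yh[0].shape)"

Here yl is the low-frequency band and yh[0] stacks the horizontal, vertical, and diagonal high-frequency bands.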

Prepare Dataset

Download the datasets from https://opendatalab.com.
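
For the detection experiments, the configs expect an mmdetection-style COCO layout such as the sketch below; the exact paths are an assumption here and should match data_root in the config you use:

mmrazor/
└── data/
    └── coco/
        ├── annotations/
        ├── train2017/
        └── val2017/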

Note

To distill on detection or segmentation, install mmdetection or mmsegmentation, respectively.
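
For reference, MMRazor 0.x pairs with the mmdet 2.x and mmsegmentation 0.x series; the version pins below are assumptions, so check the compatibility notes for your mmcv-full version:

pip install "mmdet<3.0.0"
pip install "mmsegmentation<1.0.0"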

Reproducing our results

Train students with FreeKD

This repo uses MMRazor as the knowledge distillation toolkit. For environment setup, please see MMRazor's docs/en/get_started.md.

Train the student:

cd mmrazor
bash tools/mmdet/dist_train.sh ${CONFIG} 8 --work-dir ${WORK_DIR}

For example, to reproduce the freekd_retinanet_r101-retinanet_r50_coco result:

bash tools/mmdet/dist_train.sh configs/distill/freekd/freekd_retinanet_r101-retinanet_r50_coco.py 8 --work-dir work_dirs/freekd_retinanet_r101-retinanet_r50
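
To evaluate the distilled student checkpoint, MMRazor's mmdet test entry point should work along the same lines; the script path and the --eval flag follow mmdet 2.x conventions and are assumptions here:

bash tools/mmdet/dist_test.sh ${CONFIG} ${CHECKPOINT} 8 --eval bbox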

Results

  • Baseline settings (COCO mAP):

    | Student                | Teacher                 | Student w/ FreeKD |
    | Faster RCNN-R50 (38.4) | Faster RCNN-R101 (39.8) | 40.8              |
    | RetinaNet-R50 (37.4)   | RetinaNet-R101 (38.9)   | 39.9              |
    | FCOS-R50 (38.5)        | FCOS-R101 (40.8)        | 42.9              |

  • Stronger teachers (COCO mAP):

    | Student                | Teacher                       | Student w/ FreeKD |
    | Faster RCNN-R50 (38.4) | Cascade Mask RCNN-X101 (45.6) | 42.4              |
    | RetinaNet-R50 (37.4)   | RetinaNet-X101 (41.0)         | 41.0              |
    | RepPoints-R50 (38.6)   | RepPoints-R101 (44.2)         | 42.4              |

Visualization

(Figure: mask visualization)

License

This project is released under the Apache 2.0 license.

Citation

If you use FreeKD in your research, please cite our work with the following BibTeX entry:

@article{zhang2023freekd,
  title={FreeKD: Knowledge Distillation via Semantic Frequency Prompt},
  author={Zhang, Yuan and Huang, Tao and Liu, Jiaming and Jiang, Tao and Cheng, Kuan and Zhang, Shanghang},
  journal={arXiv preprint arXiv:2311.12079},
  year={2023}
}
