# DTKD

This is the PyTorch and RecBole implementation for our paper: Dual-Teacher Knowledge Distillation for Strict Cold-Start Recommendation.

## How to use

1. First, create a `saved` directory with the following structure:
   - `saved`
     - `DirectAUT`
     - `LinearT`
     - `MLPS`
     - `ml-1m`
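The layout above can be created in one command (a minimal sketch; the directory names are taken from the list above):

```shell
# Create the 'saved' directory with one subdirectory per teacher/student
# model and one for the ml-1m dataset outputs.
mkdir -p saved/DirectAUT saved/LinearT saved/MLPS saved/ml-1m
```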
2. Train the two types of teachers:
   - `python run_recbole.py -r cs -m DirectAUT`
   - `python run_recbole.py -r cs -m LinearT`

3. Find the saved model paths for `DirectAUT` and `LinearT` in the `saved` directory and copy them into `config.yaml`. Then run the DTKD algorithm: `python run_recbole.py -r kd`
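As an illustration, the teacher checkpoint paths copied into `config.yaml` might look like the fragment below. The key names and file names here are hypothetical, not taken from this repository; use the keys actually defined in the project's `config.yaml` and the checkpoint file names RecBole writes under `saved/`:

```yaml
# Hypothetical example only — key names and checkpoint file names
# must match those in this repository's config.yaml and saved/ directory.
directaut_teacher_path: saved/DirectAUT/DirectAUT-xxx.pth
lineart_teacher_path: saved/LinearT/LinearT-xxx.pth
```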