Authors: Maofeng Tang · Andrei Cozma · Konstantinos Georgiou · Hairong Qi
This is a PyTorch implementation of our NeurIPS 2023 paper: *Cross-Scale MAE: A Tale of Multi-Scale Exploitation in Remote Sensing*.
To run pretraining on a single node, use train.sh. Make sure you edit its contents to match your environment. Alternatively, you can run main_pretrain.py directly.
For multi-GPU training, use train_distributed.sh instead.
To run fine-tuning on a single node, use finetune.sh. Make sure you edit its contents to match your environment. Alternatively, you can run main_finetune.py directly.
For linear probing, use linprobe.sh and main_linprobe.py instead.
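The workflow above can be sketched as follows. This is only a sketch: each script must first be edited for your environment, and the exact flags each one passes to the Python entry points are defined inside the scripts themselves.

```shell
# Run from the repository root after editing the scripts for your environment.
bash train.sh              # single-node pretraining (or run main_pretrain.py directly)
# bash train_distributed.sh  # use this instead for multi-GPU pretraining
bash finetune.sh           # fine-tuning (or run main_finetune.py directly)
bash linprobe.sh           # linear probing (or run main_linprobe.py directly)
```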
Pretrained weights for the models used in the paper can be found here:
Model | Epochs | Pre-trained Checkpoint | MD5
---|---|---|---
ViT-Base | 400 | download | 0c33995da85c112d9602f89d5b186bbc
ViT-Large | 400 | download | e6e4f58c07bbbc4c4dd63fa26c644dd4
You will need to download the weights and place them in a folder named `weights` at the root of the repository.
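For example, the steps above might look like the following (the checkpoint filename is hypothetical; use the actual name of the file you downloaded, and compare each checksum against the table above):

```shell
# Create the weights directory at the repository root, then move the
# downloaded checkpoint into it (filename below is hypothetical).
mkdir -p weights
# mv ~/Downloads/cross_scale_mae_vit_base.pth weights/

# Verify each checkpoint's MD5 against the table above.
for f in weights/*.pth; do
    [ -f "$f" ] && md5sum "$f"
done
true  # keep exit status clean if no checkpoints are present yet
```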
Code in this repository is inspired by the following repositories:
If you find our project helpful, please cite our paper:
```bibtex
@inproceedings{tang2023cross,
  title={Cross-Scale MAE: A Tale of Multiscale Exploitation in Remote Sensing},
  author={Tang, Maofeng and Cozma, Andrei Liviu and Georgiou, Konstantinos and Qi, Hairong},
  booktitle={Thirty-seventh Conference on Neural Information Processing Systems},
  year={2023}
}
```
This project is released under the CC BY-NC 4.0 license. See LICENSE for details.