Offline run of DiffDock-pp #5

Open
slieped opened this issue Apr 24, 2023 · 1 comment
slieped commented Apr 24, 2023

Consider adding utilities / modifying the code to work on offline computing resources.
Most HPC systems do not have a direct internet connection, so using torch.hub to download the ESM model can be problematic.

I came up with a simple solution that could be integrated (or at least mentioned in further examples):

  1. Install the esm package via pip:
    pip install fair-esm
    (https://github.com/facebookresearch/esm)

  2. Download the model and regression .pt files:
    https://dl.fbaipublicfiles.com/fair-esm/models/esm2_t33_650M_UR50D.pt
    https://dl.fbaipublicfiles.com/fair-esm/regression/esm2_t33_650M_UR50D-contact-regression.pt

  3. Import the esm function for loading local checkpoints:
    from esm.pretrained import load_model_and_alphabet_local

  4. Modify the data.train.utils.compute_embedding function [lines 325-327]:
    modelpath = 'path/to/model/esm2_t33_650M_UR50D.pt'
    esm_model, alphabet = load_model_and_alphabet_local(modelpath)
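Putting the steps above together, here is a minimal sketch. The directory layout and helper names (`resolve_esm_checkpoint`, `load_local_esm`) are assumptions for illustration, not DiffDock-PP code; only `load_model_and_alphabet_local` comes from the fair-esm package:

```python
from pathlib import Path


def resolve_esm_checkpoint(model_dir, model_name="esm2_t33_650M_UR50D"):
    """Build the expected paths for the ESM weights and the companion
    contact-regression file, which load_model_and_alphabet_local expects
    to sit next to the main .pt file."""
    model_dir = Path(model_dir)
    return (model_dir / f"{model_name}.pt",
            model_dir / f"{model_name}-contact-regression.pt")


def load_local_esm(model_dir):
    """Load the ESM model and alphabet from pre-downloaded files.

    Requires `pip install fair-esm`; imported lazily so the path helper
    above stays usable without the package installed."""
    from esm.pretrained import load_model_and_alphabet_local

    model_path, regression_path = resolve_esm_checkpoint(model_dir)
    if not (model_path.exists() and regression_path.exists()):
        raise FileNotFoundError(
            "download both .pt files into the model directory first")
    return load_model_and_alphabet_local(str(model_path))
```

With this in place, the `torch.hub` call in the embedding code could be swapped for `esm_model, alphabet = load_local_esm("path/to/model")` on machines without internet access.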

I do know that with torch.hub it is possible to pre-cache the files and then load the pre-downloaded ones, but that is not ideal.
This is just a suggestion to make the tool more scalable and useful for other teams!
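For reference, the torch.hub pre-caching route mentioned above hinges on the `TORCH_HOME` environment variable; torch.hub keeps downloaded weights under `$TORCH_HOME/hub/checkpoints`. A brief sketch (the cache path is a hypothetical example):

```python
import os

# Hypothetical writable cache directory on the cluster, populated by
# copying the pre-downloaded .pt files from a machine with internet access.
cache_dir = "/scratch/torch_cache"
os.environ["TORCH_HOME"] = cache_dir

# torch.hub resolves checkpoints from $TORCH_HOME/hub/checkpoints, so placing
# esm2_t33_650M_UR50D.pt (and its regression file) there lets the existing
# torch.hub-based loading code run without a network call.
checkpoint_dir = os.path.join(cache_dir, "hub", "checkpoints")
print(checkpoint_dir)
```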

Victor M

ketatam (Owner) commented Apr 30, 2023

Hi!

Thanks a lot for the detailed suggestion. I will test your approach and then update the README.md accordingly.
