Caching and test for GP #224
Conversation
LGTM
Don't merge, the injection test is failing randomly.

Removed the GRB model from the injections.py test. Also, we need to check whether caches from the main branch are available to feature branches.

A cache present on the main branch cannot be accessed from feature branches. This means that whenever a PR comes in, new caches are added to the repo alongside the cache of the main branch. Commit 5b374d4 adds a workflow that deletes a PR's caches after it is merged (or closed).
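A post-merge cache-cleanup workflow of this kind can be sketched roughly as follows. This is a hedged illustration, not the actual contents of commit 5b374d4; the workflow name and step labels are assumptions, and `gh cache list`/`gh cache delete` require a recent GitHub CLI:

```yaml
# Hypothetical sketch of a PR cache-cleanup workflow
# (illustrative only; not the actual contents of commit 5b374d4).
name: Cleanup PR caches
on:
  pull_request:
    types: [closed]   # fires both on merge and on close without merge
jobs:
  cleanup:
    runs-on: ubuntu-latest
    permissions:
      actions: write  # needed to delete caches
    steps:
      - name: Delete caches created for this PR
        env:
          GH_TOKEN: ${{ github.token }}
          REPO: ${{ github.repository }}
          BRANCH: refs/pull/${{ github.event.pull_request.number }}/merge
        run: |
          # List cache ids scoped to the PR's merge ref, then delete each one.
          gh cache list --repo "$REPO" --ref "$BRANCH" --limit 100 \
            --json id --jq '.[].id' \
            | while read -r id; do
                gh cache delete "$id" --repo "$REPO"
              done
```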
@mcoughlin This can be merged now. Tests are passing. Cache workflow will be in action after the merge. In another PR I will pin parameters for GRB. |
LGTM!
merged commit d570118 into nuclear-multimessenger-astronomy:main
This PR implements caching of the SVD models so that all future runs can reuse them without downloading the models every time. Fetching models from Zenodo is really slow, so for now the `svdmodel.tar` is hosted on the Potsdam server. Following this, the analysis test has been updated to run on `Bu2019lm`, since it makes use of the model. Also note that if the tests are used as is, you should not use or create (from code) any local folder named `svdmodels`, as this can cause problems.

Note for admins: I would advise you to clear all the caches of this repo in order to have a fresh start.
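The caching step described above could look roughly like this in a GitHub Actions job. This is a minimal sketch under stated assumptions: the cache key, the `svdmodels` path, and the download URL are placeholders, not the PR's actual configuration:

```yaml
# Hypothetical sketch of caching the SVD models in CI
# (path, key, and URL are illustrative assumptions).
- name: Cache SVD models
  id: svd-cache
  uses: actions/cache@v4
  with:
    path: svdmodels        # directory the tests read the models from
    key: svdmodels-v1      # bump the suffix to invalidate stale caches

- name: Download SVD models on cache miss
  if: steps.svd-cache.outputs.cache-hit != 'true'
  run: |
    # Runs only when no cache was restored; the hostname is a placeholder.
    curl -L -o svdmodel.tar https://example-potsdam-server.invalid/svdmodel.tar
    mkdir -p svdmodels
    tar -xf svdmodel.tar -C svdmodels
```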