Caching and test for GP #224

Merged
63 changes: 51 additions & 12 deletions .github/workflows/continous_integration.yml
@@ -1,18 +1,51 @@
# This workflow will install Python dependencies, run tests and lint with a variety of Python versions
# For more information see: https://help.github.com/actions/language-and-framework-guides/using-python-with-github-actions

name: Unit Test nmma

on:
push:
branches: [ main ]
branches: [main]
pull_request:
branches: [ main ]
branches: [main]
workflow_dispatch:

jobs:

download-files:
runs-on: ubuntu-latest

steps:
- name: Checkout branch being tested
uses: actions/checkout@v3

- name: Cache svdmodels directory
uses: actions/cache@v3
with:
path: svdmodels
key: svdmodels-${{ runner.os }}-${{ hashFiles('**/LICENSE') }}
restore-keys: svdmodels-${{ runner.os }}-
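The cache key above combines the runner OS with `hashFiles('**/LICENSE')`, so the key only changes when the hashed file's content changes, while `restore-keys` lets a partial prefix match restore an older cache. A minimal local sketch of that content-hash idea (the function name and prefix are illustrative, not part of the workflow):

```python
import hashlib


def content_cache_key(prefix: str, paths: list[str]) -> str:
    """Build a cache key from a prefix plus a hash of file contents,
    roughly analogous to GitHub Actions' hashFiles() expression."""
    digest = hashlib.sha256()
    for path in sorted(paths):  # sort so the key is order-independent
        with open(path, "rb") as fh:
            digest.update(fh.read())
    return f"{prefix}-{digest.hexdigest()[:16]}"
```

Identical file contents always yield the same key; any edit to a hashed file produces a new key, which is what forces a fresh cache entry.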

- name: Check if SVD models exist
id: check-artifact-download
run: |
if [ -d "svdmodels" ]; then
echo "Artifacts exist, skipping download."
echo "::set-output name=exists::true"
else
echo "Artifacts do not exist, will download."
echo "::set-output name=exists::false"
fi
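The step above simply tests whether `actions/cache` restored the `svdmodels` directory and exports the result for the next step. (Note that `::set-output` has since been deprecated by GitHub in favor of writing to `$GITHUB_OUTPUT`.) The same check, sketched in Python purely for illustration:

```python
import os


def svd_models_present(workspace: str) -> str:
    """Mirror of the shell step: return 'true' when the cached
    svdmodels directory was restored into the workspace, else 'false'."""
    return "true" if os.path.isdir(os.path.join(workspace, "svdmodels")) else "false"
```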
- name: Download SVD models
if: steps.check-artifact-download.outputs.exists == 'false'
run: |
wget --show-progress -O svdmodels.tar https://enlil.gw.physik.uni-potsdam.de/~jhawar/svdmodels/svdmodels.tar
mkdir -p svdmodels
tar -xvf svdmodels.tar -C svdmodels/
rm svdmodels.tar
working-directory: ${{ github.workspace }}

build:

runs-on: ubuntu-latest
needs: download-files
strategy:
fail-fast: false
matrix:
@@ -26,28 +59,34 @@ jobs:
POSTGRES_PASSWORD: anything
ports:
- 5432:5432
# needed because the postgres container does not provide a
# healthcheck
options: >-
--health-cmd pg_isready
--health-interval 10s
--health-timeout 5s
--health-retries 5

steps:
- name: Checkout branch being tested
uses: actions/checkout@v2
uses: actions/checkout@v3

- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v2
uses: actions/setup-python@v3
with:
python-version: ${{ matrix.python-version }}

- name: Restore SVD models from cache
uses: actions/cache@v3
with:
path: svdmodels
key: svdmodels-${{ runner.os }}-${{ hashFiles('**/LICENSE') }}
restore-keys: svdmodels-${{ runner.os }}-

- name: Get pip cache dir
id: pip-cache
run: |
python -m pip install --upgrade pip setuptools wheel
echo "::set-output name=dir::$(pip cache dir)"
- name: pip cache
uses: actions/cache@v2
uses: actions/cache@v3
with:
path: ${{ steps.pip-cache.outputs.dir }}
key: ${{ runner.os }}-pip-2-${{ hashFiles('**/setup.py', '**/requirements.txt') }}
@@ -77,7 +116,7 @@ jobs:
psql -U nmma -h localhost -c "GRANT ALL PRIVILEGES ON DATABASE nmma TO nmma;" nmma
- name: Test with pytest
run: |
python -m coverage run --source nmma -m pytest nmma/tests/*
python -m coverage run --source nmma -m pytest nmma/tests/*.py
env:
LD_LIBRARY_PATH: .:/home/runner/work/nmma/nmma/MultiNest/lib # for Linux
- name: Run Coveralls
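The pytest invocation is tightened from `nmma/tests/*` to `nmma/tests/*.py`, so shell globbing hands pytest only Python files rather than every entry in the directory (data folders, JSON fixtures, and so on). A small illustration of the difference, using hypothetical directory entries:

```python
from fnmatch import fnmatch

# Hypothetical contents of a tests/ directory: two modules plus non-Python entries.
entries = ["analysis.py", "data", "injection.json", "utils_test.py"]

# The old glob `*` would match all four entries; `*.py` keeps only test modules.
py_only = [e for e in entries if fnmatch(e, "*.py")]
```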
33 changes: 33 additions & 0 deletions .github/workflows/delete-branch-cache.yml
@@ -0,0 +1,33 @@
name: Cleanup cache for PR
on:
pull_request:
types:
- closed

jobs:
cleanup:
runs-on: ubuntu-latest
steps:
- name: Check out code
uses: actions/checkout@v3

- name: Cleanup
run: |
gh extension install actions/gh-actions-cache

REPO=${{ github.repository }}
BRANCH="refs/pull/${{ github.event.pull_request.number }}/merge"

echo "Fetching list of cache key"
cacheKeysForPR=$(gh actions-cache list -R $REPO -B $BRANCH -L 100 | cut -f 1 )

## Setting this to not fail the workflow while deleting cache keys.
set +e
echo "Deleting caches..."
for cacheKey in $cacheKeysForPR
do
gh actions-cache delete $cacheKey -R $REPO -B $BRANCH --confirm
done
echo "Done"
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
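The cleanup workflow lists cache keys for the closed PR's merge ref and deletes them one by one, with `set +e` ensuring that a failed deletion does not fail the whole job. That best-effort pattern, sketched in Python with a stand-in command in place of the real `gh actions-cache delete <key> -R <repo> -B <branch> --confirm`:

```python
import subprocess


def delete_cache_keys(keys, command=("true",)):
    """Run a deletion command per key without aborting on failure --
    the analogue of the workflow's `set +e` loop. `command` is a
    stand-in for the real gh CLI call; failures are collected and
    returned rather than raised."""
    failures = []
    for key in keys:
        result = subprocess.run(list(command), check=False)
        if result.returncode != 0:
            failures.append(key)
    return failures
```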
2 changes: 1 addition & 1 deletion nmma/em/utils.py
@@ -4,7 +4,7 @@
import numpy as np
import os
import pandas as pd
from scipy.interpolate import interpolate as interp
import scipy.interpolate as interp
import scipy.signal
import scipy.constants
import scipy.stats
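This one-line change replaces the deprecated `from scipy.interpolate import interpolate as interp` (which reached into a private submodule) with the public namespace import. Downstream calls such as `interp.interp1d` keep working unchanged; a minimal check of the corrected import:

```python
import numpy as np
import scipy.interpolate as interp  # public namespace, matching the fixed import

x = np.array([0.0, 1.0, 2.0])
y = x ** 2  # samples of y = x^2 at the grid points

# Default interp1d is piecewise linear, so f(1.5) lies midway between y=1 and y=4.
f = interp.interp1d(x, y)
value = float(f(1.5))
```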
23 changes: 13 additions & 10 deletions nmma/tests/analysis.py
@@ -3,22 +3,24 @@

from ..em import analysis


def test_analysis():

dataDir = f"{os.path.dirname(__file__)}/data"
workingDir=os.path.dirname(__file__)

dataDir = os.path.join(workingDir, 'data')
svdmodels=os.path.join(workingDir, '../../svdmodels/')

args = Namespace(
model="Me2017",
model="Bu2019lm",
interpolation_type="sklearn_gp",
svd_path="svdmodels",
svd_path=svdmodels,
outdir="outdir",
label="injection",
trigger_time=None,
data=None,
prior="priors/Me2017.prior",
prior="priors/Bu2019lm.prior",
tmin=0.1,
tmax=20.0,
tmax=10.0,
dt=0.5,
log_space_time=False,
photometric_error_budget=0.1,
@@ -30,13 +32,13 @@ def test_analysis():
Ebv_max=0.0,
grb_resolution=5,
jet_type=0,
error_budget="1",
error_budget="0",
sampler="pymultinest",
cpus=1,
nlive=512,
nlive=64,
reactive_sampling=False,
seed=42,
injection=f"{dataDir}/injection.json",
injection=f"{dataDir}/Bu2019lm_injection.json",
injection_num=0,
injection_detection_limit=None,
injection_outfile="outdir/lc.csv",
@@ -66,6 +68,7 @@ def test_analysis():
sample_over_Hubble=False,
sampler_kwargs="{}",
verbose=False,
local_only=True
)

analysis.main(args)
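The reworked test resolves its data and model directories relative to the test module instead of relying on the working directory, then passes everything to `analysis.main` via an `argparse.Namespace`. A sketch of that path-construction pattern (the function and field names here are illustrative, reduced to the two paths the diff changes):

```python
import os
from argparse import Namespace


def build_args(test_file: str) -> Namespace:
    """Assemble paths relative to the test module's location, as the
    updated test does; only two representative fields are shown."""
    working_dir = os.path.dirname(os.path.abspath(test_file))
    return Namespace(
        svd_path=os.path.join(working_dir, "..", "..", "svdmodels"),
        injection=os.path.join(working_dir, "data", "Bu2019lm_injection.json"),
    )
```

Anchoring on `os.path.dirname(__file__)` keeps the test runnable from any working directory, which matters in CI where the checkout and invocation directories can differ.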