Additional Dependencies #372

Merged
205 changes: 135 additions & 70 deletions .github/workflows/continous_integration.yml
Original file line number Diff line number Diff line change
@@ -4,85 +4,150 @@ on:
push:
branches: [main]
paths-ignore:
- 'doc/**'
- "doc/**"
pull_request:
branches: [main]
paths-ignore:
- 'doc/**'
- "doc/**"
workflow_dispatch:

jobs:

build:

runs-on: ubuntu-latest
runs-on: ${{ matrix.os }}
defaults:
run:
shell: bash -el {0}
strategy:
fail-fast: false
matrix:
python-version: ['3.9', '3.10', '3.11', '3.12']
os: ["ubuntu-latest", "macos-latest"]
python-version: ["3.9", "3.10", "3.11", "3.12"]

steps:
- name: Checkout branch being tested
uses: actions/checkout@v4

- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}

- name: Restore SVD models from cache
uses: actions/cache@v4
with:
path: svdmodels
key: svdmodels-${{ runner.os }}-${{ hashFiles('**/LICENSE') }}
restore-keys: svdmodels-${{ runner.os }}-

- name: Get pip cache dir
id: pip-cache
run: |
python -m pip install --upgrade pip setuptools wheel
echo "::set-output name=dir::$(pip cache dir)"
- name: pip cache
uses: actions/cache@v4
with:
path: ${{ steps.pip-cache.outputs.dir }}
key: ${{ runner.os }}-pip-2-${{ hashFiles('**/pyproject.toml', '**/requirements.txt') }}
restore-keys: |
${{ runner.os }}-pip-2-
save-always: true

- name: Install Linux Packages with caching
uses: awalsh128/cache-apt-pkgs-action@latest
with:
packages: openmpi-bin libopenmpi-dev gfortran build-essential libblas3 libblas-dev liblapack3 liblapack-dev libatlas-base-dev texlive texlive-latex-extra texlive-fonts-recommended dvipng cm-super
execute_install_scripts: true

- name: Install dependencies
run: |
sudo apt-get update
sudo apt-get install -y texlive texlive-latex-extra texlive-fonts-recommended dvipng cm-super
python -m pip install --upgrade git+https://github.com/bitranox/wrapt_timeout_decorator.git
python -m pip install pytest pytest-cov flake8 pytest-aiohttp sqlparse freezegun PyJWT joblib coveralls
python -m pip install -r ml_requirements.txt -r grb_requirements.txt
python -m pip install .
git clone https://github.com/JohannesBuchner/MultiNest && cd MultiNest/build && cmake .. && make && cd ../..
pwd

- name: Lint with flake8
run: |
# stop the build if there are Python syntax errors or undefined names
flake8 . --count --select=E9,F63,F7,F82 --show-source --statistics --exclude docs
# exit-zero treats all errors as warnings. The GitHub editor is 127 chars wide
flake8 . --count --exit-zero --max-complexity=10 --max-line-length=127 --statistics --exclude docs,versioneer.py,nmma/_version.py,nmma/tests,nmma/*/__init__.py

- name: Test with pytest
run: |
python -m coverage run --source nmma -m pytest nmma/tests/*.py
env:
LD_LIBRARY_PATH: .:/home/runner/work/nmma/nmma/MultiNest/lib # for Linux
- name: Run Coveralls
if: ${{ success() }}
run: |
coveralls --service=github
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: Checkout branch being tested
uses: actions/checkout@v4

- name: Set up Python with Conda ${{ matrix.python-version }}
uses: conda-incubator/setup-miniconda@v3
with:
python-version: ${{ matrix.python-version }}
architecture: ${{ matrix.os == 'macos-latest' && 'arm64' || 'x86_64' }}
miniforge-version: latest
use-mamba: true
mamba-version: "*"
activate-environment: nmma_env

- name: Restore SVD models from cache
uses: actions/cache@v4
with:
path: svdmodels
key: svdmodels-${{ hashFiles('**/LICENSE') }}
restore-keys: svdmodels-
save-always: true

- name: Get pip cache dir
id: pip-cache
run: |
python -m pip install --upgrade pip setuptools wheel
echo "::set-output name=dir::$(pip cache dir)"

- name: pip cache
uses: actions/cache@v4
with:
path: ${{ steps.pip-cache.outputs.dir }}
key: ${{ runner.os }}-pip-2-${{ hashFiles('**/pyproject.toml', '**/requirements.txt') }}
restore-keys: |
${{ runner.os }}-pip-2-
save-always: true

- name: Update Homebrew
if: matrix.os == 'macos-latest'
run: |
brew update --preinstall

- name: Configure Homebrew cache
if: matrix.os == 'macos-latest'
uses: actions/cache@v4
with:
path: |
~/Library/Caches/Homebrew/openmpi--*
~/Library/Caches/Homebrew/downloads/*--openmpi-*
~/Library/Caches/Homebrew/hdf5--*
~/Library/Caches/Homebrew/downloads/*--hdf5-*
~/Library/Caches/Homebrew/gcc--*
~/Library/Caches/Homebrew/downloads/*--gcc-*
~/Library/Caches/Homebrew/openblas--*
~/Library/Caches/Homebrew/downloads/*--openblas-*
~/Library/Caches/Homebrew/lapack--*
~/Library/Caches/Homebrew/downloads/*--lapack-*
~/Library/Caches/Homebrew/basictex--*
~/Library/Caches/Homebrew/downloads/*--basictex-*
~/Library/Caches/Homebrew/cmake--*
~/Library/Caches/Homebrew/downloads/*--cmake-*

key: brew-${{ hashFiles('**/pyproject.toml') }}
restore-keys: brew-
save-always: true

- name: Install Homebrew dependencies
if: matrix.os == 'macos-latest'
run: |
env HOMEBREW_NO_AUTO_UPDATE=1 brew install openmpi hdf5 gcc openblas lapack basictex cmake
eval "$(/usr/libexec/path_helper)"
sudo tlmgr update --self
sudo tlmgr install collection-latex
sudo tlmgr install collection-fontsrecommended
sudo tlmgr install dvipng
sudo tlmgr install cm-super
sudo tlmgr install type1cm
- name: Install Linux Dependencies
if: matrix.os == 'ubuntu-latest'
uses: awalsh128/cache-apt-pkgs-action@latest
with:
packages: openmpi-bin libopenmpi-dev gfortran build-essential libblas3 libblas-dev liblapack3 liblapack-dev libatlas-base-dev texlive texlive-latex-extra texlive-fonts-recommended dvipng cm-super
execute_install_scripts: true

- name: Install dependencies (macOS)
if: matrix.os == 'macos-latest'
run: |
conda install -c conda-forge pyfftw c-compiler ligo-segments python-ligo-lw
echo "FC=$(which gfortran-11)" >> $GITHUB_ENV
echo "CC=$(which gcc)" >> $GITHUB_ENV
echo "CXX=$(which g++)" >> $GITHUB_ENV

- name: Install dependencies (Ubuntu)
if: matrix.os == 'ubuntu-latest'
run: |
sudo apt-get update
sudo apt-get install -y texlive texlive-latex-extra texlive-fonts-recommended dvipng cm-super
- name: Install Python dependencies
run: |
python -m pip install --upgrade git+https://github.com/bitranox/wrapt_timeout_decorator.git
python -m pip install pytest pytest-cov flake8 pytest-aiohttp sqlparse freezegun PyJWT joblib coveralls
python -m pip install -r ml_requirements.txt -r grb_requirements.txt -r tf_requirements.txt -r sklearn_requirements.txt
python -m pip install .
git clone https://github.com/JohannesBuchner/MultiNest && cd MultiNest/build && rm -rf * && cmake .. && make && cd ../..
pwd

- name: Export Libraries
run: |
echo "LD_LIBRARY_PATH=$HOME/work/nmma/nmma/MultiNest/lib:$LD_LIBRARY_PATH" >> $GITHUB_ENV
echo "DYLD_LIBRARY_PATH=$HOME/work/nmma/nmma/MultiNest/lib:$DYLD_LIBRARY_PATH" >> $GITHUB_ENV

- name: Lint with flake8
run: |
# stop the build if there are Python syntax errors or undefined names
flake8 . --count --select=E9,F63,F7,F82 --show-source --statistics --exclude docs
# exit-zero treats all errors as warnings. The GitHub editor is 127 chars wide
flake8 . --count --exit-zero --max-complexity=10 --max-line-length=127 --statistics --exclude docs,versioneer.py,nmma/_version.py,nmma/tests,nmma/*/__init__.py

- name: Test with pytest
run: |
python -m coverage run --source nmma -m pytest nmma/tests/*.py

- name: Run Coveralls
if: ${{ success() }}
run: |
coveralls --service=github
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
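The "Export Libraries" step above prepends the freshly built MultiNest `lib` directory to the loader search path on both platforms. A minimal Python sketch of that logic (the helper name and workspace path are illustrative, not part of the PR):

```python
import os
from pathlib import Path

def multinest_library_path(workspace: str, current: str = "") -> str:
    # MultiNest is cloned into the workspace root by the install step,
    # so its shared libraries land in <workspace>/MultiNest/lib.
    lib_dir = Path(workspace) / "MultiNest" / "lib"
    # Prepend the new directory while preserving any existing entries,
    # mirroring the LD_LIBRARY_PATH / DYLD_LIBRARY_PATH exports above.
    return f"{lib_dir}:{current}" if current else str(lib_dir)

print(multinest_library_path("/home/runner/work/nmma/nmma",
                             os.environ.get("LD_LIBRARY_PATH", "")))
```

On Linux the dynamic linker reads `LD_LIBRARY_PATH`; on macOS the equivalent is `DYLD_LIBRARY_PATH`, which is why the workflow exports both.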
85 changes: 85 additions & 0 deletions .github/workflows/dependency-installation.yml
@@ -0,0 +1,85 @@
name: Check NMMA dependencies installation
on:
push:
branches: [main]
paths: ["*_requirements.txt", "pyproject.toml"]
pull_request:
branches: [main]
paths: ["*_requirements.txt", "pyproject.toml"]
workflow_dispatch:

jobs:
build:
runs-on: ${{ matrix.os }}
defaults:
run:
shell: bash -el {0}
strategy:
fail-fast: false
matrix:
os: ["ubuntu-latest", "macos-latest"]
python-version: ["3.9", "3.10", "3.11", "3.12"]

steps:
- name: Checkout branch being tested
uses: actions/checkout@v4

- name: Set up Python with Conda ${{ matrix.python-version }}
uses: conda-incubator/setup-miniconda@v3
with:
python-version: ${{ matrix.python-version }}
architecture: ${{ matrix.os == 'macos-latest' && 'arm64' || 'x86_64' }}
miniforge-version: latest
use-mamba: true
mamba-version: "*"
activate-environment: nmma_env

- name: Install linux packages
if: matrix.os == 'ubuntu-latest'
run: |
sudo apt-get update
sudo apt-get install -y openmpi-bin libopenmpi-dev

- name: Install Homebrew packages
if: matrix.os == 'macos-latest'
run: |
brew install openmpi hdf5
conda install ligo-segments python-ligo-lw

- name: Install NMMA (base)
run: |
python -m pip install 'nmma @ git+https://github.com/sahiljhawar/nmma.git@dependencies'

- name: Verify base NMMA installation
run: |
python -c "import nmma; print(f'NMMA version: {nmma.__version__}')"

- name: Install and verify NMMA [grb]
run: |
python -m pip install 'nmma[grb] @ git+https://github.com/sahiljhawar/nmma.git@dependencies'
python -c "import nmma, afterglowpy; print(f'Successfully imported {nmma.__name__}, {afterglowpy.__name__}')"

- name: Install and verify NMMA [production]
run: |
python -m pip install 'nmma[production] @ git+https://github.com/sahiljhawar/nmma.git@dependencies'
python -c "import nmma, parallel_bilby, nestcheck, mpi4py; print(f'Successfully imported {nmma.__name__}, {parallel_bilby.__name__}, {nestcheck.__name__}, {mpi4py.__name__}')"

- name: Install and verify NMMA [neuralnet]
run: |
python -m pip install 'nmma[neuralnet] @ git+https://github.com/sahiljhawar/nmma.git@dependencies'
python -c "import nmma, torch, nflows, torchvision; print(f'Successfully imported {nmma.__name__}, {torch.__name__}, {nflows.__name__}, {torchvision.__name__}')"

- name: Install and verify NMMA [tf]
run: |
python -m pip install 'nmma[tf] @ git+https://github.com/sahiljhawar/nmma.git@dependencies'
python -c "import nmma, tensorflow; print(f'Successfully imported {nmma.__name__}, {tensorflow.__name__}')"

- name: Install and verify NMMA [sklearn]
run: |
python -m pip install 'nmma[sklearn] @ git+https://github.com/sahiljhawar/nmma.git@dependencies'
python -c "import nmma, sklearn; print(f'Successfully imported {nmma.__name__}, {sklearn.__name__}')"

- name: Install and verify NMMA [sampler]
run: |
python -m pip install 'nmma[sampler] @ git+https://github.com/sahiljhawar/nmma.git@dependencies'
python -c "import nmma, ultranest; print(f'Successfully imported {nmma.__name__}, {ultranest.__name__}')"
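Each verification step above installs one extra and then confirms its key modules import cleanly. The same smoke test can be written once, generically; the helper below is a sketch of that pattern, not part of the workflow itself:

```python
import importlib

def verify_imports(modules):
    # Try to import every module an extra is expected to provide,
    # returning the names that failed (an empty list means success).
    missing = []
    for name in modules:
        try:
            importlib.import_module(name)
        except ImportError:
            missing.append(name)
    return missing

# e.g. the [production] extra should make these importable:
#   verify_imports(["parallel_bilby", "nestcheck", "mpi4py"])
print(verify_imports(["json"]))
```

A non-empty return value would identify exactly which dependency of an extra failed to install, rather than failing on the first bad import as the one-liners above do.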
3 changes: 3 additions & 0 deletions pyproject.toml
@@ -39,6 +39,9 @@ optional-dependencies.doc = { file = ["doc_requirements.txt"] }
optional-dependencies.grb = { file = ["grb_requirements.txt"] }
optional-dependencies.production = { file = ["production_requirements.txt"] }
optional-dependencies.neuralnet = { file = ["ml_requirements.txt"] }
optional-dependencies.sklearn = { file = ["sklearn_requirements.txt"] }
optional-dependencies.tf = { file = ["tf_requirements.txt"] }
optional-dependencies.sampler = { file = ["sampler_requirements.txt"] }

[project.scripts]
nmma-generation = "nmma.pbilby.generation:main_nmma"
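With the three new extras (`sklearn`, `tf`, `sampler`) declared above, users opt into each dependency set at install time via standard pip extras syntax. A sketch of the requirement strings involved (the helper is illustrative; the git ref form mirrors the dependency-installation workflow):

```python
def extra_requirement(extra: str, ref: str = "") -> str:
    # Build a pip requirement string for an nmma extra; with a git ref it
    # takes the direct-reference form used in the workflow above.
    base = f"nmma[{extra}]"
    if ref:
        return f"{base} @ git+https://github.com/sahiljhawar/nmma.git@{ref}"
    return base

print(extra_requirement("tf"))
print(extra_requirement("sampler", ref="dependencies"))
```

Multiple extras can also be combined in one install, e.g. `pip install 'nmma[tf,sklearn]'`.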
5 changes: 1 addition & 4 deletions requirements.txt
@@ -7,15 +7,12 @@ matplotlib>=2.0,<3.8
scipy>=1.10
pandas>=1.3.4,<2.0
astropy>=4.3.1
scikit-learn>=1.0.2
pymultinest
sncosmo
dust_extinction
arviz
p_tqdm
ultranest
tornado
notebook
ligo.skymap
healpy
tensorflow
healpy
1 change: 1 addition & 0 deletions sampler_requirements.txt
@@ -0,0 +1 @@
ultranest
1 change: 1 addition & 0 deletions sklearn_requirements.txt
@@ -0,0 +1 @@
scikit-learn>=1.0.2
1 change: 1 addition & 0 deletions tf_requirements.txt
@@ -0,0 +1 @@
tensorflow