Adding neural network inference to NMMA (#370)
* Update analysis.py

changed main to have a neural net option

* Update analysis.py

started adding nn analysis function

* Create embedding.py

* Create resnet.py

* Create normalizingflows.py

* Create dataprocessing.py

* Update analysis.py

* Update analysis.py

* Update analysis.py

indentation error

* Update analysis.py

added missing colon

* Update analysis.py

added print statements

* Update analysis.py

fixed indentation

* Update analysis.py

commented out code for debugging

* Update analysis.py

removed all added code

* Update analysis.py

adding print statements

* Update analysis.py

changed logic in def main

* Update normalizingflows.py

defined num points

* Update analysis.py

added missing colon

* Update analysis.py

changed main function

* Update analysis.py

uncommented filter requirements

* Update analysis.py

added print statement

* Update analysis.py

added dt logic

* Update analysis.py

set up logger and outdir

* Update analysis.py

changed tmin tmax logic

* Update analysis.py

added tmin, tmax, num points, current points

* Update analysis.py

adding injection file option

* Update analysis.py

changed print statement

* Update analysis.py

fixed typo

* Update analysis.py

light curve generation

* Update analysis.py

adding print statements

* Update analysis.py

missing parenthesis

* Update analysis.py

added point specification

* Update analysis.py

creating dataframe

* Update analysis.py

formatting data into df needed for further analysis

* Update analysis.py

fixed indentation

* Update analysis.py

debugging

* Update analysis.py

debugging

* Update analysis.py

* Update analysis.py

debugging

* Update analysis.py

making df

* Update analysis.py

fixed syntax error

* Update analysis.py

added detection limit arg logic

* Update analysis.py

made pre padding dict and df

* Update analysis.py

syntax error

* Update analysis.py

checking padding functionality

* Update dataprocessing.py

made filler functions more flexible to other dataframes, removed column hardcoding

* Update analysis.py

importing padding functions

* added mlmodel

* Delete mlmodel directory

delete copy

* Update analysis.py

importing data processing functions

* Update requirements.txt

added nflows package

* Update requirements.txt

added nflows, torch

* Update dataprocessing.py

change variable names

* Update analysis.py

finished data padding

* Update requirements.txt

* Update requirements.txt

* Update requirements.txt

can't install torch

* Update requirements.txt

reverted back to original requirements

* Update analysis.py

added imports

* Update analysis.py

importing resnet

* Update analysis.py

importing more functions

* Update analysis.py

added functionality for inference

* Create inference.py

* Update analysis.py

added more checks and arg functionality

* Update analysis.py

added functionality if injection parameters are provided

* Update inference.py

added functionality if truth is not available.

* Update inference.py

fixed syntax error

* Update inference.py

fixed variable name error

* Update analysis.py

* Update dataprocessing.py

* Update embedding.py

* Update normalizingflows.py

* Update analysis.py

fixed syntax error

* Update analysis.py

made column list an arg of pad the data func

* Update dataprocessing.py

made column list an arg for padding functions

* Update analysis.py

added print statements

* Update analysis.py

* Add files via upload

adding similarity embedding weights

* Add files via upload

normalizing flow weights

* Update analysis.py

updated weights filepath

* Update analysis.py

added context features

* Update analysis.py

added flow

* Update analysis.py

typo in flow weight path

* Update analysis.py

fixed logic

* Update analysis.py

saving corner plot

* Update analysis.py

more savefig args added

* Update analysis.py

debugging save plot issue

* Update analysis.py

import matplotlib pyplot

* Update analysis.py

trying to save figure

* Update analysis.py

fixed savepath

* Update analysis.py

* Update analysis.py

removing print statements

* Update analysis.py

making sure filters is defined

* Update analysis.py

* Update analysis.py

* Update analysis.py

* Update analysis.py

debugging

* Update analysis.py

adding exceptions

* Update analysis.py

* Create ml_requirements.txt

* Create lfi_analysis.md

* Update lfi_analysis.md

added cli examples

* Update lfi_analysis.md

added instructions for setting up env

* Update analysis.py

added test function for nn_analysis

* Update continous_integration.yml

* Update continous_integration.yml

* Update analysis.py

* add lfi to toctree

* local_only to False in nn

* add injection file and injection path

* update pyproject and add torchvision to req

---------

Co-authored-by: Sahil Jhawar <55475299+sahiljhawar@users.noreply.github.com>
Co-authored-by: Sahil Jhawar <sahil.jhawar448@gmail.com>
3 people authored Jul 18, 2024
1 parent 904ec5b commit 9e88cff
Showing 15 changed files with 1,824 additions and 9 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/continous_integration.yml
@@ -69,7 +69,7 @@ jobs:
sudo apt-get update
sudo apt install -y openmpi-bin libopenmpi-dev gfortran build-essential libblas3 libblas-dev liblapack3 liblapack-dev libatlas-base-dev texlive texlive-latex-extra texlive-fonts-recommended dvipng cm-super
python -m pip install --upgrade git+https://github.com/bitranox/wrapt_timeout_decorator.git
python -m pip install pytest pytest-cov flake8 pytest-aiohttp sqlparse freezegun PyJWT joblib tensorflow afterglowpy coveralls
python -m pip install pytest pytest-cov flake8 pytest-aiohttp sqlparse freezegun PyJWT joblib tensorflow afterglowpy coveralls nflows torch torchvision
python -m pip install .
git clone https://github.com/JohannesBuchner/MultiNest && cd MultiNest/build && cmake .. && make && cd ../..
pwd
2 changes: 2 additions & 0 deletions doc/index.rst
@@ -493,6 +493,8 @@ User Guide
Cluster_Resources
contributing
changelog
lfi_analysis


.. Indices and tables
.. ==================
38 changes: 38 additions & 0 deletions doc/lfi_analysis.md
@@ -0,0 +1,38 @@
# Perform Parameter Estimation Using Likelihood-Free Inference (LFI)

NMMA is adding machine-learning functionality to its existing analysis methods. In this initial integration, a neural-network approach performs parameter estimation on light curves from BNS events. We first describe the limitations, then walk through an example run.

## Limitations

1. Requires the Ka2017 model
2. Requires exactly 3 filters; performs best when given ztfg, ztfr, ztfi
3. Requires a fixed time step dT of 0.25

## Functionality

1. Returns a posterior when a light curve is given
2. Returns a posterior when an injection file is given

## Example

### Set up the environment

In addition to installing the standard `requirements.txt`, you must also run `pip install -r ml_requirements.txt` to install the extra machine-learning dependencies (torch, nflows, torchvision).

### Generate a simulation set

First, we create an injection file that describes our light curves. Running the following command generates a JSON file (injection.json). For Ka2017, this includes the parameters luminosity_distance, timeshift, log10_mej, log10_vej, log10_Xlan, and geocent_time.

nmma_create_injection --prior-file ./priors/Ka2017 --eos-file ./example_files/eos/ALF2.dat --binary-type BNS --filename ./output/injection --n-injection 10 --original-parameters --extension json

We can generate light curves from this injection file with the following command. Here we define the start and end times of the light curve and its filters, and add ZTF-like noise.

lightcurve-generation --model Ka2017 --outdir outdir --outfile-type json --label test --tmin -2 --tmax 20 --dt 0.25 --filters ztfg,ztfr,ztfi --injection ./outdir/injection.json --injection-detection-limit 22.0,22.0,22.0 --ztf-uncertainties
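Because the network expects a fixed time step (dT = 0.25, per the limitations above), this command samples every filter on the same fixed-length grid of epochs. A quick sanity check of the grid size — derived purely from the `--tmin -2 --tmax 20 --dt 0.25` flags above, not from NMMA internals:

```python
import numpy as np

# Time grid implied by --tmin -2 --tmax 20 --dt 0.25
tmin, tmax, dt = -2.0, 20.0, 0.25
times = np.arange(tmin, tmax + dt, dt)  # stop is exclusive, so add dt to include tmax

print(len(times))  # 89 epochs per filter; with 3 filters, 267 magnitudes per event
```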

There are two ways to run the analysis: pass the injection file directly, in which case the analysis generates a light curve on the fly, or pass a pre-generated light curve. To run LFI, the --sampler argument must be neuralnet. The first example below passes the injection file and selects the first injection; the second uses the first light curve generated from that injection file. Both commands write a posterior plot to the provided outdir.

lightcurve-analysis --sampler neuralnet --model Ka2017 --outdir inferences --label with_inj --prior priors/Ka2017.prior --injection outdir/injection.json --injection-num 0 --dt 0.25

lightcurve-analysis --sampler neuralnet --model Ka2017 --outdir inferences --label with_data --data outdir/test_0.json --prior priors/Ka2017.prior
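Observed epochs will not always cover the full fixed grid, so the pipeline pads light-curve data before inference (the commit history above adds padding helpers to `dataprocessing.py`). The sketch below is illustrative only — the function and column names are hypothetical, and it assumes missing epochs are filled with the detection-limit magnitude (22.0, as in the generation command above):

```python
import numpy as np
import pandas as pd

def pad_light_curve(df, columns, times, detection_limit=22.0):
    """Reindex a light curve onto a fixed time grid, filling
    unobserved epochs with the detection-limit magnitude.
    (Hypothetical helper, not NMMA's actual interface.)"""
    padded = df.set_index("t").reindex(times)
    padded[columns] = padded[columns].fillna(detection_limit)
    padded.index.name = "t"
    return padded.reset_index()

times = np.arange(-2.0, 20.25, 0.25)  # the 89-epoch grid from the example
obs = pd.DataFrame({"t": [0.0, 0.5], "ztfg": [18.1, 18.3]})
out = pad_light_curve(obs, ["ztfg"], times)  # 89 rows; 87 padded to 22.0
```

The design point is that every event reaching the network has an identical shape, regardless of how many epochs were actually observed.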


3 changes: 3 additions & 0 deletions ml_requirements.txt
@@ -0,0 +1,3 @@
torch
nflows
torchvision