Define interesting likelihood functions for testing #15

Open
williamjameshandley opened this issue Dec 21, 2018 · 2 comments
Labels: discussion (General discussion points)

@williamjameshandley (Member) commented Dec 21, 2018

This can be done in parallel with #4, in order to try to find a good neural network architecture (cc @melli1992).

@cbalazs (Collaborator) commented Dec 22, 2018

To get the ball rolling, here are a few initial suggestions.

  1. Goldstein-Price function:
    f(x1, x2) = [1 + (x1 + x2 + 1)^2 (19 − 14 x1 + 3 x1^2 − 14 x2 + 6 x1 x2 + 3 x2^2)]
    × [30 + (2 x1 − 3 x2)^2 (18 − 32 x1 + 12 x1^2 + 48 x2 − 36 x1 x2 + 27 x2^2)]
    The Goldstein-Price function is a two-dimensional function designed to test the convergence rate of global optimisers, since it is very flat near the global minimum. Its drawbacks are that it does not have many local minima/maxima and it is only two-dimensional. (The typical domain of use is x_i ∈ [−2, 2], i = 1, 2, and the global minimum is f(0, −1) = 3.)

  2. SpikeGrid function:
    f(x) = 1 − |∏_{i=1}^{d} sin(π x_i / (2 c_i))|^s
    The SpikeGrid function consists of a grid of 'spikes', with s controlling the 'sharpness' of the spikes. It is similar to the egg-holder function but nastier, since the spikes can be made arbitrarily sharp. Its disadvantage is that it is periodic, so it might be 'easy' to learn.

  3. Schwefel function:
    f(x) = 418.9829 d − ∑_{i=1}^{d} x_i sin(√|x_i|)
    The Schwefel function is highly multi-modal, similar to the egg-holder function but better for testing, since the depth and height of the minima and maxima vary with location. It is still periodic, though. (The typical domain of use is x_i ∈ [−500, 500], i = 1, …, d, and the global minimum is f(420.9687, …, 420.9687) = 0.) A NumPy sketch of these three analytic functions is given after this list.

  4. CMSSM likelihood function. We cannot get more realistic than using a physical test case! GAMBIT sampled the CMSSM and calculated a likelihood function (including many observables) in 2017. The function is very well sampled: on the order of 100 million samples were generated! The samples are available for download at https://zenodo.org/communities/gambit-official/. Drawbacks are: the function is defined over a roughly 14-dimensional space (4 model parameters plus ~10 nuisance parameters), it is only available at discrete points (i.e. defined numerically), and the samples can be cumbersome to handle. Similar likelihood functions are available for the NUHM1, NUHM2, the MSSM-7, the SM + scalar/fermion/vector singlet, and QCD axion models.
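To make the analytic suggestions concrete, here is a minimal NumPy sketch of functions 1-3 above. The SpikeGrid scale c and sharpness s are free parameters; the defaults below are illustrative assumptions rather than part of the original definitions.

```python
import numpy as np

def goldstein_price(x):
    """Goldstein-Price function on [-2, 2]^2; global minimum f(0, -1) = 3."""
    x1, x2 = x
    a = 1 + (x1 + x2 + 1)**2 * (19 - 14*x1 + 3*x1**2 - 14*x2 + 6*x1*x2 + 3*x2**2)
    b = 30 + (2*x1 - 3*x2)**2 * (18 - 32*x1 + 12*x1**2 + 48*x2 - 36*x1*x2 + 27*x2**2)
    return a * b

def spikegrid(x, c=1.0, s=10.0):
    """SpikeGrid: 1 - |prod_i sin(pi x_i / (2 c_i))|^s.

    Larger s gives sharper spikes; c may be a scalar or a per-dimension array.
    The defaults c=1, s=10 are illustrative choices, not from the original post.
    """
    x = np.asarray(x)
    return 1 - np.abs(np.prod(np.sin(np.pi * x / (2 * c))))**s

def schwefel(x):
    """Schwefel function on [-500, 500]^d; global minimum ~0 at x_i = 420.9687."""
    x = np.asarray(x)
    return 418.9829 * x.size - np.sum(x * np.sin(np.sqrt(np.abs(x))))
```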

@williamjameshandley (Member, Author) commented Dec 27, 2018

These are all good ideas, @cbalazs. Other examples might include the ones in the MultiNest paper (e.g. Rastrigin, Rosenbrock, Gaussian shell and Himmelblau), sketched below.
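For reference, here is a hedged sketch of those standard test functions. The Gaussian shell centre c, radius r and width w below are illustrative values, not the ones used in the MultiNest paper.

```python
import numpy as np

def rastrigin(x):
    """Rastrigin function; global minimum f(0, ..., 0) = 0."""
    x = np.asarray(x)
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def rosenbrock(x):
    """Rosenbrock function; global minimum f(1, ..., 1) = 0."""
    x = np.asarray(x)
    return np.sum(100 * (x[1:] - x[:-1]**2)**2 + (1 - x[:-1])**2)

def himmelblau(x):
    """Himmelblau function (2D); four global minima, all with f = 0."""
    x1, x2 = x
    return (x1**2 + x2 - 11)**2 + (x1 + x2**2 - 7)**2

def gaussian_shell(x, c=0.0, r=2.0, w=0.1):
    """Single Gaussian shell likelihood of radius r and width w centred on c
    (the values of c, r and w here are illustrative assumptions)."""
    x = np.asarray(x)
    return np.exp(-(np.linalg.norm(x - c) - r)**2 / (2 * w**2)) / np.sqrt(2 * np.pi * w**2)
```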

I think the best way to start on this would be to create a module pybambi/example_likelihoods.py and define these functions there. There should also be a method for generating samples from these distributions using MultiNest and PolyChord (interfaces are provided in the pybambi.MultiNest and pybambi.PolyChord modules), using the outputs of either the _equal_weights.txt or .txt files. You'll also need a corresponding test file, tests/test_example_likelihoods.py, to confirm that the functions produce the correct values for a small set of example inputs.
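As an illustration, the test file might look something like the following sketch; the module path and function names here are assumptions based on the layout proposed above.

```python
# tests/test_example_likelihoods.py -- illustrative sketch; the module path
# pybambi.example_likelihoods and the function names are assumed, not final.
import numpy as np

from pybambi.example_likelihoods import goldstein_price, schwefel


def test_goldstein_price_minimum():
    # Known global minimum of Goldstein-Price: f(0, -1) = 3
    assert np.isclose(goldstein_price([0.0, -1.0]), 3.0)


def test_schwefel_minimum():
    # Known global minimum of Schwefel: f(420.9687, ..., 420.9687) ~ 0
    x = np.full(4, 420.9687)
    assert np.isclose(schwefel(x), 0.0, atol=1e-2)
```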

Once you've done that on a local fork, you should create a pull request where we can discuss any further changes to the code. Reference this issue with the tag #15.

bstienen added a commit to DarkMachines/high-dimensional-sampling that referenced this issue Sep 16, 2019
Add optimisation functions suggested by Csaba (DarkMachines/pyBAMBI#15)