
Commit

update Readme
fabsig committed Aug 24, 2021
1 parent 553030b commit cec1dff
Showing 1 changed file with 2 additions and 2 deletions.
README.md: 4 changes (2 additions, 2 deletions)
@@ -32,7 +32,7 @@ GPBoost: Combining Tree-Boosting with Gaussian Process and Mixed Effects Models
## Modeling background
The GPBoost library allows for combining tree-boosting with Gaussian process and grouped random effects models in order to leverage advantages of both techniques and to remedy drawbacks of these two modeling approaches.

- #### Background on Gaussian process and grouped random effects models
+ ### Background on Gaussian process and grouped random effects models

**Tree-boosting** has the following **advantages and disadvantages**:

@@ -53,7 +53,7 @@ The GPBoost library allows for combining tree-boosting with Gaussian process and
| - Modeling of dependency which, among other things, can allow for more efficient learning of the fixed effects (predictor) function | |
| - Grouped random effects can be used for modeling high-cardinality categorical variables | |

- #### GPBoost and LaGaBoost algorithms
+ ### GPBoost and LaGaBoost algorithms

The GPBoost library implements two algorithms for combining tree-boosting with Gaussian process and grouped random effects models: the **GPBoost algorithm** [(Sigrist, 2020)](http://arxiv.org/abs/2004.02653) for data with a Gaussian likelihood (conditional distribution of data) and the **LaGaBoost algorithm** [(Sigrist, 2021)](https://arxiv.org/abs/2105.08966) for data with non-Gaussian likelihoods.

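The modeling background described in the changed README section can be illustrated with a minimal simulation sketch (plain Python, not GPBoost library code; all names here are hypothetical): data generated from a grouped random effects model y_ij = f(x_ij) + b_i + e_ij, i.e. a nonlinear fixed effects function plus a group-level random effect, which is the kind of structure the GPBoost and LaGaBoost algorithms are designed to fit.

```python
# Illustrative sketch only: simulate data with a nonlinear fixed effects
# function f(x) and grouped random effects b_i, as in y_ij = f(x_ij) + b_i + e_ij.
import random

random.seed(0)
n_groups, n_per_group = 10, 50
sigma_b, sigma_e = 1.0, 0.5      # std. dev. of random effects and error term

def f(x):
    # Hypothetical nonlinear fixed effects (predictor) function
    return x ** 2

data = []  # list of (group_id, x, y) observations
for i in range(n_groups):
    b_i = random.gauss(0.0, sigma_b)      # one random effect shared by group i
    for _ in range(n_per_group):
        x = random.uniform(-1.0, 1.0)
        y = f(x) + b_i + random.gauss(0.0, sigma_e)
        data.append((i, x, y))

print(len(data))  # 500 observations across 10 groups
```

Because observations in the same group share b_i, they are correlated; modeling that dependency is what allows more efficient learning of f and handles the group identifier as a high-cardinality categorical variable, as the table in the diff above notes.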
