Bug in objective = "reg:pseudohubererror" and xgb.plot.tree() #10988
The "reg:quantileerror" objective was added in XGBoost 2.0, which isn't available on CRAN. You should install the R package from source to use the feature.

@hcho3 thanks. @kashif @darxriggs Do you know why objective = "reg:pseudohubererror" predicts a constant and why xgb.plot.tree() shows nothing?
Hi all, I have found the key: it is about setting the base score. Switching the initial prediction from 0.5 to a weakly informative value (here, the median of the label) fixes the problem. Will the new support for "Intercept" (https://xgboost.readthedocs.io/en/latest/tutorials/intercept.html) in version 2.0 solve this problem automatically? I think it is also important to document the default base_score. See the impact when base_score is set:

# Solution: set base_score
library(xgboost)
library(tidyverse)
data(mtcars)

# All columns except mpg as features, mpg as the label
Data <- mtcars %>%
  {xgb.DMatrix(
    data  = (.) %>% select(-mpg) %>% as.matrix(),
    label = (.) %>% pull(mpg))}

Model <- xgboost(
  data = Data,
  objective = "reg:pseudohubererror",
  base_score = median(mtcars$mpg),
  max.depth = 3, eta = 1, nrounds = 100)
# [1]   train-mphe:2.019801
# [100] train-mphe:0.000000

Model <- xgboost(
  data = Data,
  objective = "reg:pseudohubererror", eval_metric = "mae",
  base_score = median(mtcars$mpg),
  max.depth = 3, eta = 1, nrounds = 100)
# [1]   train-mae:2.685595
# [100] train-mae:0.000482
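One way to see why the base score matters so much here: the pseudo-Huber objective has a bounded gradient and a hessian that vanishes as the residual grows, so when the initial prediction (0.5 by default) is far from labels like mpg ≈ 20, Newton-style boosting steps are dominated by the regularization term and training stalls. A minimal sketch in Python (using the textbook pseudo-Huber formulas with slope delta, not XGBoost's internal code):

```python
import math

def pseudo_huber_grad_hess(pred, label, delta=1.0):
    """Gradient and hessian of the pseudo-Huber loss
    L(r) = delta^2 * (sqrt(1 + (r/delta)^2) - 1), with r = pred - label."""
    r = pred - label
    z = math.sqrt(1.0 + (r / delta) ** 2)
    grad = r / z        # magnitude bounded by delta
    hess = z ** -3      # tends to 0 as |r| grows
    return grad, hess

# Near the label: a well-conditioned Newton step
g, h = pseudo_huber_grad_hess(pred=20.5, label=20.0)
print(g, h)   # gradient ~0.447, hessian ~0.716

# Default base_score = 0.5 vs. mpg around 20: gradient saturated, hessian ~0
g, h = pseudo_huber_grad_hess(pred=0.5, label=20.0)
print(g, h)   # gradient ~ -0.999, hessian ~1.3e-4
```

With a nearly zero hessian, the per-leaf Newton update grad-sum / (hess-sum + lambda) is throttled by the default L2 term, which is consistent with the flat training seen without base_score.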
Thank you for sharing; we will have to do some experiments once the new R interface is ready. The latest XGBoost uses the median by default, so I suspect it should work.
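For reference, the reason the median is a sensible default intercept: among constant predictions, the median minimizes mean absolute error, so a constant-only model for an L1-like objective should predict the median of the labels. A quick illustrative check in Python (toy data, not XGBoost code):

```python
# Toy labels (a few mpg-like values); the median should minimize MAE
labels = [13.3, 19.2, 21.0, 22.8, 30.4, 33.9]

def mae(c, ys):
    """Mean absolute error of the constant prediction c."""
    return sum(abs(y - c) for y in ys) / len(ys)

# Median of an even-length list: mean of the two middle values
mid = sorted(labels)[len(labels) // 2 - 1 : len(labels) // 2 + 1]
med = sum(mid) / 2

# MAE at the median is no worse than at nearby constants
assert mae(med, labels) <= mae(med + 1.0, labels)
assert mae(med, labels) <= mae(med - 1.0, labels)
print(round(med, 2))   # 21.9
```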
Hi @mattn, I wanted to use XGBoost for quantile regression but found that the pseudo Huber loss does no better than a null model. Currently, objective = "reg:pseudohubererror" predicts every case as 0.5, with no information learnt at all, no matter how the other parameters are specified. Also, xgb.plot.tree() shows nothing; the Viewer panel is blank. Further, objective = "reg:quantileerror" results in an error, although the online documentation mentions it: https://xgboost.readthedocs.io/en/latest/parameter.html. I am using the latest R version, 1.7.8.1.
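The quantile objective discussed above optimizes the pinball loss; for a constant-only model its minimizer is the requested quantile, analogous to the median for MAE, which is why a good intercept matters there too. A small illustrative check in Python (standard pinball-loss definition, not XGBoost's implementation):

```python
def pinball(c, ys, alpha):
    """Pinball (quantile) loss at level alpha for constant prediction c."""
    return sum(alpha * (y - c) if y >= c else (1 - alpha) * (c - y)
               for y in ys) / len(ys)

labels = list(range(1, 101))   # toy labels 1..100
alpha = 0.9
q = 90                         # roughly the 0.9-quantile of 1..100

# The quantile beats nearby constants under the pinball loss
assert pinball(q, labels, alpha) <= pinball(q + 5, labels, alpha)
assert pinball(q, labels, alpha) <= pinball(q - 5, labels, alpha)
print(pinball(q, labels, alpha))   # 4.5
```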