Is it possible to identify the features that most impact the "sensitivity" of treatment with SHAP values?
If I use, for example, the ClassTransformation model for uplift modeling, then when I try to use shap.Explainer(best_model.estimator), it expects the base estimator object rather than the ClassTransformation object. So I suspect the SHAP values reflect the impact on the probability of the (transformed) label, not on the model's uplift prediction. Am I right?
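For concreteness, here is a minimal sketch of the setup being asked about, assuming sklift's ClassTransformation and the shap package; the synthetic data, base classifier, and variable names are illustrative only, not taken from my actual code:

```python
import numpy as np
import shap
from sklearn.ensemble import RandomForestClassifier
from sklift.models import ClassTransformation

# Synthetic data for illustration: features, binary outcome, binary treatment.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 5))
treatment = rng.integers(0, 2, size=500)
y = (X[:, 0] + 0.5 * treatment * X[:, 1] + rng.normal(size=500) > 0).astype(int)

# ClassTransformation trains a single base classifier on a transformed
# binary target, then derives uplift from its predicted probability.
uplift_model = ClassTransformation(estimator=RandomForestClassifier(random_state=0))
uplift_model.fit(X, y, treatment)

# shap.Explainer only accepts the fitted base estimator, not the
# ClassTransformation wrapper itself, so the resulting SHAP values
# explain the base classifier's class probability.
explainer = shap.Explainer(uplift_model.estimator)
shap_values = explainer(X)
```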