```python
df_with_missing = prepare_training_data().iloc[:, :12]
print("Null values in every column\n", df_with_missing.isnull().sum(axis=0))

# impute missing values
df_with_missing_imputed = datawig.SimpleImputer.complete(df_with_missing, precision_threshold=0.8)
print("Null values in every column\n", df_with_missing_imputed.isnull().sum(axis=0))
```
Mainly two problems:

1. The null values are the same before and after running the model.
2. With the above 12 features, the run takes an indefinite amount of time (I ran the code for 30 minutes and it was still running).
When the precision threshold is set above 0.0, datawig will only impute a value when it is 'certain' enough that the imputation will be correct. With a threshold of 0.8, you only get an imputation when the model reached a precision of 0.8 on an independent validation set; this threshold is calibrated for each value separately. So if datawig cannot impute values with reasonably high precision, you are left with Nones/NaNs. If you would like more imputations (at lower precision), lower the precision threshold.
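To illustrate the behaviour described above, here is a minimal, hypothetical sketch (pure pandas, not datawig's internals): an imputation for a column is kept only if that column's calibrated precision, estimated on a held-out validation set, meets the threshold. The function name and the precision numbers are made up for illustration.

```python
import numpy as np
import pandas as pd

def apply_precision_threshold(original, imputed, precisions, threshold=0.8):
    """Keep imputed values only for columns whose calibrated precision
    (estimated on a held-out validation set) meets the threshold;
    otherwise the original NaNs stay in place.
    `precisions` maps column name -> estimated precision (hypothetical)."""
    result = original.copy()
    for col, prec in precisions.items():
        if prec >= threshold:
            result[col] = original[col].fillna(imputed[col])
    return result

original = pd.DataFrame({"a": [1.0, np.nan], "b": [np.nan, "y"]})
imputed = pd.DataFrame({"a": [1.0, 2.0], "b": ["x", "y"]})

# Column "a" clears the 0.8 threshold, column "b" does not,
# so b's NaN survives -- matching the behaviour described above.
result = apply_precision_threshold(original, imputed, {"a": 0.9, "b": 0.5})
```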
As for the long runtime: the model selection / hyperparameter optimization can take a long time. You can try turning off the HPO or reducing the number of dimensions when calling complete.
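A sketch of both suggestions, assuming a frame shaped like the one in the issue (the column names and data here are made up, and the datawig call is shown only as a comment since it is the maintainer-described `hpo` option):

```python
import numpy as np
import pandas as pd

# Stand-in for prepare_training_data(); 12 numeric columns for illustration.
df_with_missing = pd.DataFrame(
    {"f%d" % i: np.random.rand(50) for i in range(12)}
)
df_with_missing.iloc[::7, 0] = np.nan  # sprinkle in some missing values

# Fewer columns means a smaller model-selection search space, so it runs faster.
df_small = df_with_missing.iloc[:, :5]

# With datawig installed, turning off hyperparameter optimization
# (the hpo option mentioned above) also cuts runtime:
# imputed = datawig.SimpleImputer.complete(df_small, precision_threshold=0.8, hpo=False)
```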
Versions:
I have string, float, and integer data as input
Am I missing something?
@felixbiessmann