Add argparse parametrization for the finetuning script #7
Labels: enhancement, good first issue, help wanted
Similar to what is currently available in `download_model.py`, add argparse support in `finetune_nli.py` for the following parameters:

- `model_name`, default `'models/scibert'`, type `str`
- `batch_size`, default `64`, type `int`
- `model_save_path`, default `'models/scibert_nli'`, type `str`
- `num_epochs`, default `2`, type `int`
- `warmup_steps`, default `None`, not required
- `do_mean_pooling`, with `action='store_true'`
- `do_cls_pooling`, with `action='store_true'`
- `do_max_pooling`, with `action='store_true'`
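The parameter list above could be wired up roughly as follows. This is a hypothetical sketch, not the actual `finetune_nli.py`: flag names and defaults follow the issue, but the real script's argument handling may differ.

```python
import argparse


def build_parser():
    # Hypothetical CLI sketch; names and defaults follow the issue text.
    parser = argparse.ArgumentParser(description="Finetune an NLI model.")
    parser.add_argument("--model_name", type=str, default="models/scibert")
    parser.add_argument("--batch_size", type=int, default=64)
    parser.add_argument("--model_save_path", type=str, default="models/scibert_nli")
    parser.add_argument("--num_epochs", type=int, default=2)
    # Optional: left as None so the script can later derive a default.
    parser.add_argument("--warmup_steps", type=int, default=None, required=False)
    # Pooling flags; mutual exclusivity is enforced after parsing.
    parser.add_argument("--do_mean_pooling", action="store_true")
    parser.add_argument("--do_cls_pooling", action="store_true")
    parser.add_argument("--do_max_pooling", action="store_true")
    return parser


# Example: parse an explicit argv list instead of sys.argv.
args = build_parser().parse_args(["--batch_size", "32", "--do_cls_pooling"])
```

An alternative design would be a single `--pooling {mean,cls,max}` choice argument, which makes the exclusivity check unnecessary, but the three boolean flags match what the issue asks for.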
Then:
- Add a check that only one of the pooling flags is set (raise an `AttributeError` if more than one is). If none is specified, use the mean pooling strategy.
- Check whether the `warmup_steps` parameter is already set before defaulting it to 10% of the training steps: if it is, keep the user-defined value.
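The two checks could look something like the sketch below. The helper names (`validate_pooling`, `resolve_warmup_steps`) and the `num_training_steps` parameter are hypothetical, introduced here only for illustration.

```python
from argparse import Namespace


def validate_pooling(args):
    # Hypothetical helper: allow at most one pooling flag;
    # fall back to mean pooling when none is given.
    flags = [args.do_mean_pooling, args.do_cls_pooling, args.do_max_pooling]
    if sum(flags) > 1:
        raise AttributeError("Choose at most one pooling strategy.")
    if sum(flags) == 0:
        args.do_mean_pooling = True
    return args


def resolve_warmup_steps(args, num_training_steps):
    # Keep a user-defined warmup_steps; otherwise default to 10% of training.
    if args.warmup_steps is None:
        args.warmup_steps = int(0.1 * num_training_steps)
    return args


# Example with a plain Namespace standing in for parsed args.
args = Namespace(do_mean_pooling=False, do_cls_pooling=False,
                 do_max_pooling=False, warmup_steps=None)
args = validate_pooling(args)
args = resolve_warmup_steps(args, num_training_steps=1000)
```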