Hello,
When I tried to replicate your binary_adapter experiment using the VTAB-1k dataset, I was unable to reproduce the results that you reported. I would like to discuss some potential issues with the training configuration that might be causing this discrepancy.
In similar works such as VPT and SSF, different hyperparameters (learning rate, weight decay, drop-path rate, etc.) are used for the various datasets within VTAB-1k. However, the train.sh script in the binary_adapter codebase doesn't seem to account for these variations and applies the default hyperparameters universally.
Could you advise on whether I should:
1. Conduct a grid search to find the best hyperparameter set for each dataset, or
2. Use the hyperparameter settings from another published work, such as SSF?
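If it helps frame the first option: a per-dataset grid search could be sketched as below. This is a hypothetical outline, not code from the binary_adapter repository; `train_and_eval` stands in for an actual training run (e.g. launching train.sh and reading back validation accuracy), and its placeholder scoring here exists only so the sketch executes.

```python
from itertools import product

def train_and_eval(dataset, lr, weight_decay):
    """Hypothetical stand-in for one fine-tuning run.

    A real implementation would launch training on `dataset` with the
    given settings and return validation accuracy; this placeholder
    just returns a deterministic score so the sketch is runnable.
    """
    return -abs(lr - 1e-3) - abs(weight_decay - 1e-4)

def grid_search(dataset, lrs, wds):
    """Return the (lr, weight_decay) pair with the best validation score."""
    best_cfg, best_score = None, float("-inf")
    for lr, wd in product(lrs, wds):
        score = train_and_eval(dataset, lr, wd)
        if score > best_score:
            best_cfg, best_score = (lr, wd), score
    return best_cfg, best_score

# One search per VTAB-1k dataset; the grids here are illustrative.
cfg, score = grid_search("cifar100", [1e-4, 1e-3, 1e-2], [1e-5, 1e-4, 1e-3])
```

The same loop would be repeated for each of the 19 VTAB-1k datasets, which is why per-dataset search is costly compared to a single shared configuration.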
Your insights would be greatly appreciated as I continue my experiments.
Looking forward to your reply!
In our experiments, we only searched over the scale factor. All experiments were conducted on RTX 3090 GPUs and may exhibit slight variations in results when run on different devices. Further tuning of hyperparameters such as learning rate and weight decay could potentially improve performance.
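For concreteness, a one-dimensional scale-factor search like the one described above could be as simple as picking the candidate with the best validation accuracy across separate runs. The accuracy numbers below are made up for illustration; they are not results from the paper.

```python
# Hypothetical validation accuracies from separate training runs,
# one per candidate scale factor (all values are illustrative).
results = {0.01: 71.2, 0.1: 73.5, 1.0: 72.8, 10.0: 70.1}

# Select the scale factor with the highest validation accuracy.
best_scale = max(results, key=results.get)
```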