
Distributed training setup #4

Draft
wants to merge 27 commits into main

Conversation

@MicPie commented Jan 12, 2022

PR for the distributed training setup.

…ytorch (state commit: 5a255eab032bcd821c2038c808b9682e485b3f1a)
@MicPie (Author) commented Jan 12, 2022

Packages I am currently working on:

  • grad cache
  • PyTorch AMP FP16 training (see the sketch after this list)
  • LR schedule
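
For the AMP FP16 and LR schedule items, roughly what I have in mind. This is a minimal sketch, assuming a CLIP-style two-tower `model`, a `(images, texts)` loader, and a `clip_loss` placeholder; none of these names are the final API:

```python
import math
import torch

def cosine_with_warmup(step, warmup_steps, total_steps, base_lr, min_lr=0.0):
    # Linear warmup, then cosine decay down to min_lr.
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    progress = (step - warmup_steps) / max(1, total_steps - warmup_steps)
    return min_lr + 0.5 * (base_lr - min_lr) * (1.0 + math.cos(math.pi * progress))

def train_one_epoch(model, loader, optimizer, scaler, clip_loss, base_lr,
                    warmup_steps, total_steps, start_step=0, device="cuda"):
    # scaler = torch.cuda.amp.GradScaler(), created once for the whole run.
    step = start_step
    for images, texts in loader:
        # Set the LR for this step from the schedule.
        lr = cosine_with_warmup(step, warmup_steps, total_steps, base_lr)
        for group in optimizer.param_groups:
            group["lr"] = lr

        optimizer.zero_grad(set_to_none=True)
        with torch.cuda.amp.autocast():      # forward pass in mixed precision
            image_emb, text_emb = model(images.to(device), texts.to(device))
            loss = clip_loss(image_emb, text_emb)
        scaler.scale(loss).backward()        # scaled backward to avoid FP16 underflow
        scaler.step(optimizer)
        scaler.update()
        step += 1
    return step
```

The grad cache part still needs to be fitted around this loop (chunked forward/backward), so the loop above will likely change once that lands.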

Other packages that will be needed:

  • review and check the (web)dataset setup, incl. text mask output and the validation dataset
  • add accuracy logging
  • add ImageNet eval (see the zero-shot eval sketch after this list)
  • 8-bit Adam / ZeRO optimizer
  • test Horovod training if needed
  • test DeepSpeed training if needed
  • address the small TODOs in the code base
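
For the accuracy logging and ImageNet eval items, a first sketch of a zero-shot top-1 eval. It assumes a CLIP-style interface (`model.encode_image`, `model.encode_text`, a `tokenize` helper, and a single prompt template); adjust to whatever our model actually exposes:

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def zero_shot_accuracy(model, tokenize, imagenet_loader, class_names, device="cuda"):
    # One text embedding per class from a simple prompt template.
    prompts = tokenize([f"a photo of a {name}" for name in class_names]).to(device)
    text_emb = F.normalize(model.encode_text(prompts), dim=-1)

    correct, total = 0, 0
    for images, labels in imagenet_loader:
        image_emb = F.normalize(model.encode_image(images.to(device)), dim=-1)
        logits = image_emb @ text_emb.t()        # cosine similarity to every class
        preds = logits.argmax(dim=-1).cpu()
        correct += (preds == labels).sum().item()
        total += labels.numel()

    top1 = correct / total
    print(f"zero-shot ImageNet top-1: {top1:.4f}")  # swap for proper accuracy logging later
    return top1
```

For the 8-bit Adam point: if we go with bitsandbytes, `bnb.optim.Adam8bit` should be usable as a drop-in replacement for `torch.optim.Adam`, but whether it plays nicely with grad cache and ZeRO still needs testing.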

Other stuff:

  • add the Hopfield network for CLOOB (InfoLOOB is already there); see the sketch after this list
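
A rough sketch of the missing CLOOB pieces: a modern-Hopfield retrieval step over the batch plus an InfoLOOB loss that drops the positive pair from the denominator. The exact retrieval pairing and the temperature/scaling should be double-checked against the CLOOB paper and reference code; embeddings are assumed L2-normalized:

```python
import torch
import torch.nn.functional as F

def hopfield_retrieval(state, stored, beta=8.0):
    # One update step of a modern Hopfield network: softmax-weighted average of the
    # stored patterns (here: the current batch), projected back to the unit sphere.
    attn = torch.softmax(beta * state @ stored.t(), dim=-1)
    return F.normalize(attn @ stored, dim=-1)

def info_loob(anchors, positives, inv_tau=30.0):
    # InfoLOOB: like InfoNCE, but the positive pair is excluded from the denominator.
    sim = inv_tau * anchors @ positives.t()                  # (N, N) similarities
    pos = sim.diagonal()                                     # positive-pair terms
    mask = torch.eye(sim.size(0), dtype=torch.bool, device=sim.device)
    neg = sim.masked_fill(mask, float("-inf"))               # drop the positives
    return (torch.logsumexp(neg, dim=-1) - pos).mean()

def cloob_loss(image_emb, text_emb, beta=8.0, inv_tau=30.0):
    # Retrieve image and text states from both memories (the current batch) and
    # score two symmetric InfoLOOB terms; this pairing follows my reading of the
    # CLOOB paper and needs to be verified against the reference implementation.
    u_img = hopfield_retrieval(image_emb, image_emb, beta)   # image states, image memory
    u_txt = hopfield_retrieval(text_emb, image_emb, beta)    # text states, image memory
    v_img = hopfield_retrieval(image_emb, text_emb, beta)    # image states, text memory
    v_txt = hopfield_retrieval(text_emb, text_emb, beta)     # text states, text memory
    return 0.5 * (info_loob(u_img, u_txt, inv_tau) + info_loob(v_txt, v_img, inv_tau))
```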
