
What happens when we clip batch_size? #232

Open · RomainGoussault opened this issue Sep 11, 2020 · 2 comments
Labels: good first issue, help wanted, optimization

@RomainGoussault (Member)

The batch size is clipped here: https://github.com/SubstraFoundation/distributed-learning-contributivity/blob/b72fa98c0b4db45d368f577d0f6d1a861b1610c2/scenario.py#L584

When the batch size is clipped, we sometimes don't use all the data in the dataset, but we give no feedback to the user. We should detect when this happens and inform the user.

@bowni (Member) commented Sep 22, 2020

  • Update MAX_BATCH_SIZE (value to be determined)
  • Add a log when the clipping is triggered (log level INFO); see the sketch below
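
A minimal sketch of what such an INFO log could look like; the constant value, function name, and logger setup are assumptions for illustration, not the actual scenario.py code:

```python
import logging

logger = logging.getLogger(__name__)

# Placeholder value: the real MAX_BATCH_SIZE is still to be determined (first bullet above).
MAX_BATCH_SIZE = 256

def clip_batch_size(requested_batch_size):
    """Clip the batch size to MAX_BATCH_SIZE, logging at INFO level when clipping occurs."""
    if requested_batch_size > MAX_BATCH_SIZE:
        logger.info(
            "batch_size %d exceeds MAX_BATCH_SIZE %d and has been clipped; "
            "some data may not be used during training",
            requested_batch_size,
            MAX_BATCH_SIZE,
        )
        return MAX_BATCH_SIZE
    return requested_batch_size
```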

@bowni bowni added optimization An optimization of something that works good first issue Good for newcomers help wanted Extra attention is needed labels Sep 22, 2020
@bowni bowni self-assigned this Nov 5, 2020
@arthurPignet (Collaborator)

I think MAX_BATCH_SIZE should be related to the available GPU memory, so it will be dataset dependent.
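
A rough sketch of how such a memory-based ceiling could be derived; the function name, the per-sample footprint, and the safety factor are all assumptions, not part of the current code base:

```python
def estimate_max_batch_size(gpu_memory_bytes, bytes_per_sample, safety_factor=0.5):
    """Return a batch-size ceiling that should fit in the given GPU memory.

    gpu_memory_bytes: memory available for training on the device.
    bytes_per_sample: estimated memory footprint of one sample (input + activations).
    safety_factor: fraction of memory kept free for model weights, gradients
                   and framework overhead (assumed value).
    """
    usable = gpu_memory_bytes * safety_factor
    return max(1, int(usable // bytes_per_sample))

# Example: an 8 GB GPU with an assumed ~3 MB footprint per sample.
MAX_BATCH_SIZE = estimate_max_batch_size(8 * 1024**3, 3 * 1024**2)
```

Since the per-sample footprint depends on the dataset and model, this ceiling would need to be recomputed per scenario rather than hard-coded.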
