What happens when we clip batch_size? #232
Labels: good first issue, help wanted, optimization
The batch size is clipped here --> https://github.com/SubstraFoundation/distributed-learning-contributivity/blob/b72fa98c0b4db45d368f577d0f6d1a861b1610c2/scenario.py#L584
If the batch size gets clipped, it means we sometimes don't use all the data in the dataset, yet we give no feedback to the user. We should detect when this happens and warn the user in some way.
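A minimal sketch of what that feedback could look like, assuming the clipping boils down to something like `min(batch_size, max_batch_size)`. The function name, the `MAX_BATCH_SIZE` constant, and the logger setup below are hypothetical illustrations, not the repo's actual code around scenario.py#L584:

```python
import logging

logger = logging.getLogger(__name__)

# Hypothetical cap; the real limit is defined wherever scenario.py clips it.
MAX_BATCH_SIZE = 4096


def clip_batch_size(requested: int, maximum: int = MAX_BATCH_SIZE) -> int:
    """Return the batch size to use, warning the user when clipping occurs."""
    if requested > maximum:
        logger.warning(
            "batch_size %d exceeds the maximum of %d and has been clipped; "
            "as a result, part of the dataset may not be used each epoch.",
            requested,
            maximum,
        )
        return maximum
    return requested
```

With something like this in place, the clipping behavior is unchanged, but the user at least sees a warning in the logs whenever it kicks in.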