
If I am using a DataLoader in PyTorch and want to define something that needs the size of the current batch, how do I access it?

The issue with using my defined batch size (say, r) is this: suppose the dataset has 1009 samples but r = 100 (in a generic function). How do I ensure the last batch doesn't throw an error due to a dimension mismatch (100 vs. 9)?
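
A minimal sketch of one way to handle this (the dataset and tensor shapes below are hypothetical): rather than hard-coding r, read the size of the current batch from the first dimension of the batch tensor inside the loop, so the trailing batch of 9 is handled the same way as the full ones.

    import torch
    from torch.utils.data import DataLoader, TensorDataset

    # Hypothetical dataset: 1009 samples of 8 features, with r = 100.
    dataset = TensorDataset(torch.randn(1009, 8))
    loader = DataLoader(dataset, batch_size=100)

    for (x,) in loader:
        # Actual size of this batch: 100 for full batches, 9 for the last one.
        current_batch_size = x.size(0)
        # Build any size-dependent tensors from current_batch_size, not from
        # the fixed r, so the last batch never causes a dimension mismatch.
        ones = torch.ones(current_batch_size, 1)
        assert ones.size(0) == x.size(0)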

    As a side note, you can set the drop_last parameter when creating the DataLoader to drop the trailing batch with the non-full size; see link. (A usage sketch follows after this comment.)
    – jhso
    Commented Nov 28, 2023 at 23:18
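
For completeness, a short sketch of the drop_last option mentioned in the comment above (same hypothetical 1009-sample dataset): it simply discards the incomplete trailing batch each epoch, so every batch seen in the loop has the full size.

    import torch
    from torch.utils.data import DataLoader, TensorDataset

    dataset = TensorDataset(torch.randn(1009, 8))
    loader = DataLoader(dataset, batch_size=100, drop_last=True)

    # The trailing batch of 9 is dropped: exactly 10 full batches per epoch.
    assert len(loader) == 10
    assert all(x.size(0) == 100 for (x,) in loader)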

