If I am using a DataLoader in PyTorch and want to define something that needs the size of the current batch, how do I access it?

The issue I have with using my defined batch size (say, `r`) is this: suppose the dataset is 1009 samples long, but `r = 100` (in a generic function). How do I ensure that the last batch doesn't throw an error due to a dimension mismatch (100 vs. 9)?
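For concreteness, a minimal sketch of what I mean (the tensors and shapes here are just placeholders):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

dataset = TensorDataset(torch.randn(1009, 8))  # 1009 samples; feature dim is arbitrary
loader = DataLoader(dataset, batch_size=100)   # r = 100

# The actual size of each batch is the leading dimension of the tensor:
print([x.size(0) for (x,) in loader])
# [100, 100, 100, 100, 100, 100, 100, 100, 100, 100, 9]  <- trailing batch of 9
```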
Set `drop_last=True` when creating the DataLoader in order to drop the trailing batch with the non-matching size (see link).
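A minimal sketch of what that looks like (the dataset here is a hypothetical stand-in for yours):

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

dataset = TensorDataset(torch.randn(1009, 8))  # hypothetical 1009-sample dataset

# drop_last=True discards the final incomplete batch (the 9 leftover samples),
# so every batch yielded by the loader has exactly batch_size elements.
loader = DataLoader(dataset, batch_size=100, drop_last=True)

print([x.size(0) for (x,) in loader])
# [100, 100, 100, 100, 100, 100, 100, 100, 100, 100]  <- 10 full batches, 9 samples dropped
```

Alternatively, if you'd rather not lose those 9 samples, you can read the actual batch size inside the loop with `x.size(0)` instead of assuming it is always `r`.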