How can I shuffle the train indices in every epoch with BatchSampler(DistributedSampler)? #14462
Unanswered
xuanyang19 asked this question in code help: CV
How can I call set_epoch() on the DistributedSampler in PyTorch Lightning? Although I create the sampler with DistributedSampler(shuffle=True), the train indices are the same in every epoch. What should I do to make the train indices differ from epoch to epoch?

My dataloader:

DataLoader(self.trainset, sampler=BatchSampler(DistributedSampler(self.trainset, shuffle=True), batch_size=self.batch_size, drop_last=False), num_workers=self.num_workers)

Replies: 1 comment

- It's not required. It will be done automatically for you within the Lightning Trainer.
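For context, a minimal sketch of the pattern the reply describes, assuming a standard LightningModule; the names LitModel, self.trainset, self.batch_size and self.num_workers are placeholders, not code from the original post. The idea is to return a plain shuffled DataLoader and let the Trainer manage distributed sampling.

import pytorch_lightning as pl
from torch.utils.data import DataLoader

class LitModel(pl.LightningModule):
    def train_dataloader(self):
        # Return a plain shuffled DataLoader. Under a DDP strategy, and with the
        # default use_distributed_sampler=True (replace_sampler_ddp=True on
        # older releases), the Trainer swaps in a DistributedSampler and calls
        # sampler.set_epoch(epoch) at the start of every epoch, so the shuffle
        # order changes automatically without any manual bookkeeping.
        return DataLoader(
            self.trainset,
            batch_size=self.batch_size,
            shuffle=True,
            num_workers=self.num_workers,
        )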
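If you do want to keep a hand-rolled DistributedSampler (for example to pair it with a custom batching scheme), the sketch below shows one way to call set_epoch() yourself; the hook-based approach and the attribute names are assumptions rather than anything stated in the thread, and the Trainer's automatic sampler replacement would need to be turned off (use_distributed_sampler=False, or replace_sampler_ddp=False on older releases) so it does not overwrite your sampler.

import pytorch_lightning as pl
from torch.utils.data import DataLoader
from torch.utils.data.distributed import DistributedSampler

class LitModel(pl.LightningModule):
    def train_dataloader(self):
        # Keep a handle on the sampler so it can be re-seeded each epoch.
        self.train_sampler = DistributedSampler(self.trainset, shuffle=True)
        return DataLoader(
            self.trainset,
            sampler=self.train_sampler,
            batch_size=self.batch_size,
            num_workers=self.num_workers,
        )

    def on_train_epoch_start(self):
        # DistributedSampler seeds its shuffle with the epoch number; without
        # this call every epoch reuses the epoch-0 permutation, which is the
        # symptom described in the question.
        self.train_sampler.set_epoch(self.current_epoch)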