Using dataloader to sample with replacement in pytorch
stackoverflow.com › questions › 69681459 · Oct 22, 2021
During training, I would like to sample batches of m training samples, with replacement; e.g. the first iteration includes data indices [1, 5, 6], the second iteration includes data indices [12, 3, 5], and so on. So the total number of iterations is an input, rather than N/m. Is there a way to use DataLoader to handle this? If not ...
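One way to get this behavior (a sketch, not taken from the thread itself) is torch.utils.data.RandomSampler with replacement=True and an explicit num_samples, so the number of batches is decoupled from N/m. The dataset, m, and num_iterations below are placeholder assumptions:

```python
import torch
from torch.utils.data import DataLoader, RandomSampler, TensorDataset

# Toy dataset of N samples (placeholder values for illustration).
N = 100
dataset = TensorDataset(torch.randn(N, 8), torch.randint(0, 2, (N,)))

m = 3                 # batch size
num_iterations = 50   # desired number of batches, chosen independently of N/m

# With replacement=True, indices are drawn i.i.d. uniformly from the dataset,
# and num_samples fixes the total number of draws per pass over the loader.
sampler = RandomSampler(dataset, replacement=True, num_samples=num_iterations * m)
loader = DataLoader(dataset, batch_size=m, sampler=sampler)

for x, y in loader:
    ...  # len(loader) == num_iterations; indices may repeat across batches
```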
Sampling with replacement - PyTorch Forums
discuss.pytorch.org › t › sampling-with-replacement · Oct 03, 2018
I'm trying to work out whether the torch.utils.data.WeightedRandomSampler class will still cover all available data inputs, given a long enough training period, when sampling with replacement. Given that WeightedRandomSampler requires shuffle=False in the DataLoader, does that mean that WeightedRandomSampler will observe the entire sampling array (which is paired to the data thanks ...
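For reference, a minimal sketch of how such a setup is typically wired (the dataset and weights are assumptions, and this is not the forum's answer): WeightedRandomSampler is mutually exclusive with shuffle=True in the DataLoader, and with replacement=True every index keeps a non-zero chance of being drawn on each step, so full coverage is expected over many epochs but not guaranteed within any single one.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset, WeightedRandomSampler

# Placeholder dataset and per-sample weights (assumptions for illustration).
N = 100
dataset = TensorDataset(torch.randn(N, 4))
weights = torch.ones(N)  # uniform weights; any non-negative values work

# replacement=True: each of the num_samples draws picks an index in proportion
# to its weight, independently of previous draws, so repeats are possible.
sampler = WeightedRandomSampler(weights, num_samples=N, replacement=True)

# A DataLoader accepts either shuffle=True or a custom sampler, never both,
# so shuffle is left at its default (False) here.
loader = DataLoader(dataset, batch_size=10, sampler=sampler)
```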