Start dataloader at specific batch_idx - PyTorch Forums
discuss.pytorch.org › t › start-dataloader-at
Apr 03, 2018 · I would like to start my data loader at a specific batch_idx, so that I can continue training from the exact batch_idx where it stopped or crashed. I don't use shuffling, so it should be possible. The only solution I came up with is the naive one: running through the for loop until I get to where I want:

    start_batch_idx, ... = load_saved_training()
    for batch_idx, (data, target) in ...
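The "naive loop" the poster describes can be written more compactly with `itertools.islice`, which advances an iterator past the first N items. A minimal sketch, using a plain list of hypothetical `(data, target)` batches in place of a real `DataLoader` (with a real loader you would pass `iter(loader)` to `islice` the same way; `load_saved_training` is the poster's own unspecified helper):

    import itertools

    # Stand-in for a DataLoader: an iterable of (data, target) batches.
    # (Hypothetical example data, 5 batches of size 2.)
    batches = [([2 * i, 2 * i + 1], [2 * i, 2 * i + 1]) for i in range(5)]

    start_batch_idx = 3  # e.g. recovered from a saved checkpoint

    # islice skips the first `start_batch_idx` batches for us.
    # Note: the skipped batches are still drawn from the underlying
    # iterator (workers would still load them); this only tidies up
    # the manual for-loop, it does not avoid the loading cost.
    resumed = itertools.islice(iter(batches), start_batch_idx, None)

    for batch_idx, (data, target) in enumerate(resumed, start=start_batch_idx):
        print(batch_idx, target)
    # prints: 3 [6, 7]
    #         4 [8, 9]

To truly avoid loading the skipped batches, the index-skipping has to happen at the sampler level rather than the iterator level, as sketched further below.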
Resume iterating dataloader from checkpoint batch_idx ...
discuss.pytorch.org › t › resume-iterating
Nov 12, 2019 · Hi, I was wondering whether it is possible to resume iterating through a dataloader from a checkpoint. For example:

    dataloaders_dict = {
        phase: torch.utils.data.DataLoader(
            datasets_dict[phase],
            batch_size=args.batch_size,
            num_workers=args.num_workers,
            shuffle=False,  # make sure shuffling is off in case you restart
        )
        for phase in ['train']
    }
    if os.path.isdir(args.batchidx_checkpoint):
        checkpoint ...
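One way to resume without loading the skipped batches at all is a custom `Sampler` that starts sequential iteration at an offset. A minimal sketch under assumed names (`SkipFirstSampler` is hypothetical, not a PyTorch class; the resume index would come from wherever the checkpoint is stored, e.g. the poster's `args.batchidx_checkpoint`):

    import torch
    from torch.utils.data import DataLoader, Sampler, TensorDataset

    class SkipFirstSampler(Sampler):
        """Sequential sampler that skips the first `skip` samples.

        Unlike looping over the DataLoader and discarding batches,
        the skipped samples are never fetched from the dataset.
        """
        def __init__(self, data_source, skip=0):
            self.data_source = data_source
            self.skip = skip

        def __iter__(self):
            return iter(range(self.skip, len(self.data_source)))

        def __len__(self):
            return len(self.data_source) - self.skip

    # Hypothetical toy dataset: 10 samples, batches of 2.
    dataset = TensorDataset(torch.arange(10).float().unsqueeze(1), torch.arange(10))
    batch_size = 2
    resume_batch_idx = 3  # assumed to be read back from the checkpoint

    sampler = SkipFirstSampler(dataset, skip=resume_batch_idx * batch_size)
    loader = DataLoader(dataset, batch_size=batch_size, sampler=sampler)

    for offset, (data, target) in enumerate(loader):
        batch_idx = resume_batch_idx + offset
        print(batch_idx, target.tolist())
    # prints: 3 [6, 7]
    #         4 [8, 9]

Note that `sampler` is mutually exclusive with `shuffle=True`, which matches the posters' constraint of unshuffled data; with shuffling you would additionally need to restore the RNG state so the resumed epoch replays the same permutation.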