no_grad — PyTorch 1.10.1 documentation
Context-manager that disables gradient calculation. Disabling gradient calculation is useful for inference, when you are sure that you will not call Tensor.backward(). It will reduce memory consumption for computations that would otherwise have requires_grad=True. In this mode, the result of every computation will have requires_grad=False, even when the inputs have requires_grad=True.
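A minimal sketch of the behavior described above, assuming a standard PyTorch install: inside the torch.no_grad() block, results have requires_grad=False even though the input tensor has requires_grad=True.

```python
import torch

x = torch.ones(3, requires_grad=True)

# Inside no_grad, computations are not tracked by autograd
with torch.no_grad():
    y = x * 2

# Outside the block, gradient tracking resumes as usual
z = x * 2

print(y.requires_grad)  # False
print(z.requires_grad)  # True
```

torch.no_grad can also be used as a decorator on a function whose whole body should run without gradient tracking.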
set_grad_enabled — PyTorch 1.10.1 documentation
set_grad_enabled will enable or disable grads based on its argument mode. It can be used as a context-manager or as a function. This context manager is thread-local; it will not affect computation in other threads.

Parameters: mode (bool) – Flag whether to enable grad (True), or disable (False). This can be used to conditionally enable gradients.
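A short sketch of both usages, assuming a standard PyTorch install: as a context-manager, the mode flag can be computed at runtime (e.g. from a training/eval switch); as a plain function call, it changes the mode for subsequent computations in the current thread.

```python
import torch

x = torch.tensor([1.0], requires_grad=True)
is_train = False  # hypothetical switch, e.g. training vs. inference

# As a context-manager: gradients enabled only when is_train is True
with torch.set_grad_enabled(is_train):
    y = x * 2

print(y.requires_grad)  # False

# As a function: sets the grad mode for the current thread going forward
torch.set_grad_enabled(True)
z = x * 2

print(z.requires_grad)  # True
```

The function form is handy in training loops, but remember to restore the previous mode; the context-manager form does that automatically on exit.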