Adam+Half Precision = NaNs? - PyTorch Forums
discuss.pytorch.org › t › adam-half-precision-nans
Apr 09, 2017 · Hi guys, I’ve been running into the sudden appearance of NaNs when I attempt to train using Adam and half (float16) precision; my nets train just fine in half precision with SGD + Nesterov momentum, and they train just fine in single precision (float32) with Adam, but switching them over to half seems to cause numerical instability. I’ve fiddled with the hyperparams a bit; upping epsilon ...
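One commonly cited culprit in threads like this is Adam's default eps of 1e-8, which underflows to zero in float16 (whose smallest subnormal is about 6e-8), so the stabilizing denominator term vanishes; raising eps helps, but the now-standard fix is automatic mixed precision, which keeps the weights and Adam's moment estimates in float32 and scales the loss to avoid float16 gradient underflow. Below is a minimal sketch using torch.cuda.amp (available since PyTorch 1.6); the model, data, and hyperparameters are placeholder assumptions, not taken from the thread.

```python
import torch
import torch.nn as nn

# Minimal mixed-precision training sketch: master weights and Adam's
# optimizer state stay in float32, while forward ops run in float16
# where safe. Assumes a CUDA device; model and batch are toy placeholders.
device = torch.device("cuda")
model = nn.Linear(128, 10).to(device)        # stays in float32
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
scaler = torch.cuda.amp.GradScaler()
loss_fn = nn.CrossEntropyLoss()

for step in range(100):
    x = torch.randn(32, 128, device=device)              # dummy batch
    y = torch.randint(0, 10, (32,), device=device)       # dummy labels
    optimizer.zero_grad()
    with torch.cuda.amp.autocast():          # ops autocast to float16 where safe
        loss = loss_fn(model(x), y)
    scaler.scale(loss).backward()            # scale loss to avoid grad underflow
    scaler.step(optimizer)                   # unscales grads; skips step on inf/NaN
    scaler.update()                          # adapts the scale factor over time
```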
torch.Tensor.half — PyTorch 1.10.1 documentation
pytorch.org › generated › torch.Tensor.half
Tensor.half(memory_format=torch.preserve_format) → Tensor — self.half() is equivalent to self.to(torch.float16).
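For context, a quick illustration of the cast (the tensor here is a toy example, not from the docs page):

```python
import torch

x = torch.randn(4, 4)   # float32 by default
h = x.half()            # equivalent to x.to(torch.float16)
print(h.dtype)          # torch.float16
```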
Using half precision - autograd - PyTorch Forums
discuss.pytorch.org › t › using-half-precision
Sep 03, 2020 · Hi, I am new to using half precision for tensors in PyTorch, so I had a very basic question: is it possible for my neural network model to have some variables as half tensors and some as normal full-precision tensors? Basically, my model is taking too much memory, so instead of decreasing the batch size I wanted to check if it’s possible to make some variables half-precision ...
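Submodules can indeed be cast individually. Here is a minimal sketch under assumed conditions (a CUDA device; the layer names and sizes are illustrative): one memory-heavy layer is kept in float16 while the rest stays float32, with explicit casts at the dtype boundaries. float16 matmuls are best supported on CUDA; CPU support varies by PyTorch version.

```python
import torch
import torch.nn as nn

class MixedPrecisionNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.big = nn.Linear(1024, 1024).half()  # memory-heavy layer in float16
        self.head = nn.Linear(1024, 10)          # kept in float32 for stability

    def forward(self, x):
        x = self.big(x.half())        # cast activations down to match fp16 weights
        return self.head(x.float())   # cast back up before the float32 layer

device = torch.device("cuda")
model = MixedPrecisionNet().to(device)  # .to(device) preserves each submodule's dtype
out = model(torch.randn(2, 1024, device=device))
print(out.dtype)  # torch.float32
```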