Loading n-dimensional tensor for pytorch from text file
https://stackoverflow.com/questions/64913854 · 19.11.2020 · I have a 3-dimensional tensor that I create outside of my Python code and can encode in any human-readable format. I need to load the values of this tensor as a frozen layer into my PyTorch NN. I've tried encoding the tensor as a text file of the form [[[a,b],[c,d]], [[e,f],[g,h]], [[k,l],[m,n]]], which seemed like the most logical way to do it.
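A minimal sketch of one way to do what the question asks, assuming the text file contains numeric literals in the nested-list format shown above (the file name tensor.txt and the FrozenLayer class are hypothetical): parse the file with ast.literal_eval, convert it to a tensor, and register it as a non-trainable parameter.

```python
import ast

import torch
import torch.nn as nn

# Parse the human-readable nested-list format, e.g.
# [[[1.0, 2.0], [3.0, 4.0]], [[5.0, 6.0], [7.0, 8.0]], [[9.0, 10.0], [11.0, 12.0]]]
with open("tensor.txt") as f:              # hypothetical file name
    data = ast.literal_eval(f.read())

weights = torch.tensor(data, dtype=torch.float32)   # shape (3, 2, 2) for the example above


class FrozenLayer(nn.Module):
    """Hypothetical layer that holds the loaded tensor as fixed weights."""

    def __init__(self, weights: torch.Tensor):
        super().__init__()
        # requires_grad=False keeps the values frozen during training
        self.weight = nn.Parameter(weights, requires_grad=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Example use: contract the last dimension of x with the first
        # dimension of the frozen tensor, (..., 3) x (3, 2, 2) -> (..., 2, 2)
        return torch.tensordot(x, self.weight, dims=1)


layer = FrozenLayer(weights)
```

An alternative with the same effect is to register the tensor as a buffer (self.register_buffer("weight", weights)), which keeps it out of model.parameters() entirely while still moving it along with the module on .to()/.cuda().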
torch.Tensor — PyTorch 1.10.1 documentation
https://pytorch.org/docs/stable/tensors.html · torch.ByteTensor. 1. Sometimes referred to as binary16: uses 1 sign, 5 exponent, and 10 significand bits. Useful when precision is important at the expense of range. 2. Sometimes referred to as Brain Floating Point: uses 1 sign, 8 exponent, and 7 significand bits. Useful when range is important, since it has the same number of exponent bits ...
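The footnotes above describe torch.float16 (binary16) and torch.bfloat16. A short sketch of the practical difference between the two half-precision dtypes, assuming an ordinary CPU build of PyTorch:

```python
import torch

# binary16: 1 sign, 5 exponent, 10 significand bits -> more precision, small range
x16 = torch.tensor([1.0, 2.0, 3.0], dtype=torch.float16)

# bfloat16: 1 sign, 8 exponent, 7 significand bits -> float32-like range, less precision
xbf = torch.tensor([1.0, 2.0, 3.0], dtype=torch.bfloat16)

print(torch.finfo(torch.float16).max)    # 65504.0
print(torch.finfo(torch.bfloat16).max)   # ~3.39e38, same order of magnitude as float32
```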