python - Embedding in PyTorch creates embedding with norm ...
stackoverflow.com › questions › 66262652 · Feb 18, 2021
This works by dividing each weight in the embedding vector by the norm of the embedding vector itself, then multiplying it by max_norm. In your example max_norm=1, so it is equivalent to dividing by the norm. To answer the question you asked in the comment: you can obtain the embedding of a sentence (a vector containing word indexes taken from your dictionary) with embedding(sentences), and its norm using the two for loops above.
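A minimal sketch of that renormalization; the vocabulary size, dimension, and sentence indexes below are illustrative assumptions, not values from the question:

import torch
import torch.nn as nn

torch.manual_seed(0)

# With max_norm=1.0, any embedding vector whose L2 norm exceeds 1
# is rescaled to norm 1 at lookup time.
embedding = nn.Embedding(10, 4, max_norm=1.0)

sentence = torch.tensor([3, 7, 1])  # word indexes from a hypothetical dictionary
vectors = embedding(sentence)       # lookup renormalizes rows with norm > max_norm

print(vectors.norm(dim=1))          # every printed norm is <= 1.0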
Embedding — PyTorch 1.10.1 documentation
pytorch.org › generated › torch

import torch
import torch.nn as nn

n, d, m = 3, 5, 7
embedding = nn.Embedding(n, d, max_norm=True)  # True behaves as max_norm=1.0
W = torch.randn((m, d), requires_grad=True)
idx = torch.tensor([1, 2])
a = embedding.weight.clone() @ W.t()  # weight must be cloned for this to be differentiable
b = embedding(idx) @ W.t()            # modifies weight in-place
out = (a.unsqueeze(0) + b.unsqueeze(1))
loss = out.sigmoid().prod()
loss.backward()
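Because max_norm is set, the forward call embedding(idx) renormalizes the looked-up rows of embedding.weight in place. Tensors needed for gradient computation cannot be modified in place, which is why the example clones embedding.weight before using it directly in the same differentiable computation.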