Embedding — PyTorch 1.10.1 documentation
A simple lookup table that stores embeddings of a fixed dictionary and size. This module is often used to store word embeddings and retrieve them using indices. The input to the module is a list of indices, and the output is the corresponding word embeddings. Parameters: num_embeddings (int) – size of the dictionary of embeddings; embedding_dim (int) – size of each embedding vector.
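A minimal sketch of the lookup described above; the sizes (a 10-entry table of 3-dimensional vectors) are arbitrary choices for illustration:

```python
import torch
import torch.nn as nn

# A lookup table with 10 entries, each a 3-dimensional vector.
embedding = nn.Embedding(num_embeddings=10, embedding_dim=3)

# The input is a tensor of indices; the output is the corresponding rows
# of the embedding table.
indices = torch.tensor([1, 4, 4, 9])
vectors = embedding(indices)
print(vectors.shape)  # torch.Size([4, 3])
```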
CS224N_PyTorch_Tutorial - stanford.edu
Now that we have an index for each word in our vocabulary, we can create an embedding table with the nn.Embedding class in PyTorch. It is constructed as nn.Embedding(num_words, embedding_dimension), where num_words is the number of words in our vocabulary and embedding_dimension is the dimension of the embeddings we want to have.
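A short sketch of that setup; the toy vocabulary and the choice of embedding_dimension = 5 are assumptions for illustration, not taken from the tutorial:

```python
import torch
import torch.nn as nn

# Hypothetical toy vocabulary; in practice the word-to-index mapping
# is built from the training corpus.
vocab = ["hello", "world", "cs224n"]
word_to_ix = {word: ix for ix, word in enumerate(vocab)}

num_words = len(vocab)       # number of words in the vocabulary
embedding_dimension = 5      # chosen dimension of the embeddings

table = nn.Embedding(num_words, embedding_dimension)

# Look up the embedding for a single word via its index.
ix = torch.tensor([word_to_ix["world"]])
print(table(ix).shape)  # torch.Size([1, 5])
```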