How to add padding mask to nn.TransformerEncoder module?
https://discuss.pytorch.org/t/how-to-add-padding-mask-to-nn... · Dec 08, 2019

I think, when using src_mask, we need to provide a matrix of shape (S, S), where S is our source sequence length. For example:

```python
import torch
import torch.nn as nn

q = torch.randn(3, 1, 10)            # source sequence length 3, batch size 1, embedding size 10
attn = nn.MultiheadAttention(10, 1)  # embedding size 10, one head
attn(q, q, q)                        # self-attention
```
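The question itself is about padding, which `nn.TransformerEncoder` handles through a separate argument, `src_key_padding_mask`, rather than the `(S, S)` `src_mask`. The padding mask has shape `(N, S)` (batch size by sequence length), with `True` marking positions that attention should ignore. A minimal sketch, where the layer sizes, sequence lengths, and padding pattern are illustrative assumptions:

```python
import torch
import torch.nn as nn

# Assumed setup: batch of 2 sequences, max length 5, embedding size 10.
S, N, E = 5, 2, 10
src = torch.randn(S, N, E)  # (seq_len, batch, embed) -- the default layout

encoder_layer = nn.TransformerEncoderLayer(d_model=E, nhead=2)
encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)

# src_key_padding_mask has shape (N, S); True marks padded positions.
# Here the second sequence in the batch has 2 padded time steps.
padding_mask = torch.tensor([
    [False, False, False, False, False],  # no padding
    [False, False, False, True,  True ],  # last two positions are padding
])

out = encoder(src, src_key_padding_mask=padding_mask)
print(out.shape)  # torch.Size([5, 2, 10])
```

The output keeps the input shape; the mask only prevents the padded key positions from contributing to the attention weights of every query.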