Batch size for LSTM - PyTorch Forums
https://discuss.pytorch.org/t/batch-size-for-lstm/47619 · Jun 11, 2019

I am working on an encoder that uses an LSTM.

```python
def init_hidden(self, batch):
    '''
    Initialize the encoder (LSTM) with the number of layers,
    batch size, and hidden-layer dimension.
    :param batch: batch size
    :return: tuple of initial (hidden state, cell state)
    '''
    return (
        torch.zeros(self.num_layers, batch, self.h_dim).cuda(),
        torch.zeros(self.num_layers, batch, self.h_dim).cuda(),
    )
```

This is the code to initialize the LSTM; what does the …
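For context, here is a minimal, self-contained sketch of how such an `init_hidden` method is typically wired into an `nn.LSTM`-based encoder. The `Encoder` class, its constructor arguments, and the shapes in the demo are my own assumptions, not from the thread; `.cuda()` is dropped so the sketch runs on CPU.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    # Hypothetical encoder; only init_hidden comes from the thread's excerpt.
    def __init__(self, input_dim, h_dim, num_layers):
        super().__init__()
        self.h_dim = h_dim
        self.num_layers = num_layers
        self.lstm = nn.LSTM(input_dim, h_dim, num_layers, batch_first=True)

    def init_hidden(self, batch):
        # h_0 and c_0 both have shape (num_layers, batch, h_dim);
        # this shape is independent of batch_first.
        return (
            torch.zeros(self.num_layers, batch, self.h_dim),
            torch.zeros(self.num_layers, batch, self.h_dim),
        )

    def forward(self, x):                      # x: (batch, seq_len, input_dim)
        h0, c0 = self.init_hidden(x.size(0))   # fresh zero state per forward pass
        out, (hn, cn) = self.lstm(x, (h0, c0))
        return out, (hn, cn)

enc = Encoder(input_dim=384, h_dim=128, num_layers=2)
out, _ = enc(torch.randn(8, 20, 384))          # batch of 8 sequences, length 20
print(out.shape)                               # torch.Size([8, 20, 128])
```

Note that passing an explicit all-zero `(h0, c0)` is equivalent to passing no initial state at all, since `nn.LSTM` defaults to zeros; an explicit `init_hidden` mainly matters when you want to carry state across calls or initialize it differently.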
LSTM batch size vs sequence length - PyTorch Forums
discuss.pytorch.org › t › lstm-batch-size-vs… · Jul 16, 2021

I am new to PyTorch and am trying to do some time-series prediction for speech. The dimension of each datapoint is 384. What I am confused about is whether the memory of the LSTM is separate for each sequence in the batch, or whether the batch is basically treated as one long sequence. In other words, in a simplified example, suppose that the input to our LSTM is (batch size, sequence length …
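Per `nn.LSTM` semantics, each sequence in the batch carries its own hidden and cell state; the batch is not treated as one long sequence. A quick numerical check (my own sketch, not from the thread): processing two sequences together in one batch gives the same outputs as processing each one alone.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

lstm = nn.LSTM(input_size=384, hidden_size=64, batch_first=True)

a = torch.randn(1, 10, 384)   # sequence A: (batch=1, seq_len=10, features=384)
b = torch.randn(1, 10, 384)   # sequence B, same shape

out_batched, _ = lstm(torch.cat([a, b], dim=0))   # batch of 2, zero initial state
out_a, _ = lstm(a)                                # each sequence on its own
out_b, _ = lstm(b)

# The batched rows match the individual runs, so the hidden state is
# per-sequence and never leaks across batch elements.
print(torch.allclose(out_batched[0], out_a[0], atol=1e-6))  # True
print(torch.allclose(out_batched[1], out_b[0], atol=1e-6))  # True
```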