Seq2Seq with Attention and Beam Search
Seq2Seq with Attention

The previous model has been refined over the past few years and has greatly benefited from what is known as attention. Attention is a mechanism that forces the model to learn to focus (= to attend) on specific parts of the input sequence when decoding, instead of relying only on the hidden vector of the decoder's LSTM.
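To make the idea concrete, here is a minimal sketch of one attention step in NumPy. The names (`attention_context`, `softmax`) and the dot-product scoring are illustrative assumptions rather than the post's exact formulation: at each decoding step, the previous decoder state scores every encoder hidden state, the scores are normalized with a softmax, and the resulting weights form a context vector as a weighted sum of the encoder states.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array of scores.
    z = x - x.max()
    e = np.exp(z)
    return e / e.sum()

def attention_context(decoder_state, encoder_states):
    """One attention step (a sketch, assuming dot-product scoring).

    decoder_state:  (d,)   previous decoder hidden state
    encoder_states: (T, d) encoder hidden states for the T source positions

    Returns the context vector (d,) and the attention weights (T,).
    """
    # Score each source position against the current decoder state.
    scores = encoder_states @ decoder_state      # shape (T,)
    # Normalize the scores into a distribution over source positions.
    weights = softmax(scores)                    # shape (T,), sums to 1
    # Context vector: weighted sum of encoder states.
    context = weights @ encoder_states           # shape (d,)
    return context, weights

# Toy usage with hypothetical sizes: 5 source positions, hidden size 4.
enc = np.random.randn(5, 4)
h = np.random.randn(4)
c, w = attention_context(h, enc)
```

In the full model, this context vector is then combined with the decoder state to compute the next hidden state and predict the next output token, so the decoder can draw on different parts of the source sentence at each step. Note that dot-product scoring is only one common choice; a small learned feed-forward scorer is another.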