seq2seq-pytorch · GitHub Topics
github.com › topics › seq2seq-pytorch
A PyTorch implementation of the hierarchical encoder-decoder architecture (HRED) introduced in Sordoni et al. (2015). HRED models conversation triples, and this version of the model is built specifically for the MovieTriples dataset.
Topics: nlp · deep-learning · pytorch · hred · seq2seq-pytorch
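The hierarchy described above — an utterance-level encoder feeding a context-level encoder, whose state conditions the decoder — can be sketched in PyTorch as follows. This is a minimal illustration with made-up dimensions and module names, not code from the repository:

```python
import torch
import torch.nn as nn

class HRED(nn.Module):
    """Minimal hierarchical encoder-decoder sketch (dimensions are illustrative)."""

    def __init__(self, vocab_size=100, emb=32, hid=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb)
        # Utterance-level encoder: one GRU pass per utterance.
        self.utt_enc = nn.GRU(emb, hid, batch_first=True)
        # Context-level encoder: runs over the sequence of utterance vectors.
        self.ctx_enc = nn.GRU(hid, hid, batch_first=True)
        self.decoder = nn.GRU(emb, hid, batch_first=True)
        self.out = nn.Linear(hid, vocab_size)

    def forward(self, context, target):
        # context: (batch, n_utts, seq_len) token ids; target: (batch, tgt_len)
        b, n, t = context.shape
        # Encode every utterance independently.
        _, h = self.utt_enc(self.embed(context.view(b * n, t)))
        utt_vecs = h.squeeze(0).view(b, n, -1)
        # Encode the sequence of utterance vectors into a dialog context state.
        _, ctx = self.ctx_enc(utt_vecs)
        # Decode the response conditioned on that context state.
        dec_out, _ = self.decoder(self.embed(target), ctx)
        return self.out(dec_out)  # (batch, tgt_len, vocab_size)

model = HRED()
context = torch.randint(0, 100, (2, 2, 5))  # two context utterances per example
target = torch.randint(0, 100, (2, 6))
logits = model(context, target)
```

For a MovieTriples-style triple (utterance A, utterance B, response), the first two utterances form the context and the third is the decoding target.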
GitHub - thu-coai/seq2seq-pytorch-bert
github.com › thu-coai › seq2seq-pytorch-bert
Seq2Seq-BERT -- a PyTorch implementation. Seq2seq with an attention mechanism is a basic model for single-turn dialog. In addition, batch normalization and dropout have been applied. When decoding, you can choose among beam search, greedy decoding, random sampling, and random sampling from the top k. BERT is a widely used pretrained language model; we use it as the encoder.
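The decoding options listed above differ only in how the next token is chosen from the decoder's output distribution at each step. A minimal sketch of three of them (greedy, random sampling, top-k sampling) over a single step of logits; the logits here are made up for illustration, and this is not code from the repository:

```python
import torch

def greedy(logits):
    """Pick the single most likely token."""
    return int(torch.argmax(logits))

def sample(logits):
    """Sample from the full softmax distribution."""
    probs = torch.softmax(logits, dim=-1)
    return int(torch.multinomial(probs, 1))

def top_k_sample(logits, k=3):
    """Restrict sampling to the k highest-scoring tokens."""
    vals, idx = torch.topk(logits, k)
    probs = torch.softmax(vals, dim=-1)
    return int(idx[torch.multinomial(probs, 1)])

# One decoding step over a toy vocabulary of four tokens.
step_logits = torch.tensor([0.1, 2.0, 0.3, 5.0])
print(greedy(step_logits))  # index of the largest logit
```

Beam search, by contrast, keeps several partial hypotheses alive across steps and so cannot be expressed as a single per-step choice like the functions above.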