Jun 22, 2017 · batch_size: the size of the batches used during training; the model construction is independent of batch_size, so it can be changed after initialization if this is convenient, e.g., for decoding. learning_rate: learning rate to start with.
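The point that batch_size need not be baked into the graph can be shown with a small sketch (this is not the tutorial's code; the shapes, layer, and optimizer are illustrative assumptions): leaving the batch dimension as None lets the same model train with one batch size and decode with another.

```python
# Illustrative sketch only: a TF 1.x-style graph whose batch dimension is left
# as None, so batch_size can differ between training and decoding.
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

inputs = tf.placeholder(tf.float32, shape=[None, 128])        # batch dim unspecified
labels = tf.placeholder(tf.int32, shape=[None])
logits = tf.layers.dense(inputs, units=10)                    # model is batch-size agnostic

learning_rate = tf.placeholder_with_default(0.5, shape=[])    # "learning rate to start with"
loss = tf.reduce_mean(
    tf.nn.sparse_softmax_cross_entropy_with_logits(labels=labels, logits=logits))
train_op = tf.train.GradientDescentOptimizer(learning_rate).minimize(loss)
```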
Jun 21, 2017 · As I have already commented here, the model you are trying to implement is deprecated. If you want to make it work, check the code I've pasted in the issue. Starting from TensorFlow 1.1 and 1.2 you have functions for dynamic decoding such as tf.nn.bidirectional_dynamic_rnn. It allows you to take dynamically sized sequences into account for …
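A minimal sketch of a dynamic-length bidirectional encoder with that TF 1.x API (cell sizes and input shapes here are assumptions, not taken from the original issue):

```python
# Sketch of dynamic-length bidirectional encoding (TF 1.x-style API).
# sequence_length tells the RNN where each padded sequence really ends.
import tensorflow.compat.v1 as tf
tf.disable_eager_execution()

inputs = tf.placeholder(tf.float32, shape=[None, None, 64])   # [batch, time, features]
seq_len = tf.placeholder(tf.int32, shape=[None])              # true length per sequence

cell_fw = tf.nn.rnn_cell.LSTMCell(128)
cell_bw = tf.nn.rnn_cell.LSTMCell(128)
(out_fw, out_bw), final_states = tf.nn.bidirectional_dynamic_rnn(
    cell_fw, cell_bw, inputs, sequence_length=seq_len, dtype=tf.float32)
encoder_outputs = tf.concat([out_fw, out_bw], axis=-1)        # [batch, time, 256]
```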
Nov 30, 2020 · AttributeError: 'NoneType' object has no attribute 'from_pretrained' (#8864). ... I guess in your situation it has to do with prepare_seq2seq_batch:
Jul 27, 2020 · After thinking a bit more about this and talking to @sshleifer, I am fine with the PR. I agree now that there are a lot of use cases where prepare_seq2seq_batch is used, and since it's going to be added to each model specifically, it's clean as well. Also, since it will replace prepare_translation_batch, it does not really increase function exposure.
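For context, a sketch of the call this PR standardizes, next to the method it replaces (the checkpoint name is just an illustrative Marian model, and this reflects the transformers 3.x/4.0-era API discussed in these threads):

```python
# Sketch of the rename: MarianTokenizer.prepare_translation_batch was replaced
# by the model-agnostic prepare_seq2seq_batch (transformers ~3.x/4.0 era).
from transformers import MarianMTModel, MarianTokenizer

mname = "Helsinki-NLP/opus-mt-en-de"          # illustrative checkpoint
tokenizer = MarianTokenizer.from_pretrained(mname)
model = MarianMTModel.from_pretrained(mname)

# old (removed): batch = tokenizer.prepare_translation_batch(["Hello world"])
batch = tokenizer.prepare_seq2seq_batch(["Hello world"], return_tensors="pt")
generated = model.generate(**batch)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```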
Jun 01, 2021 · I am trying to use XLNet through transformers. However, I keep getting the error "AttributeError: 'NoneType' object has no attribute 'tokenize'". I am unsure of how to proceed. If anyone could point me in the right direction, it would be appreciated.
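A guess at the usual cause (an assumption based on similar reports, not a confirmed diagnosis for this exact setup): the tokenizer object ends up as None because the sentencepiece dependency that XLNet's tokenizer needs is not installed, so there is no .tokenize method to call.

```python
# Hedged sketch: if transformers could not load the sentencepiece-backed
# tokenizer, the object you end up with may be None. Installing sentencepiece
# (pip install sentencepiece) and reloading usually gives a real tokenizer.
from transformers import XLNetTokenizer

tokenizer = XLNetTokenizer.from_pretrained("xlnet-base-cased")
assert tokenizer is not None, "tokenizer failed to load; is sentencepiece installed?"
print(tokenizer.tokenize("Hello world"))
```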
Aug 28, 2020 · AttributeError: 'MarianTokenizer' object has no attribute 'prepare_translation_batch'. Problem 1: When I load the tokenizer from a local directory and use it according to the MarianMT tutorial. Problem 2: When I downloa...
Aug 7, 2021 · HuggingFace, 'NoneType' object has no attribute 'prepare_seq2seq_batch'. I'm trying to execute this Python code in order to translate a sentence using HuggingFace transformers: from transformers import MarianTokenizer, MarianMTModel; mname = "marefa-nlp ...
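One way to sidestep both of these missing-attribute errors is sketched below (assumptions: a recent transformers version, and a stand-in Helsinki-NLP checkpoint, since the original question's model id is truncated above): call the tokenizer directly instead of the batch-preparation helpers, and check that the tokenizer actually loaded.

```python
# Sketch: on recent transformers versions the batch-preparation helpers are
# deprecated or gone, but calling the tokenizer directly works for translation.
# The checkpoint below is a stand-in for the truncated model id above.
from transformers import MarianMTModel, MarianTokenizer

mname = "Helsinki-NLP/opus-mt-en-fr"                 # illustrative checkpoint
tokenizer = MarianTokenizer.from_pretrained(mname)
model = MarianMTModel.from_pretrained(mname)

if tokenizer is None:                                # a None tokenizer is what produces
    raise RuntimeError("tokenizer did not load")     # the 'NoneType' AttributeError above

batch = tokenizer(["I love programming."], return_tensors="pt", padding=True)
generated = model.generate(**batch)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```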
Nov 30, 2020 · Yes, this was a bug. Tokenizers are framework-agnostic and should not output a specific framework's tensor. The implementation of the Marian tokenizer was not respecting the API in that regard.
I had the same issue with the latest transformers 4.1 (pip installed). ... "AttributeError: 'list' object has no attribute 'to'". Please help. Thanks, Akila.
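A sketch of the mismatch both replies describe (assuming typical usage): without return_tensors the tokenizer yields plain Python lists, which have no .to() method, while return_tensors="pt" yields PyTorch tensors that can be moved to a device and passed to generate().

```python
# Sketch: tokenizers return framework-agnostic lists unless you ask for tensors.
from transformers import MarianTokenizer

tokenizer = MarianTokenizer.from_pretrained("Helsinki-NLP/opus-mt-en-de")  # illustrative

lists = tokenizer(["Hello world"])                          # input_ids is a list of lists
# lists["input_ids"].to("cuda")  -> AttributeError: 'list' object has no attribute 'to'

tensors = tokenizer(["Hello world"], return_tensors="pt")   # PyTorch tensors
batch_on_device = tensors.to("cpu")                         # BatchEncoding.to works on tensors
```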