Closed — ahmed-aleroud closed this issue 2 years ago
Salam alaikum ya Ahmed, thanks for your interest in our work. I believe this is because you may not have used the same dependencies described in the README and requirements files. I suggest you check those files and try the code again. I also noticed that you are using Python 3.9, whereas I used 3.6! Hope this helps, Hassan
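Hassan's suggestion (match the pinned dependencies and the Python version) can be checked mechanically before rerunning. A minimal sketch, assuming a `requirements.txt` with exact `==` pins; the helper name is hypothetical and not part of the SpanEmo repo:

```python
import sys
from importlib import metadata

def check_pinned_requirements(path="requirements.txt"):
    """Return (package, pinned, installed) for every exact pin whose
    installed version differs, or which is not installed at all."""
    mismatches = []
    for raw in open(path, encoding="utf-8"):
        line = raw.split("#", 1)[0].strip()  # drop comments/blank lines
        if "==" not in line:
            continue  # this sketch only checks exact pins
        name, pinned = (part.strip() for part in line.split("==", 1))
        try:
            installed = metadata.version(name)
        except metadata.PackageNotFoundError:
            installed = None
        if installed != pinned:
            mismatches.append((name, pinned, installed))
    return mismatches

# The thread also hinges on the interpreter itself (3.6 vs 3.9):
print("Running on Python", ".".join(map(str, sys.version_info[:3])))
```

Running this before training makes "same dependencies as the README" a concrete, checkable claim rather than a guess.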
Salam Hassan
I was trying to run the code from your SpanEmo paper (interesting work, by the way). However, I am not sure why, when I run it on an Arabic dataset, it gives me the following errors and warnings:
`AttributeError: 'NoneType' object has no attribute 'update'`
Details are below
It looks like it is related to fastprogress, but I have tried every possibility I could think of, including upgrading and downgrading several libraries.
Any help is appreciated
Thanks
```
/usr/local/lib/python3.9/site-packages/google/colab/data_table.py:30: UserWarning: IPython.utils.traitlets has moved to a top-level traitlets package.
  from IPython.utils import traitlets as _traitlets
Currently using GPU: cuda:0
/usr/local/lib/python3.9/site-packages/ekphrasis/classes/tokenizer.py:225: FutureWarning: Possible nested set at position 2190
  self.tok = re.compile(r"({})".format("|".join(pipeline)))
Reading twitter_2018 - 1grams ...
Reading twitter_2018 - 2grams ...
/usr/local/lib/python3.9/site-packages/ekphrasis/classes/exmanager.py:14: FutureWarning: Possible nested set at position 42
  regexes = {k.lower(): re.compile(self.expressions[k]) for k, v in
Reading twitter_2018 - 1grams ...
PreProcessing dataset ...:   0% 0/178 [00:00<?, ?it/s]
/usr/local/lib/python3.9/site-packages/transformers/tokenization_utils_base.py:2323: FutureWarning: The `pad_to_max_length` argument is deprecated and will be removed in a future version, use `padding=True` or `padding='longest'` to pad to the longest sequence in the batch, or use `padding='max_length'` to pad to a max length. In this case, you can give a specific length with `max_length` (e.g. `max_length=45`) or leave max_length to None to pad to the maximal input size of the model (e.g. 512 for Bert).
  warnings.warn(
PreProcessing dataset ...: 100% 178/178 [00:01<00:00, 122.84it/s]
The number of training batches: 6
Reading twitter_2018 - 1grams ...
Reading twitter_2018 - 2grams ...
Reading twitter_2018 - 1grams ...
PreProcessing dataset ...:   0% 0/178 [00:00<?, ?it/s]
(same `pad_to_max_length` FutureWarning as above)
PreProcessing dataset ...: 100% 178/178 [00:01<00:00, 89.23it/s]
The number of validation batches: 6
Some weights of the model checkpoint at asafaya/bert-base-arabic were not used when initializing BertModel: ['cls.predictions.bias', 'cls.predictions.transform.LayerNorm.weight', 'cls.predictions.decoder.weight', 'cls.predictions.transform.dense.bias', 'cls.predictions.decoder.bias', 'cls.predictions.transform.LayerNorm.bias', 'cls.predictions.transform.dense.weight']
no_deprecation_warning=True to disable this warning
  warnings.warn(
```

ahmed aleroud
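The repeated `pad_to_max_length` FutureWarning in the log only asks for a keyword change in the tokenizer call. A minimal sketch of the mapping, assuming the data loader somewhere passes `pad_to_max_length=True` to a HuggingFace tokenizer; the helper name is hypothetical:

```python
def migrate_padding_kwargs(kwargs):
    """Map the deprecated pad_to_max_length flag onto the padding=
    argument used by current transformers tokenizers (sketch only)."""
    kwargs = dict(kwargs)  # do not mutate the caller's dict
    if kwargs.pop("pad_to_max_length", False):
        # pad_to_max_length=True padded every sequence to max_length
        # (or the model maximum); padding='max_length' is the new spelling.
        kwargs.setdefault("padding", "max_length")
    return kwargs

# Hypothetical call site:
# encoded = tokenizer(texts, **migrate_padding_kwargs(
#     {"pad_to_max_length": True, "max_length": 45, "truncation": True}))
```

This only silences the deprecation warning; it is unrelated to the fastprogress `AttributeError`, which is more likely the version mismatch Hassan describes.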