mickeysjm / R-BERT

PyTorch re-implementation of the R-BERT model
GNU General Public License v3.0

Problem with special tokens #1

Open pvcastro opened 4 years ago

pvcastro commented 4 years ago

Hi there!

How did you handle those special tokens? additional_special_tokens = ["[E11]", "[E12]", "[E21]", "[E22]"]

Just passing them as the 'additional_special_tokens' parameter to BertTokenizer.from_pretrained doesn't seem to have any effect. When we actually tokenize the texts, these special tokens get split into word pieces too:

['[', 'e', '##11', ']', 'tom', 'tha', '##bane', '[', 'e', '##12', ']', 'resigned', 'in', 'october', 'last', 'year', 'to', 'form', 'the', '[', 'e', '##21', ']', 'all', 'bas', '##otho', 'convention', '[', 'e', '##22', ']', '(', 'abc', ')', ',', 'crossing', 'the', 'floor', 'with', '17', 'members', 'of', 'parliament', ',', 'causing', 'constitutional', 'monarch', 'king', 'lets', '##ie', 'iii', 'to', 'dissolve', 'parliament', 'and', 'call', 'the', 'snap', 'election', '.']

The other repository you used as a reference seemed to have an issue with this too: https://github.com/wang-h/bert-relation-classification/issues/4

I'm trying to manually call tokenizer.add_special_tokens({'additional_special_tokens': additional_special_tokens}), but when do_lower_case is true the tokens get lowercased as well, and they map to [UNK] when converted to ids.
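For reference, this is a minimal version of what I'm attempting (bert-base-uncased is just the checkpoint I'm testing with; the comments describe what happens on my side):

from transformers import BertTokenizer

additional_special_tokens = ["[E11]", "[E12]", "[E21]", "[E22]"]
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased', do_lower_case=True)
tokenizer.add_special_tokens({'additional_special_tokens': additional_special_tokens})

tokens = tokenizer.tokenize('[E11] Tom Thabane [E12] resigned in October last year .')
print(tokens)  # the markers come back lowercased instead of intact
print(tokenizer.convert_tokens_to_ids(tokens))  # and the lowercased markers map to the [UNK] id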

Thanks!

mickeysjm commented 4 years ago

Hi @pvcastro,

Which version of the transformers library are you using? In my local environment with transformers 2.8.0, the tokenizer works fine. I put a screenshot below for your reference.

[screenshot: tokenizer output]
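Roughly, the screenshot shows something like the following (the exact sentence doesn't matter; the point is that the markers come back as single tokens):

from transformers import BertTokenizer

additional_special_tokens = ["[E11]", "[E12]", "[E21]", "[E22]"]
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased', do_lower_case=True,
                                          additional_special_tokens=additional_special_tokens)
tokenizer.tokenize('[E11] Tom Thabane [E12] resigned in October last year .')
# ['[E11]', 'tom', 'tha', '##bane', '[E12]', 'resigned', 'in', 'october', 'last', 'year', '.']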

pvcastro commented 4 years ago

Strange @mickeystroller, I'm doing the exact same thing as you are, but take a look at my results:

[screenshot: tokenizer output with the markers split into pieces]

mickeysjm commented 4 years ago

Indeed, it looks strange. I don't know what's happening here. Maybe you can restart the IPython kernel and run this pipeline from scratch again?

pvcastro commented 4 years ago

Same problem. Can you tell me the version of your tokenizers package? (pip show tokenizers)
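Or, equivalently, from inside Python (this is how I'm checking on my end):

import tokenizers
import transformers
print(transformers.__version__, tokenizers.__version__)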

mickeysjm commented 4 years ago

My tokenizers library version is 0.5.2.

pvcastro commented 4 years ago

Strange, I tried downgrading to 0.5.2, and even though it installed correctly, importing transformers with it doesn't work:

[screenshot: error when importing transformers with tokenizers 0.5.2]

pvcastro commented 4 years ago

Maybe you have a cached tokenizer with the additional special tokens saved to it? :thinking: There's nothing in BertTokenizer.from_pretrained that causes these tokens to be permanently attached to the tokenizer. Here's what the super class does with them:

[screenshot: from_pretrained source, showing only an assert on additional_special_tokens]

It only performs an assert; it doesn't save them anywhere.
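If it helps, this is how I'm inspecting what actually gets registered (added_tokens_encoder is, as far as I can tell, the runtime map of tokens added on top of the vocabulary):

from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained('bert-base-uncased', do_lower_case=True,
                                          additional_special_tokens=["[E11]", "[E12]", "[E21]", "[E22]"])
print(tokenizer.additional_special_tokens)  # do the markers even appear here?
print(tokenizer.added_tokens_encoder)       # and did they get ids of their own?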

mickeysjm commented 4 years ago

I don't think I cache the tokenizer intentionally. Maybe the transformers library does that automatically? I think you can ask this question in the transformers repository and get some support from its developers. If you figure it out later, please kindly let me know.

Thanks.

pvcastro commented 4 years ago

I'll do that, @mickeystroller. Would you mind running transformers-cli env so I can add this information to the issue? They require it.

mickeysjm commented 4 years ago

Below is the transformers-cli env output:

pvcastro commented 4 years ago

Thanks! Here's the opened issue: https://github.com/huggingface/transformers/issues/4229

pvcastro commented 4 years ago

Hi @mickeystroller, how are you? It's been a week since I opened the issue, and there are no replies from the transformers team yet. Do you mind creating a brand-new conda environment, installing the latest transformers package, and running this same simple test?

import transformers
from transformers import BertTokenizer
additional_special_tokens = ["[E11]", "[E12]", "[E21]", "[E22]"]
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased', do_lower_case=True, additional_special_tokens=additional_special_tokens)
test_string = '[E11] Tom Thabane [E12] resigned in October last year to form the [E21] All Basotho Convention [E22] -LRB- ABC -RRB- , crossing the floor with 17 members of parliament , causing constitutional monarch King Letsie III to dissolve parliament and call the snap election .'
tokenizer.tokenize(test_string)
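What I'm checking is simply whether the last line comes back with the markers intact, i.e. ['[E11]', 'tom', 'tha', '##bane', '[E12]', ...] as in your screenshot, rather than the split ['[', 'e', '##11', ']', ...] pieces I pasted at the top of this issue.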