Closed mattdangerw closed 1 year ago
I would love to work on this issue!
/assign
Hi @soma2000-lang, are you still working on it? If not, I would like to take this one up!
@shivance Yes, I had also opened a PR for it, but had to close it for some reason. I will open another one.
Hey! I would like to take this issue up, if others are not @soma2000-lang
@jayam30 I already opened a PR.
Hi @mattdangerw, it's been over a month since this issue went stale. Please assign it to me; I'm already having a crack at it to see how challenging it is. So far the tests pass locally.
If you allow, I could open a PR. Thanks!
@shivance I am working on it, but the tests are currently failing; I'm working on that.
In https://github.com/keras-team/keras-nlp/pull/653 we added a masked language modeling task for RoBERTa. We can make a similar change for the `XLMRoberta` model.

- Update `XLMRobertaTokenizer` to expect a mask token.
- Add an `XLMRobertaMaskedLMPreprocessor` preprocessor layer and tests.
- Add an `XLMRobertaMaskedLM` task model and tests.
- Update `keras_nlp/models/__init__.py` to export `XLMRobertaMaskedLM` and `XLMRobertaMaskedLMPreprocessor`.
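For context on what the preprocessor step involves: a masked LM preprocessor applies the BERT/RoBERTa masking scheme, selecting roughly 15% of token positions and replacing 80% of those with the mask token, 10% with a random token, and leaving 10% unchanged. The real KerasNLP layers implement this with TensorFlow ops and the tokenizer's special token ids; the pure-Python sketch below only illustrates the scheme, with `MASK_ID` and `VOCAB_SIZE` as placeholder assumptions.

```python
import random

# Placeholder values for illustration only; the real preprocessor would
# take these from the tokenizer (e.g. XLMRobertaTokenizer's mask token id).
MASK_ID = 4
VOCAB_SIZE = 100

def mask_tokens(token_ids, mask_rate=0.15, seed=None):
    """Sketch of the masking a MaskedLM preprocessor applies.

    Returns (masked_ids, mask_positions, labels), where labels holds the
    original ids at the masked positions -- the targets the MaskedLM task
    model is trained to predict.
    """
    rng = random.Random(seed)
    masked = list(token_ids)
    positions, labels = [], []
    for i, tok in enumerate(token_ids):
        if rng.random() < mask_rate:
            positions.append(i)
            labels.append(tok)
            roll = rng.random()
            if roll < 0.8:
                masked[i] = MASK_ID                    # replace with mask token
            elif roll < 0.9:
                masked[i] = rng.randrange(VOCAB_SIZE)  # replace with random token
            # else: keep the original token unchanged
    return masked, positions, labels
```

The task model then gathers the hidden states at `mask_positions` and scores them against the vocabulary, computing loss only on those positions.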