Closed: raphael10-collab closed this issue 3 years ago
Hi,
this is just a warning from the Hugging Face library - no need to worry. We use Hugging Face's BERT implementation internally. You are doing everything correctly here. When executing the training code (as you do), you train JEREX (and fine-tune BERT) on a downstream task (end-to-end relation extraction), and you can then use the model for prediction.
Thank you @markus-eberts.
Now I've got this memory issue: https://github.com/lavis-nlp/jerex/issues/3
Does anyone know if there's a way to hide this message? :)
Hi, you should be able to suppress messages by decreasing the logging verbosity as described in their documentation: https://huggingface.co/docs/transformers/main_classes/logging
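For example, a minimal sketch (assuming a recent transformers release that exposes the transformers.logging module):

# Lower the Transformers logging verbosity so that informational warnings
# such as "You should probably TRAIN this model ..." are no longer printed.
from transformers import logging as hf_logging

hf_logging.set_verbosity_error()  # show only errors, hide warnings/info

Alternatively, the TRANSFORMERS_VERBOSITY environment variable (e.g. TRANSFORMERS_VERBOSITY=error) can be set before launching jerex_train.py.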
When trying to train the model with
python ./jerex_train.py --config-path configs/docred_joint
I get this message: "You should probably TRAIN this model on a down-stream task to be able to use it for predictions and inference". What does it mean? How should I then train the model instead?
O.S.: Ubuntu 18.04.4