Open ealmazanm opened 2 years ago
Hi,

Since our model is not registered in transformers, you cannot load it using the AutoModel class. To load our model, you have to use the BrosModel class defined in this repository.

Although our model is not registered in transformers, we have uploaded it to Hugging Face Models, so you can load it directly as in the code below:

```python
model = BrosModel.from_pretrained("naver-clova-ocr/bros-base-uncased")
```
Hi, I am trying to load the model using the Hugging Face transformers library. However, I get a KeyError: 'bros' when loading the model with from_pretrained. Specifically, I have the following:

```python
from transformers import AutoModel

model = AutoModel.from_pretrained("naver-clova-ocr/bros-base-uncased")
```

as stated in the doc https://huggingface.co/naver-clova-ocr/bros-base-uncased/tree/main
The full stack trace:

```
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/envs/fair/lib/python3.9/site-packages/transformers/models/auto/auto_factory.py", line 396, in from_pretrained
    config, kwargs = AutoConfig.from_pretrained(
  File "/usr/local/envs/fair/lib/python3.9/site-packages/transformers/models/auto/configuration_auto.py", line 560, in from_pretrained
    config_class = CONFIG_MAPPING[config_dict["model_type"]]
  File "/usr/local/envs/fair/lib/python3.9/site-packages/transformers/models/auto/configuration_auto.py", line 301, in __getitem__
    raise KeyError(key)
KeyError: 'bros'
```
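The last two frames show what goes wrong: AutoConfig reads `model_type` from the checkpoint's config.json and looks it up in an internal registry, and `"bros"` is simply not a key in it for this transformers version. A minimal illustrative sketch of that lookup, using a plain dict as a stand-in for the registry (this is not transformers' actual code):

```python
# Plain dict standing in for transformers' internal CONFIG_MAPPING registry
# (illustrative only, not the library's actual code).
CONFIG_MAPPING = {"bert": "BertConfig", "roberta": "RobertaConfig"}

def resolve_config_class(model_type: str) -> str:
    # AutoConfig.from_pretrained looks up config_dict["model_type"] here;
    # an unregistered type raises KeyError, exactly as in the traceback.
    return CONFIG_MAPPING[model_type]

try:
    resolve_config_class("bros")  # "bros" is absent from the registry
except KeyError as err:
    print(f"KeyError: {err}")  # -> KeyError: 'bros'
```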
I'm using huggingface-hub version 0.1.2 and transformers 4.12.5.

Any thoughts here? Thanks.