utterworks / fast-bert

Super easy library for BERT based NLP models
Apache License 2.0

OSError: Can't load config for saved_model when deploying on AWS EC2. #261

Open katreparitosh opened 3 years ago

katreparitosh commented 3 years ago

I was deploying a model trained with fast-bert on an AWS EC2 instance (t3a.xlarge), using a dockerized image and Flask.

When I passed a sentence on the rendered page, get_config_dict raised an EnvironmentError (OSError):

OSError: Can't load config for 'model/final_model'. Make sure that:
- 'path/to/final_model' is a correct model identifier listed on 'https://huggingface.co/models'
- or 'path/to/final_model' is the correct path to a directory containing a config.json file

As suggested in related threads, I rebuilt the image with the latest transformers release (transformers==3.3.1).

However, I am still unable to figure out the issue.

Kindly help.
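This error is raised whenever transformers cannot find a config.json at the path it was given, which in a Docker deployment often means the model files were not copied into the image, or a relative path is resolving against a different working directory than it did locally. A minimal sketch for checking this inside the container, assuming the model was exported to 'model/final_model' as in the traceback (the file list is a typical fast-bert/transformers export, not something the issue confirms):

```python
# Sketch: verify the model directory inside the container before passing it
# to the predictor. The OSError above fires when config.json is missing from
# (or the path does not resolve to) the directory handed to transformers.
# 'model/final_model' is taken from the traceback; adjust to your layout.
from pathlib import Path

REQUIRED = ("config.json",)                     # transformers insists on this
EXPECTED = ("pytorch_model.bin", "vocab.txt")   # typical exported artifacts

def missing_model_files(model_dir: str) -> list:
    """Return the required/expected files absent from model_dir."""
    root = Path(model_dir)
    if not root.is_dir():
        # The path itself is wrong -- common in Docker, where relative
        # paths resolve against WORKDIR, not your project root.
        return [str(root)]
    return [name for name in REQUIRED + EXPECTED
            if not (root / name).is_file()]

if __name__ == "__main__":
    problems = missing_model_files("model/final_model")
    if problems:
        print("Missing:", problems)
    else:
        print("Model directory looks complete")
```

Running this inside the container (e.g. via docker exec) distinguishes a packaging problem (files never made it into the image) from a path problem (files present but the app is looking in the wrong place).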

[Screenshot attachment: config_Error]