I was deploying a trained model on an AWS EC2 instance (t3a.xlarge) using a dockerized image and Flask. The model was trained using fast-bert.
When I submitted a sentence on the rendered page, an OSError was raised in get_config_dict:
OSError: Can't load config for 'model/final_model'. Make sure that:
'path/to/final_model' is a correct model identifier listed on 'https://huggingface.co/models'
or 'path/to/final_model' is the correct path to a directory containing a config.json file
As suggested in some threads, I rebuilt the image with the latest transformers release (3.3.1).
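For context, the error seems to mean transformers cannot find a config.json under the local path it was given. A minimal check of that condition (the directory name here is a throwaway placeholder, not my actual path) looks like:

```python
import json
import os
import tempfile

# Sketch of the check transformers performs when the argument is a local
# path rather than a Hub model identifier: config.json must exist there.
def has_config(model_dir):
    return os.path.isfile(os.path.join(model_dir, "config.json"))

# Demonstrate with a temporary directory standing in for 'model/final_model'.
with tempfile.TemporaryDirectory() as d:
    print(has_config(d))  # no config.json yet, so this is False

    # Write a minimal config file, as fast-bert/transformers would on save.
    with open(os.path.join(d, "config.json"), "w") as f:
        json.dump({"model_type": "bert"}, f)

    print(has_config(d))  # now True
```

Note that a relative path like 'model/final_model' is resolved against the working directory of the Flask process inside the container, which may differ from the directory the image was built from.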
However, I am unable to figure out the issue.
Kindly help.