Closed ynusinovich closed 3 years ago
Hey @ynusinovich, did you try saving the `FARMReader` model with its `save()` method?
`FARMReader` can currently only load Hugging Face models from the HF model zoo (or the local cache dir), not a "normal" local folder.
@Timoeller Thank you so much, that solves the problem!
Running `reader.save("./saved_model/")` in the notebook where I'm using the reader, followed by `reader = FARMReader(model_name_or_path="./saved_model/")` in the notebook where I'm loading the reader, makes it work correctly!
Nice, glad I could help : )
I am having the same issue but it is not going away. I saved my transformer locally like this:
from haystack.reader.farm import FARMReader
reader = FARMReader(model_name_or_path="deepset/roberta-base-squad2", use_gpu=False)
reader.save('./trained_model')
And I am trying to access it in my local flask API with:
reader = FARMReader(model_name_or_path='./trained_model/',
use_gpu=False, return_no_answer=True, no_ans_boost=0.7, context_window_size=200)
But, when I run my api and make a post to the endpoint that loads the reader I am getting this error:
OSError: Can't load config for './trained_model/'. Make sure that:
- './trained_model/' is a correct model identifier listed on 'https://huggingface.co/models'
- or './trained_model/' is the correct path to a directory containing a config.json file
And I noticed that in my `./trained_model` folder there is no `config.json` file. So I copy-pasted the `config.json` file from https://huggingface.co/deepset/roberta-base-squad2/resolve/main/config.json into my `./trained_model` folder, but it is still throwing the same error.
Can you see what the problem is?
The weird thing is that it was working up until a few days ago...
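If it helps, here is a quick stdlib-only sketch to inspect what the save step actually wrote into the folder (the directory name is just the one from this thread):

```python
import os

# Print whatever files ended up in the saved-model folder, so you can
# compare against the loader's expectation of a config.json file.
model_dir = "./trained_model"
if os.path.isdir(model_dir):
    for name in sorted(os.listdir(model_dir)):
        print(name)
else:
    print(f"{model_dir} does not exist relative to {os.getcwd()}")
```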
Hey @giuliodz are you sure that you have the correct current directory specified and the folder is residing there?
As a sanity check, you could try saving and loading the model as you posted above, outside your API:
from haystack.reader.farm import FARMReader
reader = FARMReader(model_name_or_path="deepset/roberta-base-squad2", use_gpu=False)
reader.save('./trained_model')
reader2 = FARMReader(model_name_or_path='./trained_model/',
                     use_gpu=False, return_no_answer=True, no_ans_boost=0.7, context_window_size=200)
If that works, it is most likely a path problem related to your configs.
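One way to check the path theory is a pure-stdlib probe, nothing haystack-specific (`./trained_model` is just the folder name from this thread):

```python
import os

# Relative paths like './trained_model' are resolved against the
# process's current working directory, which for a Flask API may not
# be the directory you saved the model from.
print("cwd:", os.getcwd())
print("folder exists:", os.path.isdir("./trained_model"))
print("config.json exists:", os.path.isfile("./trained_model/config.json"))
```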
@Timoeller I followed the exact steps as below, but the model is not getting saved to the trained_model local folder:
from haystack.reader.farm import FARMReader
reader = FARMReader(model_name_or_path="deepset/roberta-base-squad2", use_gpu=False)
reader.save('./trained_model')
reader2 = FARMReader(model_name_or_path='./trained_model/',
                     use_gpu=False, return_no_answer=True, no_ans_boost=0.7, context_window_size=200)
Strange technology :see_no_evil: What do you mean by "the model is not getting saved to the folder"? Do you get an error message, or do you just not see the folder? Which is your current working directory, and which operating system do you use?
You could try specifying the full path instead of a relative path (./trained_model), e.g. on Linux /home/user/foo/bar/trained_model
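A minimal sketch of that suggestion, resolving the relative path to an absolute one before handing it to the reader (stdlib only; whether `FARMReader` accepts a `pathlib.Path` directly is not guaranteed, so it is converted to `str` first):

```python
from pathlib import Path

# Resolve './trained_model' against the current working directory to
# get the absolute path the loader will actually look at.
model_path = Path("./trained_model").resolve()
print(model_path)            # e.g. /home/user/foo/bar/trained_model
print(model_path.is_dir())   # False usually means a wrong working directory

# Assumption: the absolute path is then passed as a plain string, as in
# FARMReader(model_name_or_path=str(model_path), use_gpu=False)
```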
**Describe the bug**
When I run `reader = FARMReader(model_name_or_path="deepset/xlm-roberta-large-squad2")`, the reader correctly loads. When I run `reader = FARMReader(model_name_or_path="./xlm-roberta-large-squad2/")`, where `./xlm-roberta-large-squad2/` is the path to a cloned version of the `xlm-roberta-large-squad2` model from Huggingface, the reader crashes instead of loading. I'm trying to learn Flask and Docker, so I need to have the model pre-loaded in my Docker container (and loaded when I point to its path) rather than downloading the 2 GB model every time I run inference in the Flask app.

**Error message**
**Expected behavior**
The `reader` should load when I give the `reader` a valid path to the model on my local drive, in the same way that the `reader` loads when I give it a model name (after downloading the model).

**Additional context**
The Conda environment has the following packages:
**To Reproduce**

**System:**