aslucc opened this issue 4 years ago
I figured out the problem was that I put my files in the "zoo" directory; if I leave them out, I don't need to use build or any other script. Now I am able to start the server and client processes, but when I chat with the model in the browser chat I get responses like:
__null__ __unk__ __unk__ __unk__ __unk__ __unk__ __unk__ __unk__ ... (a long run of __unk__ tokens)
or
__null__ __null__
or
__null__ __null__ __null__ __null__ __null__ __null__ __null__ __null__ ... (a long run of __null__ tokens)
When I chat with the same model with interactive.py it works (it answers with proper words).
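For context, the working interactive run was presumably something along these lines; the exact invocation and the model path placeholder are assumptions, not taken from the thread:

python parlai/scripts/interactive.py --model-file <path-to-your-covid90M-checkpoint>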
Edit:
I noticed that when I start the model with interactive.py I have in the output:
Total parameters: 87,508,992 (87,508,992 trainable)
but when I start the same model with the browser chat I have in the output:
INFO | No model with opt yet at: covid/covid90M(.opt)
Total parameters: 3,511,200 (2,896,800 trainable)
I don't think you want zoo:covid/covid_90M. I think you want covid/covid_90M.
Thanks, I had already figured out that what I wanted was for my model to be outside the zoo.
The unk issue was solved by using the path parlai/covid/covid90M instead of just covid/covid90M.
I initially thought that I was supposed to start my path from the same place as the zoo. It still makes no sense to me why it loaded a ~3M-parameter model, and from where, when I used the path covid/covid90M.
Yeah, that looks like it initialized a new model from scratch. Our chat services should be preventing that.
Leaving this task open because we need to double-check that chat services has requireExists=True on the call to create_agent.
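A minimal sketch of that guard, assuming the keyword is requireModelExists as exposed by parlai.core.agents.create_agent; the wrapper name below is hypothetical:

# Hypothetical wrapper illustrating the guard discussed above: refuse to start
# the chat service if the checkpoint is missing, rather than silently
# initializing a fresh, randomly initialized model from scratch.
from parlai.core.agents import create_agent

def create_agent_or_fail(opt):
    # requireModelExists=True makes create_agent raise when opt['model_file']
    # does not point at an existing checkpoint, which is exactly the failure
    # mode that produced the 3.5M-parameter model above.
    return create_agent(opt, requireModelExists=True)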
I would like to use the browser chat with my already-trained model saved on my host. Currently, my config.yml file is:
I put my model and all the other files (.valid, .dict, .checkpoint, etc.) in a 'covid' folder inside the zoo directory. When I try to run the server with
python3 parlai/chat_service/services/browser_chat/run.py --config-path parlai/chat_service/tasks/covid_task_browser/config.yml --port 10001
I get the following error (as I kind of expected):

From my understanding, I need to have a build file that will download the model from somewhere to get it to work... Is there a way to just make the config use the model file I have locally, without downloading it?
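For reference, the stock browser-chat config that ships with ParlAI (parlai/chat_service/tasks/chatbot/config.yml) has roughly the shape below, and pointing model_file at a local checkpoint path rather than a zoo: path is what the rest of the thread converges on. The covid-specific names, the world/task entries, and the transformer/generator architecture here are assumptions for illustration, not the poster's actual file:

tasks:
  default:
    onboard_world: MessengerBotChatOnboardWorld
    task_world: MessengerBotChatTaskWorld
    timeout: 1800
    agents_required: 1
task_name: chatbot
world_module: parlai.chat_service.tasks.chatbot.worlds
overworld: MessengerOverworld
max_workers: 30
opt:
  debug: True
  models:
    covid90M:                            # hypothetical model name
      model: transformer/generator       # assumed architecture
      model_file: parlai/covid/covid90M  # local checkpoint path, not a zoo: path
      interactive_mode: True
      no_cuda: True
additional_args:
  page_id: 1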