Closed sanekun closed 9 months ago
Hello, thank you for pointing this out and suggestions!
Unfortunately, I could not use your code snippet without errors. However, I found another solution that worked for me and ended up with a version that should also do what you requested.
Please, check out the latest version of the program (at least since the commit 07e1464).
When I use a local model downloaded from Hugging Face, two errors occur. I solved them as follows.
1. In `prottrans_model.py`, `load_model_and_tokenizer`: the tokenizer is always loaded from `pt_server_path`, even when a local model path is given. I changed the code as below.
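The issue's original code snippet did not survive, so here is only a minimal sketch of the kind of change described: prefer the local model directory for the tokenizer when it actually contains tokenizer files, falling back to `pt_server_path` otherwise. The helper name `resolve_tokenizer_source` is hypothetical and not part of the project's actual code.

```python
import os

def resolve_tokenizer_source(model_path: str, pt_server_path: str) -> str:
    """Return the path the tokenizer should be loaded from.

    Hypothetical helper: use the local model directory if it contains a
    tokenizer_config.json, otherwise fall back to pt_server_path (the
    behavior the issue reports as the unconditional default).
    """
    if os.path.isdir(model_path) and os.path.isfile(
        os.path.join(model_path, "tokenizer_config.json")
    ):
        return model_path
    return pt_server_path
```

The result of this helper would then be passed to `T5Tokenizer.from_pretrained(...)` inside `load_model_and_tokenizer`.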
2. In `prottrans_model.py`, `get_tokenizer`: the new version of `T5Tokenizer` wants a path, probably the directory that `tokenizer_config.json` is in. I changed the code as below.
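Again, the original snippet is missing; a minimal sketch of the idea (assumed behavior, not the project's actual fix): if `get_tokenizer` receives a file path instead of a directory, step up to the parent directory so `T5Tokenizer.from_pretrained` sees the folder containing `tokenizer_config.json`.

```python
import os

def tokenizer_dir(path: str) -> str:
    """Sketch: newer T5Tokenizer versions expect a directory containing
    tokenizer_config.json. If 'path' is a file (e.g. a checkpoint file),
    return its parent directory; otherwise return the path unchanged.
    """
    return path if os.path.isdir(path) else os.path.dirname(path)
```

The returned directory would then be passed to `T5Tokenizer.from_pretrained(tokenizer_dir(path))`.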
I used a conda env created from `environment_CPU.yml`.