Open yousofaly opened 3 months ago
Hi,
Please copy `tokenizer = model.text.tokenizer` from line 103 to line 104, making sure the indentation is correct. However, we recommend using the chat checkpoint, as not loading a checkpoint may produce meaningless output.
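For illustration, the fix might look something like this (a minimal sketch with stand-in classes; the real script's structure around lines 103-104 is an assumption, not the actual code):

```python
class _Text:
    """Stand-in for the model's text module."""
    tokenizer = "dummy-tokenizer"

class _Model:
    """Stand-in for the LHRS model object."""
    text = _Text()

def build_demo(checkpoint_path):
    # Hypothetical shape of the demo setup: the tokenizer assignment
    # must sit inside the same block (same indentation) that builds
    # the model, so it only runs once `model` exists.
    if checkpoint_path:                    # mirrors the checkpoint-path guard
        model = _Model()                   # stand-in for real checkpoint loading
        tokenizer = model.text.tokenizer   # the line copied down, correctly indented
        return tokenizer
    return None

print(build_demo("checkpoints/lhrs.pt"))  # → dummy-tokenizer
print(build_demo(""))                     # → None
```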
Per the README, the checkpoints are downloaded automatically from Hugging Face, correct?
When running

`python lhrs_webui.py -c Config/multi_modal_eval.yaml --server-port 8000 --server-name 127.0.0.1 --share`

I see this error.
That's because I am trying to run the demo with the default checkpoint path by leaving it blank. But leaving it blank means the `if` statement on line 88 here never runs.
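One way around this, I think, would be to resolve a blank path by downloading the repo explicitly with `huggingface_hub` (a sketch; `resolve_checkpoint` and the repo id are my own, not part of the project):

```python
def resolve_checkpoint(path: str, repo_id: str) -> str:
    """Return a local checkpoint path, downloading from the Hub if blank."""
    if path:
        # An explicit local path is used as-is, same as the current behaviour.
        return path
    # Blank path: fetch the repository from the Hugging Face Hub instead
    # of skipping the load. Lazy import keeps the dependency optional.
    from huggingface_hub import snapshot_download
    return snapshot_download(repo_id=repo_id)  # repo_id is a placeholder here

print(resolve_checkpoint("checkpoints/lhrs.pt", "org/model-name"))  # → checkpoints/lhrs.pt
```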
How can I run the demo so that the models are downloaded automatically from Hugging Face?