Closed — specblades closed this issue 1 month ago
Could you manually download adapter_config.json from Llama-3.2-11b-vision-uncensored (ModelScope), then replace the copy under the repo path models/Llama-3.2-11b-vision-Instruct/patch?
Retry with the same command and see whether the error reproduces.
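The manual workaround above (download the file, then drop it into the patch directory) can be sketched as a small helper. This is a minimal sketch, not part of the repo: the function name `apply_manual_patch` is hypothetical, and the patch path is taken from the comment above.

```python
import shutil
from pathlib import Path


def apply_manual_patch(downloaded_config: str, repo_root: str) -> Path:
    """Copy a manually downloaded adapter_config.json into the repo's
    LLM patch directory (path as given in the thread). Hypothetical helper."""
    patch_dir = Path(repo_root) / "models" / "Llama-3.2-11b-vision-Instruct" / "patch"
    patch_dir.mkdir(parents=True, exist_ok=True)
    dest = patch_dir / "adapter_config.json"
    shutil.copy(downloaded_config, dest)  # overwrite the broken/missing copy
    return dest
```

After copying, rerun the original caption.py command unchanged; the script should pick up the replaced config from the patch directory.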
Hmm, I wrote the wrong URL in my JSON config. So if you download via the URL method, it will return the wrong URL for the LLM patches. I will fix this error soon.
Hi, I'm trying to install the repo, but I'm hitting an error when applying the LLM patch. Please help me out.
I ran the script with the command: "python caption.py D:\DATASETS\equals --model_site modelscope --download_method URL --wd_force_use_cpu --wd_remove_underscore --llm_patch"