Closed — aarnphm closed this issue 12 months ago
As a temporary workaround, setting the environment variable TRANSFORMERS_OFFLINE=1 and using an absolute model path works for me.
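A minimal sketch of that workaround, assuming the model path below is a placeholder for wherever the weights actually live (the offline variables are standard Hugging Face settings; the openllm invocation mirrors the command later in this thread):

```shell
# Force transformers and datasets into offline mode so no Hub lookup happens.
export TRANSFORMERS_OFFLINE=1
export HF_DATASETS_OFFLINE=1
# With those set, start the server against an absolute local path
# (placeholder path; substitute your own checkout of the weights):
#   openllm start baichuan --model-id /abs/path/to/baichuan2-13b --backend pt
echo "$TRANSFORMERS_OFFLINE $HF_DATASETS_OFFLINE"
```

Note this only sidesteps the store lookup; it does not fix the underlying tag-resolution bug.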
It is not working for me.
$ HF_DATASETS_OFFLINE=1 TRANSFORMERS_OFFLINE=1 openllm start baichuan --model-id /home/yingjie/openllm/baichuan2-13b --backend pt --debug
Traceback (most recent call last):
File "/home/yingjie/openllm/v_openllm_bc/lib64/python3.8/site-packages/openllm/serialisation/transformers/__init__.py", line 147, in get
model = bentoml.models.get(llm.tag)
File "/home/yingjie/openllm/v_openllm_bc/lib64/python3.8/site-packages/simple_di/__init__.py", line 139, in
return func(*_inject_args(bind.args), **_inject_kwargs(bind.kwargs))
File "/home/yingjie/openllm/v_openllm_bc/lib64/python3.8/site-packages/bentoml/models.py", line 45, in get
return _model_store.get(tag)
File "/home/yingjie/openllm/v_openllm_bc/lib64/python3.8/site-packages/bentoml/_internal/store.py", line 158, in get
raise NotFound(
bentoml.exceptions.NotFound: Model 'pt-baichuan2-13b:08c4d4d5d8625c6702b44beca2570febec83a4ae' is not found in BentoML store <osfs '/root/bentoml/models'>, you may need to run bentoml models pull first
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/home/yingjie/openllm/v_openllm_bc/lib64/python3.8/site-packages/openllm/cli/entrypoint.py", line 416, in import_command
_ref = openllm.serialisation.get(llm)
File "/home/yingjie/openllm/v_openllm_bc/lib64/python3.8/site-packages/openllm/serialisation/__init__.py", line 75, in caller
return getattr(importlib.import_module(f'.{serde}', name), fn)(llm, *args, **kwargs)
File "/home/yingjie/openllm/v_openllm_bc/lib64/python3.8/site-packages/openllm/serialisation/transformers/__init__.py", line 155, in get
raise openllm.exceptions.OpenLLMException(f'Failed while getting stored artefact (lookup for traceback):\n{err}') from err
openllm_core.exceptions.OpenLLMException: Failed while getting stored artefact (lookup for traceback):
Model 'pt-baichuan2-13b:08c4d4d5d8625c6702b44beca2570febec83a4ae' is not found in BentoML store <osfs '/root/bentoml/models'>, you may need to run bentoml models pull first
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/yingjie/openllm/v_openllm_bc/lib64/python3.8/site-packages/openllm/serialisation/transformers/__init__.py", line 147, in get
model = bentoml.models.get(llm.tag)
File "/home/yingjie/openllm/v_openllm_bc/lib64/python3.8/site-packages/simple_di/__init__.py", line 139, in
return func(*_inject_args(bind.args), **_inject_kwargs(bind.kwargs))
File "/home/yingjie/openllm/v_openllm_bc/lib64/python3.8/site-packages/bentoml/models.py", line 45, in get
return _model_store.get(tag)
File "/home/yingjie/openllm/v_openllm_bc/lib64/python3.8/site-packages/bentoml/_internal/store.py", line 158, in get
raise NotFound(
bentoml.exceptions.NotFound: Model 'pt-baichuan2-13b:08c4d4d5d8625c6702b44beca2570febec83a4ae' is not found in BentoML store <osfs '/root/bentoml/models'>, you may need to run bentoml models pull first
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/yingjie/openllm/v_openllm_bc/bin/openllm", line 8, in
I believe this has been addressed. The Baichuan issue will be tracked separately. cc @larme
Describe the bug
Somewhere along the way with the tag refactoring, --model-id fails to load a model from a local path. We have an internal tracking issue on reworking tag generation, which will address this problem.
cc @larme
To reproduce
No response
Logs
No response
Environment
Persists across different Python versions.
System information (Optional)
No response