bentoml / OpenLLM

Run any open-source LLMs, such as Llama 3.1 and Gemma, as OpenAI-compatible API endpoints in the cloud.
https://bentoml.com
Apache License 2.0
9.7k stars · 616 forks

bug: WARNING: openllm 0.4.44 does not provide the extra 'gemma' #965

Closed: infinite-Joy closed this issue 3 months ago

infinite-Joy commented 4 months ago

Describe the bug

Unable to download and run Gemma models.

To reproduce

  1. pip install "openllm[gemma]"
  2. TRUST_REMOTE_CODE=True openllm start google/gemma-7b

Logs

First, during installation, the following warning appears:

WARNING: openllm 0.4.44 does not provide the extra 'gemma'
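As an aside, the pip warning can be verified independently: the standard library's `importlib.metadata` exposes the extras a distribution declares, so if `'gemma'` is missing from openllm 0.4.44's metadata the warning is expected (a minimal sketch; the distribution names are only examples):

```python
from importlib.metadata import PackageNotFoundError, metadata

def declared_extras(dist_name: str) -> list:
    """Extras a distribution declares, or [] if it declares none or isn't installed."""
    try:
        return sorted(metadata(dist_name).get_all("Provides-Extra") or [])
    except PackageNotFoundError:
        return []

# e.g. check whether "gemma" is in declared_extras("openllm") -- if it is not,
# pip's "does not provide the extra 'gemma'" warning is the expected behaviour.
```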

Then, while the model is downloading, I get the following stack trace:

special_tokens_map.json: 100%|████████████████████████████████████████████████████████| 636/636 [00:00<00:00, 2.76MB/s]
Traceback (most recent call last):
  File "/opt/conda/lib/python3.10/site-packages/fs/osfs.py", line 647, in open
    return io.open(
FileNotFoundError: [Errno 2] No such file or directory: b'/root/bentoml/models/vllm-google--gemma-7b/latest'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/conda/lib/python3.10/site-packages/bentoml/_internal/store.py", line 144, in get
    _tag.version = self._fs.readtext(_tag.latest_path())
  File "/opt/conda/lib/python3.10/site-packages/fs/base.py", line 693, in readtext
    self.open(
  File "/opt/conda/lib/python3.10/site-packages/fs/osfs.py", line 643, in open
    with convert_os_errors("open", path):
  File "/opt/conda/lib/python3.10/site-packages/fs/error_tools.py", line 89, in __exit__
    reraise(fserror, fserror(self._path, exc=exc_value), traceback)
  File "/opt/conda/lib/python3.10/site-packages/six.py", line 718, in reraise
    raise value.with_traceback(tb)
  File "/opt/conda/lib/python3.10/site-packages/fs/osfs.py", line 647, in open
    return io.open(
fs.errors.ResourceNotFound: resource 'vllm-google--gemma-7b/latest' not found

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/conda/lib/python3.10/site-packages/bentoml/_internal/store.py", line 118, in _recreate_latest
    items = self.list(tag.name)
  File "/opt/conda/lib/python3.10/site-packages/bentoml/_internal/store.py", line 95, in list
    raise NotFound(
bentoml.exceptions.NotFound: no Models with name 'vllm-google--gemma-7b' found

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/conda/lib/python3.10/site-packages/openllm/_llm.py", line 223, in __init__
    model = bentoml.models.get(self.tag)
  File "/opt/conda/lib/python3.10/site-packages/simple_di/__init__.py", line 139, in _
    return func(*_inject_args(bind.args), **_inject_kwargs(bind.kwargs))
  File "/opt/conda/lib/python3.10/site-packages/bentoml/models.py", line 45, in get
    return _model_store.get(tag)
  File "/opt/conda/lib/python3.10/site-packages/bentoml/_internal/store.py", line 149, in get
    self._recreate_latest(_tag)
  File "/opt/conda/lib/python3.10/site-packages/bentoml/_internal/store.py", line 120, in _recreate_latest
    raise NotFound(
bentoml.exceptions.NotFound: no Models with name 'vllm-google--gemma-7b' exist in BentoML store <osfs '/root/bentoml/models'>

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/conda/bin/openllm", line 8, in <module>
    sys.exit(cli())
  File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/opt/conda/lib/python3.10/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/opt/conda/lib/python3.10/site-packages/openllm_cli/entrypoint.py", line 160, in wrapper
    return_value = func(*args, **attrs)
  File "/opt/conda/lib/python3.10/site-packages/click/decorators.py", line 33, in new_func
    return f(get_current_context(), *args, **kwargs)
  File "/opt/conda/lib/python3.10/site-packages/openllm_cli/entrypoint.py", line 141, in wrapper
    return f(*args, **attrs)
  File "/opt/conda/lib/python3.10/site-packages/openllm_cli/entrypoint.py", line 366, in start_command
    llm = openllm.LLM(
  File "/opt/conda/lib/python3.10/site-packages/openllm/_llm.py", line 225, in __init__
    model = openllm.serialisation.import_model(self, trust_remote_code=self.trust_remote_code)
  File "/opt/conda/lib/python3.10/site-packages/openllm/serialisation/__init__.py", line 63, in caller
    return getattr(importlib.import_module(f'.{serde}', 'openllm.serialisation'), fn)(llm, *args, **kwargs)
  File "/opt/conda/lib/python3.10/site-packages/openllm/serialisation/transformers/__init__.py", line 33, in import_model
    with save_model(
  File "/opt/conda/lib/python3.10/contextlib.py", line 135, in __enter__
    return next(self.gen)
  File "/opt/conda/lib/python3.10/site-packages/openllm/serialisation/_helpers.py", line 126, in save_model
    labels=openllm.utils.generate_labels(llm),
  File "/opt/conda/lib/python3.10/site-packages/openllm/utils.py", line 10, in generate_labels
    'model_name': llm.config['model_name'],  #
  File "/opt/conda/lib/python3.10/site-packages/openllm/_llm.py", line 501, in config
    config = openllm.AutoConfig.infer_class_from_llm(self).model_construct_env(**self._model_attrs)
  File "/opt/conda/lib/python3.10/site-packages/openllm_core/config/configuration_auto.py", line 211, in infer_class_from_llm
    raise ValueError(f"Failed to determine config class for '{llm.model_id}'. Make sure {llm.model_id} is saved with openllm.")
ValueError: Failed to determine config class for 'google/gemma-7b'. Make sure google/gemma-7b is saved with openllm.
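Reading the chain bottom-up: the `NotFound`/`ResourceNotFound` errors only mean the tag was never imported into the local BentoML model store; the actual failure is the final `ValueError`, raised while OpenLLM tries to import the model and cannot infer a config class for `google/gemma-7b`. A quick way to confirm the store really has no such tag (a sketch; the store path and tag name are taken from the traceback above and may differ on other setups):

```python
import os

def model_in_store(tag_name: str, store: str = "~/bentoml/models") -> bool:
    """True if the given model tag has a directory in the local BentoML store."""
    return os.path.isdir(os.path.join(os.path.expanduser(store), tag_name))

# In the logs the store is /root/bentoml/models and this returns False,
# which is why BentoML falls through to importing the model from scratch.
print(model_in_store("vllm-google--gemma-7b"))
```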

Environment

bentoml==1.1.11
transformers==4.39.3
python==3.10.13
platform: Ubuntu 22.04.3 LTS

System information (Optional)

memory: 241591
platform: Ubuntu 22.04.3 LTS
architecture: x86_64
CPU: 64

aarnphm commented 3 months ago

Gemma is supported natively by transformers, so there is no need to set TRUST_REMOTE_CODE=True.
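That native support can be checked against the installed transformers build: Gemma's config class is registered in transformers' auto-config mapping from 4.38 onward, which covers the 4.39.3 pinned in this issue (a minimal sketch; the helper name is hypothetical):

```python
def has_native_support(model_type: str) -> bool:
    """True if the installed transformers registers this model type natively."""
    try:
        # Mapping of model-type strings to config classes in transformers.
        from transformers.models.auto.configuration_auto import CONFIG_MAPPING
    except ImportError:
        return False  # transformers is not installed
    return model_type in CONFIG_MAPPING

# If this prints True, no trust_remote_code / TRUST_REMOTE_CODE is needed.
print(has_native_support("gemma"))
```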