bentoml / OpenLLM

Run any open-source LLMs, such as Llama and Gemma, as OpenAI-compatible API endpoints in the cloud.
https://bentoml.com
Apache License 2.0

bug: `openllm start dolly-v2` errors out with AttributeError: 'DollyV2Config' object has no attribute 'default_prompt_template' #498

Closed: wjthompso closed this issue 1 year ago

wjthompso commented 1 year ago

Describe the bug

After running `openllm build dolly-v2` to download the dolly-v2 model, I attempted to run `openllm start dolly-v2` and got the error below.

I also hit the same `default_prompt_template` error with the `opt` model.
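
For reference, the exact command sequence was:

openllm build dolly-v2
openllm start dolly-v2

and the second command produced this traceback: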

  File "<frozen runpy>", line 198, in _run_module_as_main
  File "<frozen runpy>", line 88, in _run_code
  File ".../open_llm_toolbox/.conda_env/lib/python3.11/site-packages/bentoml/__main__.py", line 4, in <module>
    cli()
  File ".../open_llm_toolbox/.conda_env/lib/python3.11/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".../open_llm_toolbox/.conda_env/lib/python3.11/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
         ^^^^^^^^^^^^^^^^
  File ".../open_llm_toolbox/.conda_env/lib/python3.11/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".../open_llm_toolbox/.conda_env/lib/python3.11/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".../open_llm_toolbox/.conda_env/lib/python3.11/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".../open_llm_toolbox/.conda_env/lib/python3.11/site-packages/bentoml_cli/utils.py", line 362, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File ".../open_llm_toolbox/.conda_env/lib/python3.11/site-packages/bentoml_cli/utils.py", line 311, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File ".../open_llm_toolbox/.conda_env/lib/python3.11/site-packages/click/decorators.py", line 33, in new_func
    return f(get_current_context(), *args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".../open_llm_toolbox/.conda_env/lib/python3.11/site-packages/bentoml_cli/utils.py", line 290, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File ".../open_llm_toolbox/.conda_env/lib/python3.11/site-packages/bentoml_cli/env_manager.py", line 122, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File ".../open_llm_toolbox/.conda_env/lib/python3.11/site-packages/bentoml_cli/serve.py", line 260, in serve
    serve_http_production(
  File ".../open_llm_toolbox/.conda_env/lib/python3.11/site-packages/simple_di/__init__.py", line 139, in _
    return func(*_inject_args(bind.args), **_inject_kwargs(bind.kwargs))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".../open_llm_toolbox/.conda_env/lib/python3.11/site-packages/bentoml/serve.py", line 286, in serve_http_production
    svc = load(bento_identifier, working_dir=working_dir)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".../open_llm_toolbox/.conda_env/lib/python3.11/site-packages/bentoml/_internal/service/loader.py", line 374, in load
    svc = import_service(
          ^^^^^^^^^^^^^^^
  File ".../open_llm_toolbox/.conda_env/lib/python3.11/site-packages/simple_di/__init__.py", line 139, in _
    return func(*_inject_args(bind.args), **_inject_kwargs(bind.kwargs))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".../open_llm_toolbox/.conda_env/lib/python3.11/site-packages/bentoml/_internal/service/loader.py", line 137, in import_service
    module = importlib.import_module(module_name, package=working_dir)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".../open_llm_toolbox/.conda_env/lib/python3.11/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1206, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1178, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1149, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 690, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 940, in exec_module
  File "<frozen importlib._bootstrap>", line 241, in _call_with_frames_removed
  File ".../open_llm_toolbox/.conda_env/lib/python3.11/site-packages/openllm/_service.py", line 35, in <module>
    runner = openllm.Runner(model, llm_config=llm_config, model_id=model_id, ensure_available=False, adapter_map=orjson.loads(adapter_map))
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".../open_llm_toolbox/.conda_env/lib/python3.11/site-packages/openllm/_llm.py", line 1131, in Runner
    runner = infer_auto_class(backend).create_runner(model_name, llm_config=llm_config, ensure_available=ensure_available, **attrs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".../open_llm_toolbox/.conda_env/lib/python3.11/site-packages/openllm/models/auto/factory.py", line 71, in create_runner
    return cls.for_model(model, model_id=model_id, **attrs).to_runner(**runner_attrs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".../open_llm_toolbox/.conda_env/lib/python3.11/site-packages/openllm/_llm.py", line 883, in to_runner
    return llm_runner_class(self)(llm_runnable_class(self, embeddings_sig, generate_sig, generate_iterator_sig),
           ^^^^^^^^^^^^^^^^^^^^^^
  File ".../open_llm_toolbox/.conda_env/lib/python3.11/site-packages/openllm/_llm.py", line 1328, in llm_runner_class
    return types.new_class(self.__class__.__name__ + 'Runner', (bentoml.Runner,),
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".../open_llm_toolbox/.conda_env/lib/python3.11/types.py", line 72, in new_class
    exec_body(ns)
  File ".../open_llm_toolbox/.conda_env/lib/python3.11/site-packages/openllm/_llm.py", line 1348, in <lambda>
    'prompt_template': self._prompt_template.to_string() if self._prompt_template else self.config.default_prompt_template,
                                                                                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File ".../open_llm_toolbox/.conda_env/lib/python3.11/site-packages/openllm_core/_configuration.py", line 1281, in __getattribute__
    return _object_getattribute.__get__(self)(item)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: 'DollyV2Config' object has no attribute 'default_prompt_template'

🚀 Next step: run 'openllm build dolly_v2' to create a Bento for dolly_v2
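
To illustrate the failure mode: the runner metadata lambda at `_llm.py` line 1348 reads `self.config.default_prompt_template`, but `DollyV2Config` defines no such attribute, so the custom `__getattribute__` in `openllm_core/_configuration.py` falls through to the plain object lookup and raises. A minimal sketch of that pattern (DemoConfig is a stand-in for illustration, not OpenLLM code):

# DemoConfig is a stand-in; openllm_core's config classes route attribute
# access through a custom __getattribute__ that ultimately defers to
# object.__getattribute__, so any field the class never defined surfaces
# as AttributeError.
class DemoConfig:
    def __getattribute__(self, item):
        # roughly what _configuration.py line 1281 does
        return object.__getattribute__(self, item)

cfg = DemoConfig()
cfg.default_prompt_template  # AttributeError, matching the traceback above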

To reproduce

I am having this issue in a conda environment. Here is my environment.yml:

channels:
  - conda-forge
  - defaults
dependencies:
  - python=3.11.3
  - pip
  - pip:
      - openllm
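
To recreate the environment (the env name `openllm-env` is arbitrary):

conda env create -n openllm-env -f environment.yml
conda activate openllm-env
openllm start dolly-v2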

Logs

No response

Environment

python: 3.11.3
conda: 23.9.0
openllm: 0.3.7

System information (Optional)

2021 MacBook, M1 Max chip

troyyyang commented 1 year ago

Running into the same issue following the example in the docs. Tried OPT as well to no avail.

linkedlist771 commented 1 year ago

So am I.

med-aymen-jelassi commented 1 year ago

Same issue here :/ with `default_prompt_template`.

aarnphm commented 1 year ago

This has been fixed in 0.3.9. Sorry for the trouble.
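
Upgrading should pick up the fix:

pip install --upgrade "openllm>=0.3.9"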

aarnphm commented 1 year ago

I'm going to close this issue for now, but please feel free to reopen if you have any further questions or concerns. Thanks!