bentoml / OpenLLM

Run any open-source LLM, such as Llama or Mistral, as an OpenAI-compatible API endpoint in the cloud.
https://bentoml.com
Apache License 2.0

bug: `TypeError: Argument() missing 1 required positional argument: 'default'` #1062

Closed djmbritt closed 3 months ago

djmbritt commented 3 months ago

### Describe the bug

Any time I try to run the `openllm` command, I get a `TypeError` (full traceback below).

### To reproduce

Any `openllm` command fails, even `openllm --version` or `openllm hello`.

### Logs

```
Traceback (most recent call last):
  File "/home/djmbritt/.local/bin/openllm", line 5, in <module>
    from openllm.__main__ import app
  File "/home/djmbritt/.local/lib/python3.10/site-packages/openllm/__main__.py", line 209, in <module>
    model: Annotated[str, typer.Argument()] = '', repo: Optional[str] = None, port: int = 3000, verbose: bool = False
TypeError: Argument() missing 1 required positional argument: 'default'
```

### Environment

#### Environment variable

```bash
BENTOML_DEBUG=''
BENTOML_QUIET=''
BENTOML_BUNDLE_LOCAL_BUILD=''
BENTOML_DO_NOT_TRACK=''
BENTOML_CONFIG=''
BENTOML_CONFIG_OPTIONS=''
BENTOML_PORT=''
BENTOML_HOST=''
BENTOML_API_WORKERS=''
```

#### System information

- `bentoml`: 1.3.1
- `python`: 3.10.12
- `platform`: Linux-6.9.3-76060903-generic-x86_64-with-glibc2.35
- `uid_gid`: 1000:1000

#### pip_packages

```
woeusb-ng==0.2.12
```

#### System information (Optional)

`uname --all`:

Linux pop-os 6.9.3-76060903-generic #202405300957~1721174657~22.04~abb7c06 SMP PREEMPT_DYNAMIC Wed J x86_64 x86_64 x86_64 GNU/Linux

bojiang commented 3 months ago

It's related to typer 0.7.0: https://github.com/fastapi/typer/discussions/656

`pip install typer -U` should fix that.
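For context, here is a minimal sketch of why the old typer breaks (an illustration, not OpenLLM's actual CLI code): the signature in the traceback uses the `Annotated` style, where `typer.Argument()` is called with no arguments and the default comes from the `= ''`. In typer 0.7.0, `Argument()` still required `default` as its first positional argument, so evaluating such a signature at import time raises exactly this `TypeError`. The sketch below assumes typer >= 0.9 is installed; the command name and parameters are made up for illustration.

```python
# Minimal sketch, assuming typer >= 0.9; `serve` and its parameters are
# illustrative only, not OpenLLM's actual code.
from typing import Annotated, Optional

import typer

app = typer.Typer()

@app.command()
def serve(
    # Annotated style: the default comes from `= ''`, so typer.Argument()
    # takes no positional argument. typer 0.7.0 instead expected
    # Argument(default, ...), hence the TypeError at import time.
    model: Annotated[str, typer.Argument()] = '',
    repo: Optional[str] = None,
    port: int = 3000,
    verbose: bool = False,
):
    typer.echo(f'model={model!r} repo={repo!r} port={port} verbose={verbose}')

if __name__ == '__main__':
    app()
```

To confirm which version is installed before and after upgrading, you can run `python -c "import typer; print(typer.__version__)"`.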

djmbritt commented 3 months ago

That fixed it, thank you.