bentoml / OpenLLM
Run any open-source LLM, such as Llama 3.1 or Gemma, as an OpenAI-compatible API endpoint in the cloud.
https://bentoml.com
Apache License 2.0 · 9.7k stars · 616 forks
Issues (newest first)
#969 chore(deps): bump vllm from 0.4.0 to 0.4.1 in /openllm-python (dependabot[bot], closed 4 months ago, 1 comment)
#968 For AMD/GPU, how to use multi GPUS in the api_server.py (williamZQ, closed 1 month ago, 2 comments)
#967 bug: error coming up while install the vllm using pip install "openllm[vllm]" (Developer-atomic-amardeep, closed 1 month ago, 1 comment)
#966 feat: support LMDeploy backend (zhyncs, closed 1 month ago, 7 comments)
#965 bug: WARNING: openllm 0.4.44 does not provide the extra 'gemma' (infinite-Joy, closed 3 months ago, 1 comment)
#964 Update README.md (parano, closed 4 months ago, 0 comments)
#963 chore(deps): bump taiki-e/install-action from 2.32.9 to 2.33.3 (dependabot[bot], closed 4 months ago, 1 comment)
#961 feat: Can you support llama3? (ICLXL, closed 3 months ago, 3 comments)
#960 bug: Not enough data for satisfy transfer length header (Cherchercher, closed 1 month ago, 3 comments)
#959 ci: pre-commit autoupdate [pre-commit.ci] (pre-commit-ci[bot], closed 4 months ago, 0 comments)
#958 chore(deps): bump pypa/gh-action-pypi-publish from 1.8.11 to 1.8.14 (dependabot[bot], closed 4 months ago, 0 comments)
#957 chore(deps): bump taiki-e/install-action from 2.32.9 to 2.32.17 (dependabot[bot], closed 4 months ago, 1 comment)
#956 chore(deps): bump docker/metadata-action from 5.5.0 to 5.5.1 (dependabot[bot], closed 4 months ago, 0 comments)
#955 chore(deps): bump actions/setup-python from 5.0.0 to 5.1.0 (dependabot[bot], closed 4 months ago, 0 comments)
#954 chore(deps): bump sigstore/cosign-installer from 3.4.0 to 3.5.0 (dependabot[bot], closed 4 months ago, 0 comments)
#953 chore(deps): bump vllm from 0.4.0 to 0.4.0.post1 in /openllm-python (dependabot[bot], closed 4 months ago, 1 comment)
#952 bug: An exception occurred while instantiating runner 'llm-mistral-runner' (billy-le, closed 3 months ago, 2 comments)
#951 feat: any plan to support NPU (zhangxinyang97, opened 4 months ago, 1 comment)
#950 feat(amd): support ROCm detection (younseojava, closed 3 months ago, 2 comments)
#949 docs: Update high-level messaging (Sherlock113, closed 4 months ago, 0 comments)
#948 feat: support Qwen1.5 (sudazzk, closed 1 month ago, 2 comments)
#947 ci: pre-commit autoupdate [pre-commit.ci] (pre-commit-ci[bot], closed 4 months ago, 0 comments)
#946 chore(deps): bump aquasecurity/trivy-action from 0.18.0 to 0.19.0 (dependabot[bot], closed 4 months ago, 0 comments)
#945 chore(deps): bump taiki-e/install-action from 2.27.9 to 2.32.9 (dependabot[bot], closed 4 months ago, 0 comments)
#944 feat: c4ai-command-r-v01 support (0x77dev, closed 3 months ago, 3 comments)
#943 fix(compat): use annotated type from `typing_compat` (rudeigerc, closed 5 months ago, 1 comment)
#942 ci: pre-commit autoupdate [pre-commit.ci] (pre-commit-ci[bot], closed 5 months ago, 0 comments)
#941 chore(deps): bump docker/setup-buildx-action from 3.0.0 to 3.2.0 (dependabot[bot], closed 5 months ago, 0 comments)
#940 chore(deps): bump actions/checkout from 4.1.1 to 4.1.2 (dependabot[bot], closed 4 months ago, 1 comment)
#939 chore(deps): bump github/codeql-action from 3.24.3 to 3.24.9 (dependabot[bot], closed 5 months ago, 0 comments)
#938 Fixes Deprecation warning for PyTorch (shubh1777, closed 3 months ago, 0 comments)
#937 FileNotFoundError: [Errno 2] No such file or directory: b'/root/bentoml/models/pt-google--gemma-7b-it/latest' (PaoloBarba, closed 1 month ago, 0 comments)
#936 Deprecation Warning for PyTorch Backend (byteshiva, closed 3 months ago, 2 comments)
#935 ci: pre-commit autoupdate [pre-commit.ci] (pre-commit-ci[bot], closed 5 months ago, 0 comments)
#934 Deploying LLM in On-Premises Server to Assist Users to Launch Locally in Work Laptop - Web Browser (sanket038, opened 5 months ago, 3 comments)
#933 chore(deps): bump marocchino/sticky-pull-request-comment from 2.8.0 to 2.9.0 (dependabot[bot], closed 5 months ago, 0 comments)
#932 chore(deps): bump aquasecurity/trivy-action from 0.16.1 to 0.18.0 (dependabot[bot], closed 5 months ago, 0 comments)
#931 ci: pre-commit autoupdate [pre-commit.ci] (pre-commit-ci[bot], closed 5 months ago, 0 comments)
#930 feat: support volta architecture GPUs for the vLLM backend (K-Mistele, opened 5 months ago, 0 comments)
#929 I'm having trouble getting statted with openllm, but I don't want to use conda and I have WSL2 (Lightwave234, opened 5 months ago, 1 comment)
#928 chore(deps-dev): update bitsandbytes requirement from <0.42 to <0.44 in /openllm-python (dependabot[bot], closed 4 months ago, 1 comment)
#926 bug: start chatglm-6b locally err (zhangxinyang97, closed 1 month ago, 1 comment)
#925 chore(deps-dev): bump vllm from 0.2.7 to 0.3.3 in /openllm-python (dependabot[bot], closed 4 months ago, 2 comments)
#924 bug: not generate eos_token when using qwen7B-chat (qiufengyuyi, closed 1 month ago, 1 comment)
#923 feature: Add JAIS Models (aliozts, closed 3 months ago, 1 comment)
#920 chore(deps): bump taiki-e/install-action from 2.26.18 to 2.27.9 (dependabot[bot], closed 5 months ago, 1 comment)
#919 chore(deps): bump aws-actions/configure-aws-credentials from 4.0.1 to 4.0.2 (dependabot[bot], closed 5 months ago, 1 comment)
#918 chore(deps-dev): bump vllm from 0.2.7 to 0.3.2 in /openllm-python (dependabot[bot], closed 6 months ago, 1 comment)
#917 chore(deps-dev): bump ray from 2.6.0 to 2.9.3 in /openllm-python (dependabot[bot], closed 6 months ago, 1 comment)
#916 fix: typo of cli (Zerohertz, closed 3 months ago, 0 comments)