-
- Formulating LLM-related topics
-
Is it possible to integrate with Anything LLM?
-
### System Info
Hi,
I'm having trouble reproducing NVIDIA's claimed numbers in the table here: https://nvidia.github.io/TensorRT-LLM/performance/perf-overview.html#throughput-measurements
System Im…
-
### 🚀 The feature, motivation and pitch
It is common to have a scenario where folks want to deploy multiple vLLM instances on a single machine because the machine has several GPUs (commonly 8). …
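One common workaround today (a sketch, not an officially documented pattern) is to pin each server process to its own GPU via `CUDA_VISIBLE_DEVICES` and give each a distinct port; the model name below is a placeholder, not something from the original request:

```shell
# Sketch: one vLLM OpenAI-compatible server per GPU, each on its own port.
# The model name is a placeholder; substitute your own checkpoint.
CUDA_VISIBLE_DEVICES=0 python -m vllm.entrypoints.openai.api_server \
    --model meta-llama/Llama-3.1-8B-Instruct --port 8000 &
CUDA_VISIBLE_DEVICES=1 python -m vllm.entrypoints.openai.api_server \
    --model meta-llama/Llama-3.1-8B-Instruct --port 8001 &
wait
```

The feature request, as I read it, is to avoid this manual process management and have a single entry point schedule instances across GPUs.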
-
**Is your feature request related to a problem? Please describe.**
The recipient tool is of course critical to the utility of Langroid. While ChatGPT produces picture-perfect handling of the recipient to…
-
In the code snippet below, is it possible to load the Decoder/Encoder with pre-trained models from the Hugging Face Hub?
```python
augment_llm = TransformerWrapper(
    num_tokens = 20000,
    max_seq_len = 10…
```
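As far as I know (an assumption, not documented x-transformers behaviour), `TransformerWrapper` builds a randomly initialized model and has no built-in loader for Hub checkpoints. If the goal is simply to start from pretrained weights, one alternative sketch is to load the checkpoint with the `transformers` library directly; `gpt2` below is only an illustrative choice, not from the original snippet:

```python
# Sketch (assumption: the goal is pretrained weights, not x-transformers
# specifically): load a decoder checkpoint from the Hugging Face Hub.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # illustrative checkpoint, not from the original snippet
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# The result is a regular torch.nn.Module, so its weights could in principle
# be copied by hand into a matching x-transformers architecture.
print(model.config.vocab_size)
```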
-
### Area(s)
area:gen-ai, llm
### Is your change request related to a problem? Please describe.
Continuation of https://github.com/open-telemetry/semantic-conventions/issues/1007
To prevent…
-
The llms.txt initiative introduces a standardized Markdown file format.
I don't know whether we should follow this format.
[LLM Txt](https://llmstxt.org/)
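For reference, an llms.txt file per the spec at llmstxt.org is plain Markdown: an H1 project name, a blockquote summary, then sections of annotated links. A minimal sketch (all names and URLs below are placeholders):

```markdown
# ExampleProject

> Short summary of what ExampleProject does and what the linked docs cover.

## Docs

- [Quickstart](https://example.com/docs/quickstart.md): installation and first steps
- [API reference](https://example.com/docs/api.md): full endpoint listing

## Optional

- [Changelog](https://example.com/changelog.md): release history
```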
-
### Your current environment
The output of `python collect_env.py`
```text
python collect_env.py
Collecting environment information...
2024-09-23 17:57:46.577274: I tensorflow/core/util/po…
```
-
### What happened?
Using this config:
```yaml
model_list:
- model_name: bge-large-en-v1.5
litellm_params:
model: huggingface/BAAI/bge-large-en-v1.5
      api_base: http://localhost:80…
```