-
**Is your feature request related to a problem? Please describe.**
Related to batch sender in #8122
Currently, the queue sender and batch sender work together fine, but the config `sending_queue.…
-
**Description**
Salt has the ability to run commands in batches. I noticed that:
1. The way minions are split into batches is very odd
1. Using batches sometimes leads to an exception
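For reference, here is a minimal stdlib-only sketch of what an even, consecutive split into fixed-size batches might look like (the helper name `split_into_batches` is hypothetical, purely for illustration — it is not part of Salt):

```python
def split_into_batches(minions, batch_size):
    """Split a list of minion IDs into consecutive batches of at most batch_size.

    Hypothetical illustration of the split one might expect from a
    batch-size option; the last batch may be smaller than batch_size.
    """
    return [minions[i:i + batch_size] for i in range(0, len(minions), batch_size)]

minions = [f"minion-{n}" for n in range(7)]
print(split_into_batches(minions, 3))
# → [['minion-0', 'minion-1', 'minion-2'], ['minion-3', 'minion-4', 'minion-5'], ['minion-6']]
```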
**Setup**
I have a …
-
Hi there. I'm wondering whether it is possible to use the `SentenceTransformerTrainer` class to train a model that includes an `Asym` module in its structure?
I'm asking, because due to the document…
-
## Description
## Context
As discussed in #1713
## Possible Implementation
To remove trait.dictionary.csv, probably need to edit the following files (as determined by `$(grep -R 'trait.…
-
-
### 🐛 Describe the bug
The DDP init call fails when using a subclass of torch.Tensor; the same code works with torch.Tensor.
Command to run the code:
python test.py --max-gpus 2 --batch-size 512 --epoch …
-
When I fine-tune MPT, the code runs fine. But when I fine-tune Llama, I get the following error.
```
----------Begin global rank 2 STDERR----------
2024-09-02 20:12:15,331: rank2[3924][MainThread]: DEBUG…
-
### Search before asking
- [X] I have searched the question and found no related answer.
### Please ask your question
The configuration file is as follows:
```
use_gpu: true
use_xpu: false
use…
-
Is it possible to remove a metric from a metric repository? I have looked at the APIs and see no obvious way to do this. Even if the ability exists to update a metric, i.e. I could add a status…
-
python build.py --hf_model_dir /app/model/Qwen1.5-14B-Chat \
    --dtype float16 \
    --remove_input_padding \
    --use_gemm_plugin float16 \
    …