-
Hi, how do I get Llama-3.2 to work with ipex_llm?
Here's my code.
```
import requests
import torch
from PIL import Image
from transformers import MllamaForConditionalGeneration, AutoProcessor
imp…
```
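For reference, the usual ipex_llm pattern is to load the model with plain `transformers` first and then wrap it. This is a minimal sketch, assuming `optimize_model` and the `low_bit="sym_int4"` option are available in your ipex-llm version and that Mllama is a supported architecture; check the ipex-llm docs for your release:

```python
import torch
from transformers import MllamaForConditionalGeneration, AutoProcessor
from ipex_llm import optimize_model  # assumed import path; verify against your ipex-llm version

model_id = "meta-llama/Llama-3.2-11B-Vision-Instruct"  # hypothetical checkpoint id

# Load on CPU with the plain Hugging Face API first.
model = MllamaForConditionalGeneration.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    low_cpu_mem_usage=True,
)
processor = AutoProcessor.from_pretrained(model_id)

# Let ipex_llm swap supported layers for low-bit kernels.
# low_bit="sym_int4" (4-bit weight quantization) is an assumption here.
model = optimize_model(model, low_bit="sym_int4")

# Move to an Intel GPU if one is available (requires the XPU build of
# PyTorch / intel_extension_for_pytorch); otherwise stay on CPU.
model = model.to("xpu")
```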
-
### Question
Hello, I recently read your Relation DETR paper and learned a lot from it. I have a few questions I'd like to ask:
1. In Figure 1 of the paper, you use the MC metric to show the positional relations of object boxes across different datasets, demonstrating the statistical significance of position. Notably, the task-specific datasets score further to the right. Does this mean your method is better suited to task-specific datasets? Have you experimented on the ESD, CSD, and MSSD datasets, and is the improvement more pronounced there?
2. In Equation 8, why use M…
-
A full explanation of the problem, and a minimal case for reproducing it, can be found here: https://github.com/Lichborne/IdrisLinearityIssue/blob/main/ErrorExamples.idr
Depends only on Data.Linear.LVect…
-
![Screenshot1](https://github.com/user-attachments/assets/41a0d22d-4a4c-4d20-954d-70cf83b58e2c)
Canny is not working at all. Please find the workflow attached. Am I doing something wrong?
…
-
**Describe the bug**
I'm running into the following error while trying to train with the bert-large-uncased or roberta-large models, although the bert-base-uncased and roberta-base models were trained succes…
-
Hello,
I am very interested in your research and am currently trying to run some experiments based on it. However, I encountered an issue while running the program from the HuggingFace_EncDec directo…
-
in the model.py:
```python
self.encoder_layer = nn.TransformerEncoderLayer(d_model=feature_size, nhead=7, dropout=dropout)
self.transformer_encoder = nn.TransformerEncoder(self.en…
```
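One thing worth checking here: `nn.TransformerEncoderLayer` requires `d_model` to be divisible by `nhead`, so `nhead=7` only works when `feature_size` is a multiple of 7; otherwise `nn.MultiheadAttention` raises "embed_dim must be divisible by num_heads". A minimal runnable sketch, with a hypothetical `feature_size`:

```python
import torch
import torch.nn as nn

feature_size = 14  # hypothetical value; must be a multiple of nhead=7
encoder_layer = nn.TransformerEncoderLayer(
    d_model=feature_size, nhead=7, dropout=0.1, batch_first=True
)
transformer_encoder = nn.TransformerEncoder(encoder_layer, num_layers=2)

x = torch.randn(8, 10, feature_size)  # (batch, seq_len, d_model)
out = transformer_encoder(x)
print(out.shape)  # torch.Size([8, 10, 14])
```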
-
## 🚀 Feature
We can reduce the number of parameters by sharing the linear projection layers for queries and keys in the nn.Transformer and nn.activation.MultiheadAttention modules.
I think we …
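For what it's worth, here is a minimal single-head sketch of the idea, not the `nn.MultiheadAttention` API itself, just an illustration of sharing one projection for queries and keys (the class and its names are hypothetical):

```python
import torch
import torch.nn as nn

class SharedQKAttention(nn.Module):
    """Single-head attention where queries and keys share one projection
    matrix (W_q == W_k), halving the q/k projection parameters."""

    def __init__(self, d_model: int):
        super().__init__()
        self.qk_proj = nn.Linear(d_model, d_model)  # shared for Q and K
        self.v_proj = nn.Linear(d_model, d_model)
        self.out_proj = nn.Linear(d_model, d_model)
        self.scale = d_model ** -0.5

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        q = self.qk_proj(x)
        k = self.qk_proj(x)  # identical weights; for self-attention q == k,
        v = self.v_proj(x)   # which makes the pre-softmax scores symmetric
        attn = torch.softmax(q @ k.transpose(-2, -1) * self.scale, dim=-1)
        return self.out_proj(attn @ v)

x = torch.randn(2, 10, 64)
print(SharedQKAttention(64)(x).shape)  # torch.Size([2, 10, 64])
```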
-
### Describe the bug
When specifying `num_layers` in `SD3ControlNetModel.from_transformer`, the call fails with an unexpected key(s) error. I think this is because in `SD3ControlNetModel`…
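A possible workaround until this is fixed: build the shallower ControlNet from a config instead of `from_transformer`, then copy over only the parameters whose names and shapes match. A rough sketch; the checkpoint id and the reduced depth are assumptions for illustration:

```python
import torch
from diffusers import SD3ControlNetModel, SD3Transformer2DModel

# Hypothetical (gated) checkpoint, used only to illustrate the pattern.
transformer = SD3Transformer2DModel.from_pretrained(
    "stabilityai/stable-diffusion-3-medium-diffusers", subfolder="transformer"
)

# Construct a shallower ControlNet from the transformer's config;
# unexpected config keys are ignored by from_config.
config = dict(transformer.config)
config["num_layers"] = 6  # hypothetical reduced depth
controlnet = SD3ControlNetModel.from_config(config)

# Copy only the tensors that exist in both models with matching shapes.
src = transformer.state_dict()
dst = controlnet.state_dict()
shared = {k: v for k, v in src.items() if k in dst and v.shape == dst[k].shape}
result = controlnet.load_state_dict(shared, strict=False)
print(f"copied {len(shared)} tensors; {len(result.missing_keys)} keys left at init")
```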
-
### Describe the bug
I trained a LoRA with SimpleTuner using the ai-toolkit preset (I used all+ffs and others, and it doesn't train correctly on hard concepts).
Now I have this issue when loading the …