-
### Have you read and agreed to the Datawhale Open Source Project Guide?
- [X] I have read and agree to the [Datawhale Open Source Project Guide](https://github.com/datawhalechina/DOPMC/blob/main/GUIDE.md)
### Have you read and agreed to the Datawhale Open Source Project Code of Conduct?
- [X] I have read and agree to the [Datawhale Open Source Project Code of Conduct…
-
# An Easy-to-Understand Look at Transformers and Attention
Artificial intelligence…
[https://kakaotech-harmony.netlify.app/ai/transformer/](https://kakaotech-harmony.netlify.app/ai/transformer/)
-
It would be nice to have the transformer objects in Torch.
Actually, I have already implemented most of them (by translating the PyTorch code https://github.com/pytorch/pytorch/blob/main/torch/nn/…
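For context, the PyTorch transformer modules referenced above are typically wired together as in the minimal sketch below (the hyperparameters and tensor shapes are illustrative assumptions, not values from the issue):

```python
import torch
import torch.nn as nn

# Illustrative hyperparameters (assumptions, not from the issue).
d_model, nhead, num_layers = 512, 8, 6

# One encoder layer plus a stack of them, as defined in torch/nn/modules/transformer.py.
encoder_layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=nhead, batch_first=True)
encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)

src = torch.randn(2, 10, d_model)  # (batch, seq_len, d_model) with batch_first=True
out = encoder(src)                 # output keeps the same shape as src
print(out.shape)                   # torch.Size([2, 10, 512])
```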
-
Huggingface Transformer Support
-
### Description
When using the new plugins config, I am unable to use the date transformer without also creating the transformer functions. I only need the types for my use case. This was possible in…
-
Hi, I would like to pre-train the model myself to gain a better understanding of machine learning models.
Specifically, **could you provide the code that was used to pre-train the v2 500m multi-sp…
-
The model is loaded into the code in the .pt format.
I got the following error while executing the inference code.
How can I solve it?
![image](https://github.com/user-attachments/assets/0b8cab91-34b…
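Without the full traceback it is hard to pin down the cause, but the usual way a `.pt` checkpoint is inspected and loaded looks roughly like the sketch below (the file name is a hypothetical placeholder, not from the issue):

```python
import torch

# Hypothetical file name; substitute the actual checkpoint path.
checkpoint = torch.load("model.pt", map_location="cpu")

# A .pt file may contain either a whole pickled model or only a state_dict;
# checking which one it is is a common first step when inference fails.
if isinstance(checkpoint, dict):
    print(checkpoint.keys())   # e.g. 'state_dict', 'model', 'optimizer', ...
else:
    print(type(checkpoint))    # a full nn.Module was serialized
```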
-
Hello team:
I have two questions: how can we adjust the hierarchy (stacking order) of multiple videos, and how can we make the second video start playing at a specified time in the first video?
-
### Describe the bug
Is it possible to get back the `attention_mask` argument in the flux attention processor?
```
hidden_states = F.scaled_dot_product_attention(query, key, value, dropout_p=0.…
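# -- Illustrative sketch (not from the original snippet): how an attention
# mask would typically be passed through F.scaled_dot_product_attention.
# All tensor names and shapes below are assumptions.
import torch
import torch.nn.functional as F

batch, heads, seq, head_dim = 1, 8, 16, 64
query = torch.randn(batch, heads, seq, head_dim)
key = torch.randn(batch, heads, seq, head_dim)
value = torch.randn(batch, heads, seq, head_dim)
# Boolean mask broadcastable to (batch, heads, seq, seq); True = attend.
attention_mask = torch.ones(batch, 1, seq, seq, dtype=torch.bool)

hidden_states = F.scaled_dot_product_attention(
    query, key, value, attn_mask=attention_mask, dropout_p=0.0
)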
-
### 🚀 The feature, motivation and pitch
1. [Exphormer: Sparse Transformers for Graphs](https://arxiv.org/abs/2303.06147)
2. [SGFormer: Simplifying and Empowering Transformers for Large-Graph Represe…