-
- https://arxiv.org/abs/2107.02192
- 2021
Transformers have been successful in both the language and vision domains.
However, scaling them to long sequences, such as long documents or high-resolution images, is prohibitively costly, because the self-attention mechanism has quadratic time and memory complexity in the input sequence length.
In this paper, for both language and vision tasks, long se…
e4exp updated
3 years ago
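The quadratic cost noted in the abstract comes from materializing an n×n attention-score matrix. As a minimal NumPy sketch of plain softmax attention (not the paper's proposed method), with toy sizes chosen for illustration:

```python
import numpy as np

def attention(Q, K, V):
    # scores is an (n, n) matrix: time and memory grow quadratically in n
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    # numerically stable row-wise softmax
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

n, d = 1024, 64
x = np.random.randn(n, d)
out = attention(x, x, x)  # intermediate scores hold n*n = ~1M entries for n=1024
```

Doubling the sequence length quadruples the size of the intermediate `scores` matrix, which is exactly what long-sequence variants try to avoid.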
-
Thank you for taking the time to review my question.
Before I proceed, I would like to mention that I am a beginner, and I would appreciate your consideration of this fact.
I am seeking assistan…
-
Original Repository: https://github.com/ml-explore/mlx-examples/
Listing examples from there that would be nice to have. We don't expect the models to work the moment they are translated to …
-
The paper https://arxiv.org/abs/quant-ph/0406176 introduces an algorithm for performing the Quantum Shannon Decomposition. We have this algorithm implemented in Cirq at https://github.com/quantumlib/Cirq/…
-
Hi,
I've found your paper "Parameter Efficient Fine-Tuning of Pre-trained Code Models for Just-in-Time Defect Prediction." I'm trying to reproduce your results with CodeReviewer. However, I came up…
-
I'd love to see Sentence Transformers getting added into Hoarder for enhanced semantic search capabilities. It could make finding bookmarks much more efficient and user-friendly.
For reference, yo…
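For illustration, the retrieval step such a feature relies on is a cosine-similarity ranking over precomputed embeddings. A minimal NumPy sketch follows; the `top_k` helper and toy vectors are hypothetical, not Hoarder or Sentence Transformers APIs, and the embeddings are assumed to come from a model such as one loaded via `SentenceTransformer`:

```python
import numpy as np

def top_k(query_vec, doc_vecs, k=3):
    """Return the indices and scores of the k bookmarks most similar to the query."""
    # normalize so the dot product equals cosine similarity
    q = query_vec / np.linalg.norm(query_vec)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    sims = d @ q
    order = np.argsort(-sims)[:k]  # highest similarity first
    return order, sims[order]
```

In practice the bookmark embeddings would be computed once at save time and stored, so each search is just this ranking step.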
-
There is a typo here:
https://github.com/alimama-creative/FLUX-Controlnet-Inpainting/blob/7c00862a8341ab8163e297552cb36627a260fccb/main.py#L17
**The fix**
The fix is to replace `torch_dytpe` wit…
-
### Feature request
https://github.com/huggingface/transformers/blob/0fdea8607d7e01eb0e38a1ebeb7feee30a22f0cf/src/transformers/modeling_attn_mask_utils.py#L332
Here we can just assume the user pro…
-
## Description:
Hello! I’ve been following the development of this repository and appreciate the efforts to benchmark various efficient Transformer variants. I’d like to propose the implementation of…
-
https://openreview.net/attachment?id=rkgNKkHtvB&name=original_pdf
openreview: https://openreview.net/forum?id=rkgNKkHtvB
google ai blog: https://ai.googleblog.com/2020/01/reformer-efficient-tran…