-
## 🚀 Feature
Restructure the function `multi_head_attention_forward` in [nn.functional](https://github.com/pytorch/pytorch/blob/23b2fba79a6d2baadbb528b58ce6adb0ea929976/torch/nn/functional.py#L357…
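For context, the kernel such a restructuring would presumably factor out is scaled dot-product attention. A minimal NumPy sketch of that computation (illustrative only; this is not PyTorch's actual decomposition of `multi_head_attention_forward`):

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """softmax(q @ k^T / sqrt(d)) @ v -- the core attention kernel.
    q: (..., L_q, d), k: (..., L_k, d), v: (..., L_k, d)."""
    d = q.shape[-1]
    scores = q @ k.swapaxes(-2, -1) / np.sqrt(d)
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True) # softmax over keys
    return weights @ v, weights

# toy example: 2 heads, 4 query positions, 5 key positions, head dim 8
rng = np.random.default_rng(0)
q = rng.normal(size=(2, 4, 8))
k = rng.normal(size=(2, 5, 8))
v = rng.normal(size=(2, 5, 8))
out, w = scaled_dot_product_attention(q, k, v)
print(out.shape)  # (2, 4, 8)
```

A restructured `multi_head_attention_forward` could expose this inner step separately from the projection and reshaping logic, which is what makes such refactors useful to downstream callers.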
-
### 🚀 The feature, motivation and pitch
I want to implement the tree attention for vLLM mentioned in the [RoadMap](https://github.com/vllm-project/vllm/issues/3861). But I don’t know whether I should imple…
-
Post your questions here about: “[Text Learning with Sequences](https://docs.google.com/document/d/1vHoYMFH-53UpE528xv_-xhSrkjUELI7ihfXmz3J_As4/edit?usp=sharing)”, “[Text Learning with Attention](http…
-
Abstract: We present Compositional Attention Networks, a novel fully differentiable neural network architecture, designed to facilitate explicit and expressive reasoning. While many types of neural ne…
-
#229 and #304 suggest more attention should be paid to resource acquisition and release mechanisms. #304 is confirmed and #229 is a bit inconclusive. #294 suggests there is room for improvement as wel…
-
Some ideas floated in our meeting today:
- Reading on speech models
- Reading on hip-hop, lyrics, and language
- Reading on attention mechanisms in deep learning
- Workshopping figures or code i…
-
I'm trying to extend a project which implements a GNN using Battaglia et al.'s definition through the `MetaLayer` class.
I would like to include some attention mechanisms as defined [here](https://g…
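One common way to add attention to a `MetaLayer`-style node update is to weight incoming edge messages by learned, per-destination-normalized coefficients (GAT-style). A NumPy sketch of that aggregation step; the helper name and the attention vector `a` are hypothetical, not part of `MetaLayer`'s API:

```python
import numpy as np

def attention_aggregate(h, edge_index, a):
    """GAT-style attention aggregation (hypothetical helper).
    h: (N, F) node features; edge_index: (2, E) [src; dst] rows;
    a: (2F,) attention parameter vector."""
    src, dst = edge_index
    # unnormalized score per edge: leaky_relu(a . [h_src || h_dst])
    e = np.concatenate([h[src], h[dst]], axis=1) @ a
    e = np.where(e > 0, e, 0.2 * e)                # leaky ReLU
    # softmax over the incoming edges of each destination node
    # (subtracting the global max cancels within each group)
    alpha = np.exp(e - e.max())
    denom = np.zeros(h.shape[0])
    np.add.at(denom, dst, alpha)
    alpha = alpha / denom[dst]
    # weighted sum of source features into each destination
    out = np.zeros_like(h)
    np.add.at(out, dst, alpha[:, None] * h[src])
    return out, alpha

# toy graph: 3 nodes, edges 0->2, 1->2, 0->1
h = np.arange(6.0).reshape(3, 2)
edge_index = np.array([[0, 1, 0], [2, 2, 1]])
a = np.ones(4)
out, alpha = attention_aggregate(h, edge_index, a)
```

In a `MetaLayer`, this logic would live inside the node model, replacing the plain sum/mean over incoming edge messages; the edge model could also produce the scores instead of the dot product shown here.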
-
https://github.com/icdevs/ICEventsWG/issues/41#issuecomment-2206509473
-
For a long time the docs only talked about testnet. Docs written after launch started talking about mainnet too, and now we have a mix.
All pages should be reviewed to make sure mainnet is referred to…
-
### What happened?
Every time I access my Linux dashboard, it tells me to migrate from Angular, which I do, over and over and over again.
### What did you expect to happen?
I expe…