-
From [Algorithmic Simplicity](https://www.youtube.com/@algorithmicsimplicity):
- [x] [Why Does Diffusion Work Better than Auto-Regression? - YouTube](https://www.youtube.com/watch?v=zc5NTeJbk-k)
-…
-
Hey John! Here's the curriculum that I've worked on in the past. It's a bit less focused on language models as a sole topic, and more on modern ML from a broad perspective.
- Essential Concepts of …
-
## Jiphyeonjeon Intermediate Study Group
- Sunday, July 10, 2022, 9:00
- Presenter: Eunseo Kim
- Paper link: https://arxiv.org/abs/1710.10903
> ### Abstract
> We present graph attention networks (GATs), novel neural network architectures that ope…
-
## In a Nutshell
Graph convolution with attention added. A shared weight matrix W is applied to each node's feature vector, and to compute "how important node B is to node A," the transformed vectors are concatenated and an attention score is computed from them (the computation covers neighboring nodes only).
![image](https://user-images.githubusercontent.co…
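As a concrete illustration of the mechanism described above, here is a minimal single-head sketch in PyTorch (a sketch with my own naming, not the authors' reference implementation; multi-head concatenation is omitted for brevity):

```python
import torch
import torch.nn.functional as F

class GATLayer(torch.nn.Module):
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.W = torch.nn.Linear(in_dim, out_dim, bias=False)  # shared weight W
        self.a = torch.nn.Linear(2 * out_dim, 1, bias=False)   # attention vector a

    def forward(self, h, adj):
        # h: (N, in_dim) node features; adj: (N, N) adjacency with self-loops.
        Wh = self.W(h)                                    # (N, out_dim)
        N = Wh.size(0)
        # Build the concatenation [Wh_i || Wh_j] for every ordered pair (i, j).
        pairs = torch.cat([Wh.unsqueeze(1).expand(N, N, -1),
                           Wh.unsqueeze(0).expand(N, N, -1)], dim=-1)
        e = F.leaky_relu(self.a(pairs).squeeze(-1), negative_slope=0.2)  # (N, N)
        # Attention is computed over neighbors only: mask out non-edges.
        alpha = torch.softmax(e.masked_fill(adj == 0, float("-inf")), dim=-1)
        return alpha @ Wh                                 # attention-weighted sum
```

With self-loops present in `adj`, each node also attends to itself, matching the paper's setup.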
-
Venue: ICLR 2018
Summary: Proposes a new GCN architecture that uses an attention mechanism to learn a weight for each neighbor of every node.
My opinion: This architecture is too complicated with …
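For reference, the weight learned for each neighbor $j \in \mathcal{N}_i$ of node $i$ is the paper's attention coefficient:

$$
\alpha_{ij} = \frac{\exp\!\big(\mathrm{LeakyReLU}\big(\mathbf{a}^{\top}[\mathbf{W}\vec{h}_i \,\|\, \mathbf{W}\vec{h}_j]\big)\big)}{\sum_{k \in \mathcal{N}_i}\exp\!\big(\mathrm{LeakyReLU}\big(\mathbf{a}^{\top}[\mathbf{W}\vec{h}_i \,\|\, \mathbf{W}\vec{h}_k]\big)\big)}
$$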
-
### Please describe the purpose of the feature. Is it related to a problem?
I am inquiring about possibly integrating JAX-based Graph Neural Networks (GNNs) into MAVA for use in MARL. Many MARL algor…
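For context, the message-passing core of such a layer is small in JAX; a minimal sketch (function and parameter names are illustrative, not an actual MAVA or jraph API):

```python
import jax

def gnn_layer(params, nodes, senders, receivers):
    # nodes: (N, D) agent/node features; senders, receivers: (E,) edge indices.
    messages = nodes[senders] @ params["W_msg"]           # (E, D') per-edge messages
    # Sum the messages arriving at each receiver node.
    incoming = jax.ops.segment_sum(messages, receivers,
                                   num_segments=nodes.shape[0])
    return jax.nn.relu(nodes @ params["W_self"] + incoming)  # (N, D') updated nodes
```

Libraries such as jraph already package this pattern in pure JAX, which could make the integration relatively lightweight.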
-
# On the Global Self-attention Mechanism for Graph Convolutional Networks [[Wang+, 20](https://arxiv.org/abs/2010.10711)]
## Abstract
- Applies global self-attention (GSA) to GCNs
- GSA allows GCNs…
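For orientation, "global" here means every node attends to every other node rather than only to graph neighbors; a generic sketch of such all-pairs attention over node features (plain scaled dot-product attention, not necessarily the paper's exact GSA formulation):

```python
import torch

def global_self_attention(h, Wq, Wk, Wv):
    # h: (N, D) node features; Wq, Wk, Wv: (D, D') learned projections.
    q, k, v = h @ Wq, h @ Wk, h @ Wv
    scores = q @ k.T / (k.shape[-1] ** 0.5)    # (N, N): no adjacency mask applied
    return torch.softmax(scores, dim=-1) @ v   # (N, D') globally attended features
```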
-
Could you please share the complete code?
-
Hi,
I have been getting errors like the one below when trying to export to ONNX a model in which I manually pass a `scale` argument to the scaled dot-product attention calls:
```
…
```
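For reference, a hypothetical minimal setup of the kind that hits this (module, shapes, and file name are made up for illustration; `scale` is the optional argument to `torch.nn.functional.scaled_dot_product_attention`):

```python
import torch
import torch.nn.functional as F

class Attn(torch.nn.Module):
    def forward(self, q, k, v):
        # Passing an explicit `scale` here is what the export trips over.
        return F.scaled_dot_product_attention(q, k, v, scale=0.125)

q = k = v = torch.randn(1, 4, 8, 16)
torch.onnx.export(Attn(), (q, k, v), "attn.onnx", opset_version=17)
```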
-
* [ ] 1. [Hierarchical Pooling in Graph Neural Networks to Enhance Classification Performance in Large Datasets](https://www.mdpi.com/1424-8220/21/18/6070/htm)
* [ ] 2. [Hierarchical Graph Pooling wi…