-
Hi, here are some questions puzzling me:
1. There is one file named "rel_1" in the dataset.zip. What information does it contain? Does it represent the aligned maps between the text-graph and visual-…
-
We've implemented a GCN, which seems to be one of the earliest GNN architectures.
Since then, architectures such as GraphSAGE and the Graph Attention Network (GAT) have built on top of it.
From a quick glan…
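For context, the core difference GAT adds over a GCN is learned, per-edge weighting: a score e_ij = LeakyReLU(a · [h_i ‖ h_j]) is softmax-normalized over each node's neighbourhood. A minimal pure-Python sketch of that attention step (illustrative only, not this repository's code; the feature vectors and attention vector `a` below are made-up toy values):

```python
import math

def leaky_relu(x, slope=0.2):
    return x if x > 0 else slope * x

def gat_attention(h, neighbors, a):
    """GAT-style attention coefficients (Velickovic et al., sketch).

    h         : dict node -> feature vector (assumed already W-transformed)
    neighbors : dict node -> list of neighbor node ids
    a         : attention vector of length 2 * feature dim
    Returns dict node -> {neighbor: alpha}, alphas summing to 1 per node.
    """
    alphas = {}
    for i, nbrs in neighbors.items():
        # raw score e_ij = LeakyReLU(a . [h_i || h_j])
        scores = []
        for j in nbrs:
            concat = h[i] + h[j]  # list concatenation plays the role of [h_i || h_j]
            scores.append(leaky_relu(sum(ak * xk for ak, xk in zip(a, concat))))
        # softmax over the neighbourhood (max-shifted for stability)
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        z = sum(exps)
        alphas[i] = {j: e / z for j, e in zip(nbrs, exps)}
    return alphas

h = {0: [1.0, 0.0], 1: [0.0, 1.0], 2: [1.0, 1.0]}
neighbors = {0: [1, 2]}
alpha = gat_attention(h, neighbors, [0.5, -0.25, 0.5, -0.25])
```

A plain GCN would instead use fixed degree-based weights here; GraphSAGE replaces the weighted sum with a sampled-neighbourhood aggregator.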
-
Hi,
I encountered the following error message when trying to enable flash attention with the command below. Could you let me know whether flash attention is supported?
``command: ./main -m $model -n 128 --prompt …
-
Thanks for your impressive work.
In fact, I am trying to modify parts of this structure, though I did not change anything about the attention mechanism. After training my new model successfully, I…
-
Hi, great work!
I am very interested in the knowledge graph you mentioned, but you have only released the relevant weight files. Could you share the code that processes the adjacency matrix and th…
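The repository's actual preprocessing code isn't public, but for reference, a typical pipeline builds a symmetrically normalized adjacency matrix D^{-1/2}(A + I)D^{-1/2} from (head, relation, tail) triples. A self-contained sketch under that assumption (the entities and triples below are hypothetical examples, not from the dataset):

```python
def build_adjacency(triples, entities):
    """Build a dense adjacency matrix from (head, rel, tail) triples,
    then apply the GCN-style normalization D^{-1/2} (A + I) D^{-1/2}."""
    idx = {e: i for i, e in enumerate(entities)}
    n = len(entities)
    A = [[0.0] * n for _ in range(n)]
    for head, _, tail in triples:
        A[idx[head]][idx[tail]] = 1.0
        A[idx[tail]][idx[head]] = 1.0  # treat the graph as undirected
    for i in range(n):                 # add self-loops (the +I term)
        A[i][i] = 1.0
    deg = [sum(row) for row in A]
    return [[A[i][j] / (deg[i] ** 0.5 * deg[j] ** 0.5) for j in range(n)]
            for i in range(n)]

triples = [("cat", "is_a", "animal"), ("dog", "is_a", "animal")]
A_hat = build_adjacency(triples, ["cat", "dog", "animal"])
```

Whether relations are collapsed into a single graph as above, or kept as one adjacency per relation type, depends on the model; the paper's code may differ.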
-
Is there a way to list the currently registered custom ops? Not just the file used to register the operators in the ORT session (`so.register_custom_ops_library(g…
-
### Describe the bug
I trained a GAT for binary node classification with the following structure
```
______________________________________________________________________________________…
-
Thanks for your excellent work.
Looking at your code, I would like to know whether the X_data samples in the same batch share the same learned graph?
-
Most of Sage's references are in the master bibliography file, `src/doc/en/reference/references/index.rst`, but a few directories need attention.
- combinat (#28105, #28445)
- graphs (#28084, #2844…
-
I was looking at your implementation of attention here:
https://github.com/cvignac/DiGress/blob/main/src/models/transformer_model.py#L158
I have some questions about the code:
```python
Q = Q.…
```
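The quoted line is cut off above, so this is not DiGress's exact code, but as context for the question: at that point a multi-head attention implementation typically reshapes Q from (seq_len, d_model) into per-head views of width d_model / n_heads before computing scores. A pure-Python sketch of that split (names here are illustrative):

```python
def split_heads(X, n_heads):
    """Split a (seq_len, d_model) matrix into n_heads views of shape
    (seq_len, d_model // n_heads) -- the pure-Python analogue of
    Q.reshape(seq_len, n_heads, head_dim).transpose(0, 1)."""
    d_model = len(X[0])
    assert d_model % n_heads == 0, "d_model must be divisible by n_heads"
    dh = d_model // n_heads
    # head h takes columns [h*dh, (h+1)*dh) of every row
    return [[row[h * dh:(h + 1) * dh] for row in X] for h in range(n_heads)]

Q = [[1, 2, 3, 4], [5, 6, 7, 8]]
heads = split_heads(Q, 2)
```

Each head then attends independently over its slice, and the results are concatenated back to d_model at the end.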