-
I was looking at your implementation of attention here:
https://github.com/cvignac/DiGress/blob/main/src/models/transformer_model.py#L158
I have some questions about the code:
```python
Q = Q.…
```
Forbu updated 7 months ago
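For reference when reading that code, the standard scaled dot-product attention that transformer layers build on can be sketched as follows. This is the generic textbook formulation in NumPy, not DiGress's actual graph-transformer variant (which also handles node and edge features):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Generic attention: softmax(Q K^T / sqrt(d)) V."""
    d = Q.shape[-1]
    scores = Q @ K.swapaxes(-2, -1) / np.sqrt(d)    # pairwise similarity logits
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # each row sums to 1
    return weights @ V

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (4, 8)
```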
-
Most of Sage's references are in the master bibliography file, `src/doc/en/reference/references/index.rst`, but a few directories need attention.
- combinat (#28105, #28445)
- graphs (#28084, #2844…
-
I think it would be useful to have the new edge attention graph convolution described in this [paper](http://openaccess.thecvf.com/content_CVPR_2019/papers/Gong_Exploiting_Edge_Features_for_Graph_Neur…
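To illustrate the general idea (edge features modulating GAT-style attention coefficients), here is a minimal NumPy sketch. The function name and details are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def edge_attention_conv(H, E, W, a):
    """One edge-aware attention convolution step (illustrative sketch).

    H: (n, f_in) node features; E: (n, n) non-negative edge features,
    where 0 means "no edge"; W: (f_in, f_out) weights; a: (2*f_out,)
    attention vector.
    """
    Z = H @ W                                   # transformed node features
    f_out = Z.shape[1]
    # GAT-style pairwise logits a^T [z_i || z_j], split into two halves
    left = Z @ a[:f_out]                        # (n,)
    right = Z @ a[f_out:]                       # (n,)
    logits = left[:, None] + right[None, :]     # (n, n)
    logits = np.where(logits > 0, logits, 0.2 * logits)   # LeakyReLU
    # edge features rescale the unnormalized weights; non-edges get zero
    scores = np.exp(logits - logits.max(axis=1, keepdims=True)) * E
    alpha = scores / np.maximum(scores.sum(axis=1, keepdims=True), 1e-12)
    return alpha @ Z                            # attention-weighted aggregation
```

With `E` set to the identity matrix, each node attends only to itself and the layer reduces to the linear map `H @ W`, which is a quick sanity check on the masking behavior.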
-
Ensure that nn.MHA is supported by FX graph mode quantization. Since this is a complex nn module, special handling might be required.
cc @jerryzh168 @jianyuh @jamesr66a @vkuzo @jgong5 @Xia-Weiwe…
-
[Line no 226, graph_attention_learning.py, Watch Your Step] return tf.transpose(d_sum) * **GetNumNodes()** * 80, feed_dict
Why is this seemingly arbitrary scaling by the number of nodes applied? I am not sure if it…
-
Hi mate,
I'm trying to reproduce the experiment results using WS-DAN/Xception, and I'm impressed by the implementation of the WS-DAN network.
However, in train-wsdan.py, when I try to iterate the dataloa…
-
### vllm latest
I added some logging in /vllm/model_executor/models/llama.py; I want to print the attention, like this:
When I start the LLM server, the error is:
[rank0]: During handling of the above e…
-
First off -- AMAZING TTS!!!
I know I'm repeating several other issues that have been opened, but I've spent several days testing and code tweaking to try to resolve the issues I have found, and wan…
-
Hi @yuyuanhang, nice work!
Turning to k-NN graph construction: will you add a k-NN graph construction API to this project?
Based on the paper, the k-NN graph construction is also in a divide-and-conquer f…
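As a point of comparison for such an API, a brute-force k-NN graph baseline can be sketched in a few lines of NumPy. This is the naive O(n²) approach, not the divide-and-conquer construction from the paper:

```python
import numpy as np

def knn_graph(X, k):
    """Brute-force k-NN graph: for each row of X, return the indices of
    its k nearest neighbors by Euclidean distance (excluding itself)."""
    # pairwise squared distances via |x - y|^2 = |x|^2 + |y|^2 - 2 x.y
    sq = (X ** 2).sum(axis=1)
    D = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    np.fill_diagonal(D, np.inf)             # exclude self-loops
    return np.argsort(D, axis=1)[:, :k]     # (n, k) neighbor indices

X = np.array([[0.0], [1.0], [10.0], [11.0]])
print(knn_graph(X, 1).ravel())  # prints [1 0 3 2]
```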
-
### Describe the feature request
It would be great to have the option to provide pre-optimised TensorRT engine plans to ORT.
### Describe scenario use case
Using TensorRT standalone, e.g. trtex…