-
![image](https://github.com/king-yyf/CMeKG_tools/assets/63271390/7147f29f-a0a6-4fc4-88b0-24af1e293d8b)
Is this a problem caused by the version of the transformers library?
-
### Issue Description
The SHAP explanations do not sum up to the model's output! This is either because of a rounding error or because an operator in your computation graph was not fully supported. I…
-
Currently, the LSTM model remembers 'k' features (words) at each time step. To improve the model
- An attention mechanism can be utilized to select only the attentive features (words) from the encodi…
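A minimal sketch of the kind of attention over LSTM hidden states described above, written with tf.keras purely as an illustration; the dimensions, layer choices, and variable names are assumptions, not taken from the original model:

```python
import tensorflow as tf

# Toy dimensions for illustration only.
vocab_size, embed_dim, lstm_units, seq_len = 10000, 128, 64, 50

inputs = tf.keras.Input(shape=(seq_len,))
x = tf.keras.layers.Embedding(vocab_size, embed_dim)(inputs)

# return_sequences=True keeps the hidden state at every time step,
# which is what the attention layer scores.
hidden = tf.keras.layers.LSTM(lstm_units, return_sequences=True)(x)   # (batch, seq_len, units)

# Additive-style scoring: one scalar score per time step.
scores = tf.keras.layers.Dense(1, activation="tanh")(hidden)          # (batch, seq_len, 1)
weights = tf.keras.layers.Softmax(axis=1)(scores)                     # attention weights over time

# Weighted sum of hidden states: the "attentive" summary of the sequence.
context = tf.reduce_sum(weights * hidden, axis=1)                     # (batch, units)

outputs = tf.keras.layers.Dense(1, activation="sigmoid")(context)
model = tf.keras.Model(inputs, outputs)
```

The key point is that the softmax weights let the model emphasize only the informative time steps (words) instead of relying on whatever the final hidden state happened to remember.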
-
Hello, thanks to your tips and guidance, I successfully ran the Attention-LSTM part of the code. May I ask again: is the data used in the STA-LSTM part still TrainSet_us101.mat, TestSet_us…
-
Thank you for making your work and code publicly available. After running 'Feature_extraction.ipynb', I noticed that the features 'cycle' and 'IC' are missing. These are essential for the LSTM a…
-
I saw several Keras discussions about building an attention mechanism, and I saw a line of code that used RepeatVector. Is there an equivalent for it in tflearn, or is there any other way to achieve an att…
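As far as I know, tflearn has no built-in RepeatVector layer, but tflearn tensors are ordinary TensorFlow tensors, so the same behavior can be reproduced with plain TF ops. A minimal sketch (the names and shapes below are illustrative assumptions):

```python
import tensorflow as tf

# Keras's RepeatVector(n) turns a (batch, features) tensor into (batch, n, features)
# by repeating it along a new time axis. The same effect with plain TF ops:

def repeat_vector(x, n):
    """Repeat a (batch, features) tensor n times -> (batch, n, features)."""
    x = tf.expand_dims(x, axis=1)   # (batch, 1, features)
    return tf.tile(x, [1, n, 1])    # (batch, n, features)

# Hypothetical usage inside a tflearn graph, e.g. feeding an encoding to a decoder:
# encoded = tflearn.fully_connected(net, 128)     # (batch, 128)
# repeated = repeat_vector(encoded, seq_len)      # (batch, seq_len, 128)
```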
-
**Paper**
Data Augmentation for Low-Resource Neural Machine Translation
**Introduction**
This research focuses on the challenges faced by low-resource languages in the neural machine translation…
-
Please bear with me here.
This might be confusing for some, so I'm including the pseudocode to illustrate what's unclear to me. I've been following a tutorial, and it was mentioned that …
-
Dear Yaseen,
thanks for your clean code.
As you know, there are the concepts of an 'LSTM block' and an 'LSTM cell'. But in a lot of LSTM example code, including yours, it seems no attention was p…
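For what it's worth, the cell/block distinction shows up directly in the Keras API: a cell computes one time step, while the full layer wraps a cell with the loop over the sequence. A minimal sketch, using tf.keras names only as an illustration and not tied to the code in this repo:

```python
import tensorflow as tf

units, seq_len, features = 32, 10, 8

# An LSTM *cell* implements the gate equations for a single time step:
# it maps (x_t, [h_{t-1}, c_{t-1}]) -> (h_t, [h_t, c_t]).
cell = tf.keras.layers.LSTMCell(units)

# The LSTM *layer* (what most example code simply calls "the LSTM") wraps a cell
# with the loop over time, processing the whole (batch, seq_len, features) input.
layer = tf.keras.layers.RNN(cell, return_sequences=True)

x = tf.random.normal((4, seq_len, features))
outputs = layer(x)   # (4, seq_len, units)

# tf.keras.layers.LSTM(units) is the fused equivalent of RNN(LSTMCell(units)).
```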
-
Thanks for your code. I have some questions about it: is the attention_net() function in the file lstm.py complete? And how can I apply attention_net to the loganomaly process? Thanks a…