I'm confused about this file. Based on its name, lstmat.py should implement LSTM attention, but in main.py it is used as a way to embed words. In your repository, the model uses meta.py, yet your paper describes an attention mechanism. Could you explain why that is, what the purpose of lstmat.py is, and where in the code formulae 1-3 from the paper are implemented? If you have time to answer my questions, I would be very grateful.
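For context, this is what I expected the code for formulae 1-3 to look like: just my guess at a standard additive (Bahdanau-style) attention over LSTM hidden states, since I'm not sure which variant the paper uses, and the names `W`, `v`, and `attention` below are placeholders of my own, not from your repository.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def attention(h, W, v):
    # h: LSTM hidden states of shape (T, d)
    # W: (d, d) projection, v: (d,) scoring vector -- my assumed parameters
    scores = np.tanh(h @ W) @ v              # (1) alignment scores e_t
    alpha = softmax(scores)                  # (2) attention weights alpha_t
    context = (alpha[:, None] * h).sum(axis=0)  # (3) weighted context vector
    return context, alpha
```

If lstmat.py computes something like this, I would just like to know where these three steps appear and how they connect to the word-embedding use in main.py.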