LiberAI / NSpM

🤖 Neural SPARQL Machines for Knowledge Graph Question Answering.
http://aksw.org/Projects/NeuralSPARQLMachines
MIT License

Attention for LSTM #1

Closed. D0nPiano closed this issue 2 years ago.

D0nPiano commented 6 years ago

Hello, I really like your work on using seq2seq models to generate SPARQL queries. Just one question: was there a specific reason not to use attention during training? As far as I understood the TensorFlow NMT guide, you would have to add something like `--attention=scaled_luong` to the options in your train.sh. Did you evaluate whether the model works better with or without attention? Greetings!
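
For context, here is a minimal sketch of such a change, assuming the standard TensorFlow NMT command line; the paths and the other hyperparameters below are illustrative placeholders, not values taken from this repository's train.sh:

```sh
# Hypothetical training invocation; file paths and hyperparameters are placeholders.
# The --attention flag enables an attention mechanism in TensorFlow NMT;
# accepted values include luong, scaled_luong, bahdanau, and normed_bahdanau.
python -m nmt.nmt \
    --src=en --tgt=sparql \
    --vocab_prefix=data/vocab \
    --train_prefix=data/train \
    --dev_prefix=data/dev \
    --test_prefix=data/test \
    --out_dir=data/model \
    --attention=scaled_luong \
    --num_train_steps=12000 \
    --num_layers=2 \
    --num_units=128 \
    --dropout=0.2 \
    --metrics=bleu
```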

mommi84 commented 6 years ago

Hi @D0nPiano and thanks for your interest!

As ours is a work in progress, we haven't yet activated the attention mechanism, but we definitely plan to evaluate it. Thanks for pointing this out!

Cheers!

mommi84 commented 2 years ago

Solved in PR #50.