Closed: alejandromunoznavarro closed this issue 1 week ago.
There is an issue in hard_attention.py: get_decoder omits the embeddings parameter when it calls modules.lstm.ContextHardAttentionLSTMDecoder (lines 413 and 433). This could be fixed by adding embeddings=self.embeddings to both calls.
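For concreteness, here is a minimal sketch of the suggested change; the surrounding get_decoder structure and the other keyword arguments are placeholders, since only the added embeddings=self.embeddings line comes from this report.

```python
# Sketch only: everything except the added embeddings=self.embeddings keyword
# is a placeholder for whatever get_decoder already passes at lines 413 and 433.
# This method lives on the hard-attention model class, so self.embeddings is
# assumed to be available there.
def get_decoder(self):
    ...
    return modules.lstm.ContextHardAttentionLSTMDecoder(
        embeddings=self.embeddings,  # previously missing; the same fix applies to both calls
        # ... the existing keyword arguments remain unchanged ...
    )
```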
Thanks for the report; this should be an easy fix. I'll take a stab at it.
I believe I have taken care of this but let me know if anything lingers.