Closed nicofirst1 closed 3 years ago
Hi @nicofirst1 , thanks for spotting this!
The best hacky solution for you for now would be to pass max_len+1 as a parameter here.
Please see the loosely related issues #137 and #138.
The reason we add a zero is that, by convention, zero is the EOS symbol in EGG. If a zero was already generated by the model, it will be spotted here; otherwise, for consistency across games and models, we make sure that an EOS is always produced in each message, either in the last position or before.
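To illustrate the convention described above, here is a minimal sketch (not EGG's actual code; `append_eos` and `message_lengths` are hypothetical helpers) of appending a zero EOS column and locating the first EOS in each message:

```python
import torch

def append_eos(messages: torch.Tensor) -> torch.Tensor:
    """Append a 0 (EOS) column to a batch of messages of shape (batch, max_len)."""
    zeros = torch.zeros(messages.size(0), 1, dtype=messages.dtype)
    return torch.cat([messages, zeros], dim=1)

def message_lengths(messages: torch.Tensor) -> torch.Tensor:
    """Length of each message up to and including its first EOS (0)."""
    # argmax over the boolean mask finds the position of the first zero
    first_eos = (messages == 0).int().argmax(dim=1)
    return first_eos + 1  # include the EOS symbol itself

msgs = torch.tensor([[3, 1, 0, 2],   # model already emitted EOS at position 2
                     [2, 2, 1, 3]])  # model never emitted EOS
padded = append_eos(msgs)            # shape (2, 5): one extra slot per message
print(message_lengths(padded))       # tensor([3, 5])
```

Note that appending the extra column is exactly what grows the sequence from `max_len` to `max_len + 1`, which is the source of the mismatch reported in this issue.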
P.S.: You are using a custom branch of EGG (there is currently no **kwargs in the master branch). Just to be sure, can you please check whether it happens with the master branch as well? I am almost positive that it does, but I want to be sure that your changes did not modify anything else.
> You are using a custom branch of EGG (currently no **kwargs in the master branch), just to be sure, can you please try if it happens even with the master branch?
I removed the kwargs and still got the same error.
> The best hacky solution for you for now would be to pass max_len+1 as a parameter here.
Are you planning to integrate this fix?
It will be fixed, but there is no ETA yet. I wouldn't change that line by adding max_len+1; it would be better to centralize the handling of max_len, maybe adding the +1 in util.py when parsing command-line args. This would require checking that it doesn't break anything else. Feel free to work on it if you like; otherwise we'll fix it as soon as we can.
Fixed in #219
Expected Behavior
No errors
Detailed Description
When using the TransformerSenderReinforce with SinusoidalPositionEmbedding, training raises a runtime error:
where 11 is my max_len. This is due to the additional dimension concatenated here, which brings `x` to be of dimension 12 while `t` is still of dimension `max_len=11`. What is the reason for that additional zero at the end of the sequence?
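The mismatch can be reproduced in isolation with a standard sinusoidal positional-encoding table (a hedged sketch, not EGG's actual implementation): the table has exactly `max_len` rows, but after the extra EOS slot is concatenated the input has `max_len + 1` time steps, so the broadcast addition fails.

```python
import math
import torch

max_len, d_model = 11, 8

# standard sinusoidal table with exactly max_len positions
pos = torch.arange(max_len).unsqueeze(1).float()
div = torch.exp(torch.arange(0, d_model, 2).float()
                * -(math.log(10000.0) / d_model))
pe = torch.zeros(max_len, d_model)
pe[:, 0::2] = torch.sin(pos * div)
pe[:, 1::2] = torch.cos(pos * div)

# after the extra zero is concatenated, sequences are max_len + 1 long
x = torch.randn(4, max_len + 1, d_model)

failed = False
try:
    x = x + pe  # broadcasting fails: 12 vs 11 along the time dimension
except RuntimeError as e:
    failed = True
    print("RuntimeError:", e)
```

Building the table with `max_len + 1` rows (or centralizing the +1 as discussed above) makes the addition broadcast cleanly.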