Closed: ghost closed this issue 4 years ago
Thank you for letting me know.
These lines don't have any meaning; they were added for debugging. I only wanted to check that multi-head attention can run when the size of the query differs from the size of the key and value. My local project no longer has those lines, so I don't know why they still remain here. They should be deleted.
I'll update soon.
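As an aside, the behavior being debugged is easy to verify directly: PyTorch's `nn.MultiheadAttention` accepts a query whose sequence length differs from that of the key and value, and the output follows the query's length. A minimal sketch (not the repository's code, just a standalone check):

```python
import torch
import torch.nn as nn

embed_dim, num_heads = 16, 4
attn = nn.MultiheadAttention(embed_dim, num_heads)

# Default layout is (seq_len, batch, embed_dim):
# query has length 5, key/value have length 9.
query = torch.randn(5, 2, embed_dim)
key = torch.randn(9, 2, embed_dim)
value = torch.randn(9, 2, embed_dim)

out, weights = attn(query, key, value)
print(out.shape)      # torch.Size([5, 2, 16]) -- follows the query length
print(weights.shape)  # torch.Size([2, 5, 9]) -- one weight per (query, key) pair
```

Attention only requires the key and value lengths to match; the query length is free, since each query position simply attends over all key positions.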
Thank you for the update!
Hello,
Thank you for providing such an interesting project! I have a question about the following lines: https://github.com/reppy4620/Dialog/blob/b33fa575cb801a81704bac0fdaeb209d189f54e0/nn/model/encoder_decoder.py#L52-L53
I don't understand why the two source embeddings are concatenated. What is the purpose of this implementation?
Thanks!