Closed by nfelnlp 3 years ago
It seems like you search for an `embeddings` attribute, but I use `transformer.word_embedding`:
https://github.com/rbtsbg/gxai/blob/c183d48a8c871ea38c9529a0d24017bcccb4f833/explain.py#L164
Since XLNet / LIG was not part of the paper, I did not check this in depth, so please double-check it. Also keep in mind that I might have worked with another version and that this could be deprecated.
Thanks! I think inserting a case distinction for XLNet will suffice. It just seems odd that XLNet appears to be the only model without an `embeddings` attribute.
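For reference, a minimal sketch of such a case distinction (the helper name `get_embedding_layer` and the attribute probing are my own; the actual thermostat code may differ):

```python
def get_embedding_layer(model):
    """Locate the embedding layer to hand to Captum's LayerIntegratedGradients.

    Most transformers models (BERT, RoBERTa, ...) expose an `embeddings`
    module; XLNet instead keeps its token table at
    `transformer.word_embedding`, so we special-case it here.
    """
    if hasattr(model, "embeddings"):
        # BERT-style models: one module bundling word/position/segment embeddings
        return model.embeddings
    if hasattr(model, "transformer") and hasattr(model.transformer, "word_embedding"):
        # XLNet: plain token embedding table, no positional/segment part
        return model.transformer.word_embedding
    raise AttributeError("Could not locate an embedding layer on this model")
```

This is only a sketch of the fallback logic, not a drop-in patch for the repo.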
I did not find it back then, so `word_embedding` is my best guess.
Sorry, closed by accident.
So, I investigated the XLNet code. The `word_embedding` layer does not contain positional and segment embeddings, but it appears to be what we suspect it to be: it returns the word embeddings. Later in the forward pass, the word embeddings are combined with the other embeddings, i.e. positional, segment, ...
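A quick sanity check of the above, using a tiny randomly initialised XLNet so no weights need to be downloaded (assumes a recent `transformers` version; the exact attribute path may differ in older releases):

```python
import torch.nn as nn
from transformers import XLNetConfig, XLNetForSequenceClassification

# Tiny config: random init, no network access required.
config = XLNetConfig(vocab_size=100, d_model=32, n_layer=1, n_head=2, d_inner=64)
model = XLNetForSequenceClassification(config)

# Unlike BERT-style models, there is no `embeddings` module on the model...
print(hasattr(model, "embeddings"))

# ...and `transformer.word_embedding` is a plain nn.Embedding: just the
# token table, with positional/segment information added later in forward().
emb = model.transformer.word_embedding
print(isinstance(emb, nn.Embedding))
print(tuple(emb.weight.shape))  # (vocab_size, d_model)
```

So attributing to `transformer.word_embedding` targets exactly the token embeddings, which is what LIG needs here.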
Wow, that automated message above this is really neat! I don't really need to say anything more than that, huh. 😅
https://github.com/nfelnlp/thermostat/commit/6a9e8b2b4fc0f4e785c6c69458e3832e0be09f04
The following error was raised for "mnli-xlnet-lgxa":