Jaeihk opened 2 months ago
The paper says k=8 tokens for the random embedding, but when I actually checked embeddings.pt, its shape is [4, 1280]. Is k=4 correct?
embedding_dict = torch.load("./embeddings_gs-299999.pt")
Shape: torch.Size([4, 1280])
The provided embeddings.pt is correct; the token length has only a negligible influence on the results, so you can use it as is.
Thank you for your answer. Could you also upload the code for calculating the IS score? Thank you.
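Not the authors' implementation (which was not uploaded here), but for reference the standard Inception Score definition, IS = exp(E_x[KL(p(y|x) || p(y))]), can be sketched as below. The `inception_score` helper name is my own; in practice `probs` would be the Inception-v3 softmax outputs for the generated images.

```python
import numpy as np

def inception_score(probs, eps=1e-12):
    """Compute the Inception Score from per-image class probabilities.

    probs: (N, C) array; each row is the classifier softmax p(y|x)
    for one generated image.  Returns exp(mean KL(p(y|x) || p(y))).
    """
    probs = np.asarray(probs, dtype=np.float64)
    p_y = probs.mean(axis=0, keepdims=True)  # marginal class distribution p(y)
    # Row-wise KL divergence between conditional and marginal distributions
    kl = probs * (np.log(probs + eps) - np.log(p_y + eps))
    return float(np.exp(kl.sum(axis=1).mean()))
```

For a quick sanity check: uniform predictions give IS = 1 (the minimum), while confident predictions spread evenly over C classes give IS = C (the maximum).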