Open liveck opened 1 month ago
It seems that `bipe_alibi` is not working yet.
`get_ape_embeddings` returns a tuple, which is different from `embed_tokens`, so none of the code after this point works:
```python
if self.config.rpe_type == "bipe_alibi":
    inputs_embeds = self.get_ape_embeddings(torch.stack([input_ids, token_ids], dim=-1))
else:
    inputs_embeds = self.embed_tokens(input_ids)
```
Hi, sorry about the typo in `get_ape_embeddings`, and thanks for pointing this out.
It should be `return embed` rather than `return embed, X[:, :, 0]`.
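For context, a minimal runnable sketch of why the fix matters. The layer names, vocabulary sizes, and embedding dimension below are hypothetical (the actual BiPE implementation is not shown in this thread); the point is only that the corrected `get_ape_embeddings` must return a single tensor, matching the interface of `embed_tokens`, rather than a `(tensor, ids)` tuple:

```python
import torch
import torch.nn as nn

class ApeEmbeddingSketch(nn.Module):
    """Hypothetical sketch: combine word ids and intra-segment token ids
    into one embedding, as in the bipe_alibi branch above."""

    def __init__(self, vocab_size=100, num_token_ids=4, dim=8):
        super().__init__()
        self.word_embed = nn.Embedding(vocab_size, dim)   # plays the role of embed_tokens
        self.type_embed = nn.Embedding(num_token_ids, dim)

    def get_ape_embeddings(self, X):
        # X: (batch, seq_len, 2) -- torch.stack([input_ids, token_ids], dim=-1)
        embed = self.word_embed(X[:, :, 0]) + self.type_embed(X[:, :, 1])
        # Fixed: `return embed, X[:, :, 0]` would hand a tuple to code
        # that expects a (batch, seq_len, dim) tensor, like embed_tokens does.
        return embed

model = ApeEmbeddingSketch()
input_ids = torch.randint(0, 100, (2, 5))
token_ids = torch.randint(0, 4, (2, 5))
out = model.get_ape_embeddings(torch.stack([input_ids, token_ids], dim=-1))
print(out.shape)  # (2, 5, 8), same shape embed_tokens(input_ids) would give
```

With the tuple return, any downstream call such as `inputs_embeds + position_embeds` would fail, which is why everything after the `bipe_alibi` branch broke.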