jxmorris12 / vec2text

utilities for decoding deep representations (like sentence embeddings) back to text

Why does the attention_mask need to be revised? #64

Open miaodog opened 2 months ago

miaodog commented 2 months ago

https://github.com/jxmorris12/vec2text/blob/master/vec2text/models/inversion_from_logits.py#L138

If we use frozen_embeddings, why not use the input `attention_mask` variable directly instead of creating a new one? Is there any concern?
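For context, here is a minimal sketch of the shape mismatch that would motivate building a fresh mask. This is an illustration with made-up shapes and names, not the actual vec2text code: the assumption is that the frozen embeddings fed to the inversion model can have a different sequence length than the original tokenized input, so the incoming `attention_mask` would no longer line up with them.

```python
import torch

# Hypothetical shapes, for illustration only (not taken from vec2text).
batch_size, input_len, embed_seq_len, hidden = 2, 10, 32, 64

# The attention_mask passed in matches the original token sequence ...
input_attention_mask = torch.ones(batch_size, input_len, dtype=torch.long)

# ... but the frozen embeddings given to the inverter may occupy a
# different number of sequence positions, so a new all-ones mask is
# built to match the embeddings' length instead of reusing the input one.
frozen_embeddings = torch.randn(batch_size, embed_seq_len, hidden)
new_attention_mask = torch.ones(
    frozen_embeddings.shape[0], frozen_embeddings.shape[1], dtype=torch.long
)

print(input_attention_mask.shape)  # torch.Size([2, 10])
print(new_attention_mask.shape)    # torch.Size([2, 32])
```

If the two lengths happened to match, reusing the input mask could still be wrong when every embedding position should be attended to (all ones) regardless of which input tokens were padding.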