-
### 🚀 The feature, motivation and pitch
I am working on 4D attention mask input and the LLM generation process. Hugging Face provides an interface for the 4D attention mask. Does vLLM have any plans to support it? htt…
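For context, a "4D" mask follows the `(batch, num_heads, query_len, key_len)` layout, which lets each batch element and head carry its own attention pattern. Below is a minimal sketch of building such a mask as plain nested lists (in practice this would be a `torch.Tensor`); the 1-means-attend / 0-means-masked convention is an assumption for illustration.

```python
# Sketch: a 4D attention mask of shape (batch, num_heads, q_len, kv_len).
# Plain Python lists stand in for tensors; 1 = may attend, 0 = masked.

def make_4d_causal_mask(batch_size, num_heads, seq_len):
    """Return a nested list where query q may attend to keys k <= q."""
    pattern = [[1 if k <= q else 0 for k in range(seq_len)]
               for q in range(seq_len)]
    # Replicate the same causal pattern for every head and batch element;
    # a custom 4D mask could instead vary per head or per sample.
    return [[pattern for _ in range(num_heads)] for _ in range(batch_size)]

mask = make_4d_causal_mask(batch_size=1, num_heads=1, seq_len=4)
# mask[0][0] is the lower-triangular 4x4 causal pattern.
```

The point of the 4D layout is exactly that per-head, per-sample flexibility, which a 2D `(batch, seq_len)` padding mask cannot express.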
-
Downloading specific types of masks may be error-prone within this library; alternatively, they could be generated based on data types. This needs evaluation, since they don't just apply to SIC, and the interface to f…
-
### Feature request
In the `torch_call` function of `DataCollatorForCompletionOnlyLM`, the proposed feature would support correct masking of user requests even when the user and assistant messages are…
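The idea behind completion-only masking can be sketched without the library: every token outside an assistant span gets label `-100` so the loss ignores it, and the scan handles interleaved user/assistant turns. The marker token ids below are made-up placeholders, not output of any real tokenizer or of `DataCollatorForCompletionOnlyLM` itself.

```python
# Hedged sketch of completion-only label masking with interleaved turns.
# RESP_START / RESP_END are hypothetical template token ids.

IGNORE_INDEX = -100
RESP_START = 50_001   # hypothetical "<|assistant|>" id
RESP_END = 50_002     # hypothetical end-of-turn id

def mask_non_completion(input_ids):
    """Keep labels only for tokens inside assistant responses."""
    labels = [IGNORE_INDEX] * len(input_ids)
    in_response = False
    for i, tok in enumerate(input_ids):
        if tok == RESP_START:
            in_response = True          # following tokens are assistant text
        elif tok == RESP_END:
            in_response = False         # back to user/system text
        elif in_response:
            labels[i] = tok             # loss is computed on these only
    return labels

# user turn, assistant turn, second user turn:
ids = [10, 11, RESP_START, 20, 21, RESP_END, 12, 13]
labels = mask_non_completion(ids)  # only 20 and 21 keep their labels
```

Because the scan toggles on every start/end marker, it masks correctly no matter how the user and assistant messages alternate, which is the behavior the feature request asks for.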
-
Right now the user needs to know to do this themselves. I'm treating this as a bug, since it will likely end up as one for everyone but me. I'll fix it as soon as I can.
-
There is some inconsistency regarding the inverse mask, mentioned a long time ago [here](https://github.com/thorvg/thorvg/issues/209#issuecomment-816010755).
For ADD, please see the comment for SUBTRACT.
Th…
-
Include a procedure to use an external mask from HI/Hα where we know there is emission.
-
Hi, I have an `attention_mask` mismatch problem in the cross-attention.
Can you please explain this line:
`requires_attention_mask = "encoder_outputs" not in model_kwargs`?
Why does it come after this:
…
-
Is there any plan to add attention masking support? PyTorch's version of FlashAttention v1 included the ability to provide an attention mask in its [implementation](https://pytorch.org/docs/stable/genera…
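For reference, the mask semantics being requested are additive: `0.0` where attention is allowed and `-inf` where it is blocked, added to the scores before softmax. Here is a minimal pure-Python reference implementation of masked scaled-dot-product attention (single head, plain lists) illustrating those semantics; it is a teaching sketch, not the fused kernel itself.

```python
import math

# Reference masked scaled-dot-product attention. The mask is additive:
# 0.0 = attend, -inf = blocked, matching PyTorch's attn_mask convention.

def attention(q, k, v, mask):
    d = len(q[0])
    out = []
    for i, qi in enumerate(q):
        # raw scores for query i against every key, plus the additive mask
        scores = [sum(a * b for a, b in zip(qi, kj)) / math.sqrt(d) + mask[i][j]
                  for j, kj in enumerate(k)]
        m = max(scores)                       # stabilize the softmax
        exps = [math.exp(s - m) for s in scores]
        z = sum(exps)
        w = [e / z for e in exps]             # weights over unmasked keys
        out.append([sum(w[j] * v[j][c] for j in range(len(v)))
                    for c in range(len(v[0]))])
    return out

NEG = float("-inf")
q = [[1.0, 0.0], [0.0, 1.0]]
k = [[1.0, 0.0], [0.0, 1.0]]
v = [[1.0, 2.0], [3.0, 4.0]]
causal = [[0.0, NEG], [0.0, 0.0]]   # query 0 may only see key 0
out = attention(q, k, v, causal)
```

With the causal mask above, the first query's output is exactly `v[0]`, since the `-inf` entry zeroes out the second key's weight; a fused kernel with `attn_mask` support would compute the same result without materializing the full score matrix.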
-
As is well known, there are a large number of empty features in the BEV feature map, especially in the lidar BEV feature map. Have you conducted any relevant experiments on masking out the empty features whe…
-
### Search before asking
- [X] I have searched the Ultralytics YOLO [issues](https://github.com/ultralytics/ultralytics/issues) and [discussions](https://github.com/ultralytics/ultralytics/discussi…