luohao123 opened 8 months ago
Where are these monkey patches?
Hi @luohao123
We've implemented the flash attention and removed that monkey patch: https://github.com/haotian-liu/LLaVA/blob/main/llava/train/train_mem.py
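For context, a "monkey patch" here just means rebinding a class's method at runtime (e.g. swapping an attention module's `forward` for a flash-attention version before the model is built). A minimal, generic sketch of the pattern, using hypothetical stand-in classes rather than the actual LLaVA/Transformers code:

```python
# Generic sketch of the monkey-patch pattern: rebind a method on the
# class itself so every existing and future instance uses the new code.

class Attention:
    """Stand-in for a library attention module (hypothetical)."""
    def forward(self, x):
        return f"slow-attention({x})"

def flash_forward(self, x):
    # Hypothetical replacement implementation (marker string only).
    return f"flash-attention({x})"

def apply_patch():
    # The patch: assign the replacement function as the class method.
    Attention.forward = flash_forward

attn = Attention()
print(attn.forward("q"))   # slow-attention(q)
apply_patch()
print(attn.forward("q"))   # flash-attention(q)
```

Because the method is rebound on the class (not a single instance), the patch must run before (or it transparently affects) any model instances, which is why such patches are applied at the top of the training entrypoint.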
Can you explain what other monkey patches have been implemented by Transformers? Thanks.
As of now, Transformers supports most of these monkey patches in its own code. Would supporting the latest Transformers version make things simpler?