Improbable-AI / curiosity_redteam

Official implementation of ICLR'24 paper, "Curiosity-driven Red Teaming for Large Language Models" (https://openreview.net/pdf?id=4KqkizXgXU)
MIT License
59 stars · 9 forks

Which version of trlx and transformers are you using? #7

Open PamKing7 opened 2 months ago

PamKing7 commented 2 months ago

No matter whether I load a local model or the gpt2-imdb model from Hugging Face, the following error is reported: ValueError: GPTModelBranch does not support an attention implementation through torch.nn.functional.scaled_dot_product_attention yet. Please request the support for this architecture: https://github.com/huggingface/transformers/issues/28005. If you believe this error is a bug, please open an issue in Transformers GitHub repository and load your model with the argument attn_implementation="eager" meanwhile. Example: model = AutoModel.from_pretrained("openai/whisper-tiny", attn_implementation="eager")

This seems to be a problem caused by the transformers version, but mine is already updated to the latest release. Which versions of trlx and transformers are you using?
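For what it's worth, the failure is consistent with running a transformers release newer than the repo's pin: custom_trlx uses transformers==4.32.0 (per a reply below in this thread), while newer releases default to scaled_dot_product_attention, which the vendored GPTModelBranch does not implement. A minimal sketch of the version comparison (parse_version and is_newer_than_pin are hypothetical helpers, not part of transformers or trlx):

```python
# Sketch: compare a transformers version string against the 4.32.0
# pin used by the repo's custom_trlx.
def parse_version(v: str) -> tuple:
    """Turn '4.41.2' into (4, 41, 2) for tuple comparison."""
    return tuple(int(part) for part in v.split("."))

PINNED = "4.32.0"

def is_newer_than_pin(installed: str) -> bool:
    """True when the installed release postdates the pinned one."""
    return parse_version(installed) > parse_version(PINNED)

# 4.41.2 (reported in this thread) is well past the 4.32.0 pin.
print(is_newer_than_pin("4.41.2"))  # → True
```

Downgrading to the pinned release, or passing attn_implementation="eager" as the error message suggests, are the two obvious ways out.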

PamKing7 commented 2 months ago

In addition, it appears that the TRLX module used for training does not support the MistralForCausalLM model.

nuwuxian commented 2 months ago

I have met the same problem.

PamKing7 commented 2 months ago

I have met the same problem.

This may be a version problem; could you tell me which version of transformers you are using?

williamd4112 commented 2 months ago

  • trlx: I'm using a customized version. See custom_trlx
  • transformers: I'm using the version in custom_trlx, transformers==4.32.0

PamKing7 commented 1 month ago

transformers 4.41.2

zui-jiang commented 1 month ago
  • trlx: I'm using a customized version. See custom_trlx
  • transformers: I'm using the version in custom_trlx, transformers==4.32.0

What about the version of sentence_transformers?
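A quick way to gather all of these at once (a sketch, not from the repo; the package list is my guess at the ones relevant to this thread):

```python
# Sketch: report installed versions of the packages discussed here.
# importlib.metadata is in the standard library (Python 3.8+).
from importlib.metadata import version, PackageNotFoundError

def report_versions(packages):
    """Return one 'pkg==x.y.z' line per package, or a 'not installed' note."""
    lines = []
    for pkg in packages:
        try:
            lines.append(f"{pkg}=={version(pkg)}")
        except PackageNotFoundError:
            lines.append(f"{pkg}: not installed")
    return lines

print("\n".join(report_versions(["transformers", "sentence-transformers", "trlx"])))
```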

alexsting commented 1 week ago

I really admire the great work you have completed; however, I ran into the same problem when running your project on Colab. I solved it by changing the transformers and sentence-transformers versions, but then hit a new error: RuntimeError: ffi_prep_cif_var failed. Could you please share all of your environment's package versions? Thank you!

alexsting commented 1 week ago

The output of !pip freeze would be helpful, thanks!
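A sketch of capturing that (assuming pip is on the PATH; in a Colab cell the command needs a leading "!"):

```shell
# Dump every installed package with its exact version to a file
# that can be attached to the issue.
pip freeze > environment.txt
```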