HanxunH / CognitiveDistillation

[ICLR2023] Distilling Cognitive Backdoor Patterns within an Image
https://arxiv.org/abs/2301.10908
MIT License

Requirements for the environment #2

Closed ShaniaShan closed 9 months ago

ShaniaShan commented 9 months ago

Thanks for sharing your wonderful work. When I tried to run the code, it failed with version conflicts among the following packages: omegaconf, fairseq, hydra-core, and mlconfig. Could you please provide a requirements.txt for setting up the environment? Thanks.

HanxunH commented 9 months ago

Hi,

Thanks for your interest in our work.

If you are interested in using Cognitive Distillation itself, it can be used as a standalone tool (with PyTorch as a dependency): https://github.com/HanxunH/CognitiveDistillation/blob/main/detection/cognitive_distillation.py
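For readers who want a sense of what the standalone tool does, below is a minimal, self-contained sketch of the idea described in the paper: optimize a per-image mask `m` so that the model's output on the masked input `x_cd = x * m + (1 - m) * noise` stays close to its output on `x`, while an L1 penalty keeps the mask sparse. The function name, hyper-parameter names, and default values here are illustrative assumptions, not the repository's actual API; see `detection/cognitive_distillation.py` for the real implementation.

```python
import torch
import torch.nn as nn

def distill_pattern(model, x, steps=20, lr=0.1, alpha=0.01):
    """Learn a sparse single-channel mask highlighting the pattern the
    model relies on in x (hedged sketch of Cognitive Distillation)."""
    model.eval()
    with torch.no_grad():
        target = model(x)  # reference logits on the clean input
    # One-channel mask broadcast over RGB, initialized at 0.5.
    mask = torch.full_like(x[:, :1], 0.5, requires_grad=True)
    opt = torch.optim.Adam([mask], lr=lr)
    for _ in range(steps):
        m = mask.clamp(0, 1)
        noise = torch.randn_like(x) * 0.1        # random fill for masked-out pixels
        x_cd = x * m + (1 - m) * noise
        # Match the clean output while penalizing mask area (L1 sparsity).
        loss = (model(x_cd) - target).abs().mean() + alpha * m.abs().mean()
        opt.zero_grad()
        loss.backward()
        opt.step()
    return mask.detach().clamp(0, 1)

# Toy usage on a tiny random model and batch.
model = nn.Sequential(nn.Conv2d(3, 4, 3, padding=1), nn.Flatten(),
                      nn.Linear(4 * 8 * 8, 10))
x = torch.randn(2, 3, 8, 8)
mask = distill_pattern(model, x)
print(mask.shape)  # torch.Size([2, 1, 8, 8])
```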

If you are interested in reproducing the results from the paper, mlconfig==0.1.0 is needed, along with the other packages listed here: https://github.com/HanxunH/CognitiveDistillation/blob/main/requirements.txt

> Thanks for sharing your wonderful work. When I tried to run the code, it threw errors due to dependency conflicts of packages' versions: omegaconf, fairseq, hydra-core, and mlconfig. Could you please provide the requirements.txt for setting up the environment? Thanks.

I'm not sure where the conflict comes from. I did not use any of those packages except mlconfig. If mlconfig is the problem, feel free to remove it and either hard-code the settings or use argparse to configure the experiments.
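Replacing mlconfig with the standard library's argparse might look like the sketch below; the option names and defaults are illustrative, not the repository's actual config keys.

```python
import argparse

def build_parser():
    # Hypothetical options standing in for an mlconfig YAML file.
    p = argparse.ArgumentParser(
        description="Cognitive Distillation experiment (illustrative sketch)")
    p.add_argument("--lr", type=float, default=0.1, help="mask learning rate")
    p.add_argument("--alpha", type=float, default=0.01, help="L1 sparsity weight")
    p.add_argument("--steps", type=int, default=100, help="optimization steps")
    p.add_argument("--data_dir", type=str, default="./data", help="dataset root")
    return p

# Parse an example command line instead of sys.argv for demonstration.
args = build_parser().parse_args(["--lr", "0.05", "--steps", "50"])
print(args.lr, args.steps)  # 0.05 50
```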

ShaniaShan commented 9 months ago

> Thanks for sharing your wonderful work. When I tried to run the code, it threw errors due to dependency conflicts of packages' versions: omegaconf, fairseq, hydra-core, and mlconfig. Could you please provide the requirements.txt for setting up the environment? Thanks.