When I evaluated the robustness of the model trained with the A2T attack, the code used the other attack methods to generate 1000 adversarial samples for evaluation.
Some of them, such as a2t and textfooler, run fine, but a2t-mlm and bae get stuck partway through generation. Stepping through with a debugger, I found that the process blocks on the queue during parallel generation, which causes the hang, but I don't know why it gets blocked there. Can you tell me why?
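For reference, here is a minimal sketch of how I understand the parallel evaluation is launched, assuming it goes through TextAttack's standard `Attacker` API. The checkpoint path, dataset, and worker count are placeholders rather than my exact configuration, and the `mlm=True` flag stands in for the a2t-mlm variant (bae would use `BAEGarg2019.build` instead):

```python
import transformers
from textattack import Attacker, AttackArgs
from textattack.attack_recipes import A2TYoo2021, BAEGarg2019
from textattack.datasets import HuggingFaceDataset
from textattack.models.wrappers import HuggingFaceModelWrapper

# Load the A2T-trained victim model (the checkpoint path is a placeholder).
model = transformers.AutoModelForSequenceClassification.from_pretrained("path/to/a2t_trained_model")
tokenizer = transformers.AutoTokenizer.from_pretrained("path/to/a2t_trained_model")
model_wrapper = HuggingFaceModelWrapper(model, tokenizer)

# Dataset name and split are placeholders for whatever the evaluation script loads.
dataset = HuggingFaceDataset("imdb", split="test")

# a2t and textfooler finish; the MLM-based recipes (a2t-mlm here, or BAEGarg2019)
# are the ones that hang during parallel generation.
attack = A2TYoo2021.build(model_wrapper, mlm=True)

attack_args = AttackArgs(
    num_examples=1000,        # 1000 samples for the robustness evaluation
    parallel=True,            # multiprocessing across workers/GPUs
    num_workers_per_device=1,
)
Attacker(attack, dataset, attack_args).attack_dataset()
```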