andyzoujm / representation-engineering

Representation Engineering: A Top-Down Approach to AI Transparency
https://www.ai-transparency.org/
MIT License

Assertion error "NaN in output logprobs" #12

Closed · chenlidar closed this issue 1 year ago

chenlidar commented 1 year ago

Hi! When I run examples/honesty/honesty_control_TQA.ipynb with llama2-7b-chat-hf, it throws an assertion error at line 44: assert np.isnan(output_logprobs).sum() == 0, "NaN in output logprobs". I have already set layer_ids = np.arange(8, 32, 3). I'd appreciate your help.
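
For reference, a minimal diagnostic sketch (assuming output_logprobs is the NumPy array checked by that notebook cell) that locates the offending entries before the assertion fires:

```python
import numpy as np

# Hypothetical diagnostic: find where NaNs appear in the logprobs array
# instead of only asserting that none exist.
nan_mask = np.isnan(output_logprobs)
print("NaN count:", nan_mask.sum())
print("First NaN positions:", np.argwhere(nan_mask)[:10])
```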

andyzoujm commented 1 year ago

This might be a quantization issue. You could try using float32 when loading the model by changing the torch_dtype argument.
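
A minimal loading sketch, assuming the Hugging Face model id meta-llama/Llama-2-7b-chat-hf and a standard transformers setup:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "meta-llama/Llama-2-7b-chat-hf"  # assumed model id

# Load in full precision; float16 can overflow and propagate NaNs
# into the logprobs.
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float32,
    device_map="auto",  # requires accelerate; offloads if VRAM is tight
)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```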

justinphan3110 commented 1 year ago

Changing to bfloat16 will also help: unlike float16, bfloat16 keeps float32's exponent range, so intermediate activations are less likely to overflow to inf/NaN. Related issue: huggingface/transformers/issues/25446
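
The same load with the one-argument change, as a sketch under the same assumptions as above:

```python
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,  # float32 exponent range at half the memory
    device_map="auto",
)
```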

chenlidar commented 1 year ago

Thanks! Switching to bfloat16 fixed it.