Open yoshibenjo opened 1 year ago
Why do we do this? I assume this is removing the EOS token. I don't see this happening in the training code from Stanford. Thanks for your time <3
https://github.com/tloen/alpaca-lora/blob/e04897baaec39280fac97f1ad2bf33059b0df643/finetune.py#L104
@tloen
We want to remove the EOS token if the text continues beyond the CUTOFF_LEN. Note that the code pads and truncates at CUTOFF_LEN + 1.
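To make the intent concrete, here is a toy sketch of the mechanism described above. This is a simplified stand-in for the HF tokenizer call in finetune.py, under the assumptions of right-padding and a tokenizer that appends EOS after the text; the real `tokenizer(..., truncation=True, max_length=CUTOFF_LEN + 1, padding="max_length")` handles this internally.

```python
# Toy illustration of the CUTOFF_LEN + 1 trick (assumed right-padding).
EOS, PAD = "</s>", "<pad>"
CUTOFF_LEN = 4  # tiny value for illustration only

def tokenize(words):
    # Mimic tokenizer(prompt, truncation=True,
    #                 max_length=CUTOFF_LEN + 1, padding="max_length")
    ids = (words + [EOS])[: CUTOFF_LEN + 1]     # append EOS, then truncate
    ids += [PAD] * (CUTOFF_LEN + 1 - len(ids))  # pad up to CUTOFF_LEN + 1
    return ids[:-1]  # the [:-1] slice from finetune.py#L104

# Short text: the slice only drops a pad token, so EOS survives.
print(tokenize(["the", "cat", "sat"]))          # ['the', 'cat', 'sat', '</s>']
# Long text: truncation already cut the EOS, and the slice drops the
# token at position CUTOFF_LEN, leaving a clean truncated sequence.
print(tokenize(["a", "b", "c", "d", "e", "f"])) # ['a', 'b', 'c', 'd']
```

The point is that tokenizing one position past CUTOFF_LEN and then slicing off the last token yields sequences of exactly CUTOFF_LEN: short examples keep their EOS, while truncated examples do not end in a misplaced EOS.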