Closed VedaRePowered closed 10 months ago
Oh shoot, sorry. It looks like your version has EXPAND=True. Set that to False in the code.
Explanation: EXPAND was a debug mode that converts from quantized back to full precision. I'll document that file.
I'll make that a command line argument.
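A minimal sketch of what exposing EXPAND as a command-line argument could look like with `argparse` (the flag name and help text here are assumptions, not the actual export.py code):

```python
import argparse

def parse_args(argv=None):
    # Hypothetical sketch: expose the EXPAND debug flag as a CLI option
    # instead of a hard-coded constant in export.py.
    parser = argparse.ArgumentParser(description="Export a model checkpoint")
    parser.add_argument(
        "--expand",
        action="store_true",
        help="Debug mode: convert from quantized back to full precision",
    )
    return parser.parse_args(argv)

args = parse_args([])       # no flag given
EXPAND = args.expand        # defaults to False, matching the intended behavior
print(EXPAND)
```

With `action="store_true"` the flag defaults to False, so the debug path only runs when the user explicitly passes `--expand`.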
Hi, thanks for the quick reply. I double-checked the version of export.py that I have cloned, and it does have EXPAND = False. Additionally, the line the error occurs on runs regardless of that variable (the comment indicates it's needed for writing the file header).
Sorry if I'm missing something, and thanks again for the help
Can you confirm that https://github.com/srush/llama2.rs/pull/18 fixed your issue?
Yes #18 has resolved my issue exporting the model, thanks again for all your help!
I am trying to run export.py, but I'm running into the following error:

I believe this is related to having the incorrect version of some library, but I can't determine the correct version (I'm not really much of a Python person). My versions are:
I would suggest either updating the README to include the exact versions to use, or adding a requirements.txt.
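For the requirements.txt suggestion, one common way to pin exact versions is to capture them from a working environment (shown here as a generic sketch, not the repo's actual workflow):

```shell
# In an environment where export.py works, record the exact installed versions.
python -m pip freeze > requirements.txt

# Anyone else can then reproduce that environment with the pinned versions.
python -m pip install -r requirements.txt
```

This keeps everyone on the same library versions without having to list them manually in the README.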